2023
DOI: 10.1017/apr.2023.3

α-Stable convergence of heavy-/light-tailed infinitely wide neural networks

Abstract: We consider infinitely wide multi-layer perceptrons (MLPs) which are limits of standard deep feed-forward neural networks. We assume that, for each layer, the weights of an MLP are initialized with independent and identically distributed (i.i.d.) samples from either a light-tailed (finite-variance) or a heavy-tailed distribution in the domain of attraction of a symmetric $\alpha$-stable distribution, where $\alpha\in(0,2]$ may depend on the layer. For…
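The initialization scheme described in the abstract can be sketched in code. The snippet below is a minimal illustration, not the paper's implementation: the function name, the layer widths, the per-layer α values, and the width scaling $n^{-1/\alpha}$ (a standard normalization in stable limit theorems) are all assumptions of this sketch. Setting α = 2 recovers the light-tailed (Gaussian) case.

```python
import numpy as np
from scipy.stats import levy_stable


def init_mlp_weights(widths, alphas, seed=0):
    """Initialize MLP weight matrices with i.i.d. symmetric alpha-stable
    samples (beta=0 gives symmetry; alpha=2 is the Gaussian case).

    Each matrix is scaled by n_in ** (-1 / alpha), the natural scaling
    for alpha-stable limits (an assumption of this sketch).
    """
    rng = np.random.default_rng(seed)
    weights = []
    for (n_in, n_out), alpha in zip(zip(widths[:-1], widths[1:]), alphas):
        # Symmetric alpha-stable samples; heavier tails as alpha decreases.
        w = levy_stable.rvs(alpha, beta=0.0, size=(n_in, n_out),
                            random_state=rng)
        weights.append(w * n_in ** (-1.0 / alpha))
    return weights


# Example: a 3-layer MLP with a heavy-tailed (alpha=1.5) hidden layer
# and a light-tailed (alpha=2, i.e. Gaussian) output layer.
W = init_mlp_weights([64, 128, 10], alphas=[1.5, 2.0])
```

Note that per-layer α values let different layers sit in different domains of attraction, matching the abstract's setup in which α may depend on the layer.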

Cited by 1 publication. References 23 publications (34 reference statements).