2018
DOI: 10.1007/s00440-018-0830-4

The spectral norm of random inner-product kernel matrices

Abstract: We study an "inner-product kernel" random matrix model, whose empirical spectral distribution was shown by Xiuyuan Cheng and Amit Singer to converge to a deterministic measure in the large n and p limit. We provide an interpretation of this limit measure as the additive free convolution of a semicircle law and a Marcenko-Pastur law. By comparing the tracial moments of this random matrix to those of a deformed GUE matrix with the same limiting spectrum, we establish that for odd kernel functions, the spectral n…
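The model described in the abstract is straightforward to simulate. Below is a minimal numerical sketch, not taken from the paper: the kernel choice f = tanh (an odd function, as the abstract requires), the sizes n and p, and the Cheng–Singer-style normalization are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 1000, 500          # proportional regime: n and p grow together

# Data: columns x_1, ..., x_n drawn i.i.d. from N(0, I_p)
X = rng.standard_normal((p, n))

# Odd kernel function (illustrative choice)
f = np.tanh

# Inner-product kernel matrix: entries f(<x_i, x_j>/sqrt(p)), zero diagonal,
# scaled by 1/sqrt(n) so the empirical spectral distribution has a
# deterministic limit
A = f(X.T @ X / np.sqrt(p))
np.fill_diagonal(A, 0.0)
A /= np.sqrt(n)

eigs = np.linalg.eigvalsh(A)
print("spectral norm:", np.abs(eigs).max())
print("bulk edges (min/max eigenvalue):", eigs.min(), eigs.max())
```

In the large-n, p limit, a histogram of `eigs` should approximate the limit measure the abstract describes: the additive free convolution of a semicircle law (from the nonlinear part of f) and a Marcenko-Pastur-type law (from its linear part).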

Cited by 33 publications (29 citation statements)
References 59 publications (133 reference statements)
“…Techniques from random matrix theory have been adapted to this new class of random matrices. In particular, the leave-one-out method can be used to derive a recursion for the resolvent, as first shown for matrices of this type in Cheng and Singer (2013), and the moments method was first used in Fan and Montanari (2019) (both of these papers consider symmetric random matrices, but these techniques extend to the asymmetric case). Further results on kernel random matrices can be found in Do and Vu (2013), Louart, Liao and Couillet (2018) and Pennington and Worah (2018).…”
Section: Proportional Scaling
confidence: 99%
“…The universality phenomenon of Theorem 6.2 first emerged in random matrix theory studies of (symmetric) kernel inner product random matrices. In that case, the spectrum of such a random matrix was shown in Cheng and Singer (2013) to behave asymptotically as that of the sum of independent Wishart and Wigner matrices, which correspond respectively to the linear and nonlinear parts of the kernel (see also Fan and Montanari 2019, where this remark is made more explicit). In the context of random features ridge regression, this type of universality was first pointed out in Hastie et al (2019), which proved a special case of Theorem 6.2.…”
Section: 151
confidence: 99%
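The Wishart-plus-Wigner decomposition quoted above can be checked numerically. The following is a rough sketch under illustrative assumptions only (Gaussian data, f = tanh; the linear Hermite coefficient and the nonlinear variance are estimated by Monte Carlo rather than computed in closed form):

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 1000, 500
X = rng.standard_normal((p, n))
f = np.tanh

# Kernel matrix with zero diagonal, scaled by 1/sqrt(n)
A = f(X.T @ X / np.sqrt(p))
np.fill_diagonal(A, 0.0)
A /= np.sqrt(n)

# Split f into its linear Hermite component a1*g plus a nonlinear remainder:
# a1 = E[G f(G)] and nu = Var(f(G)) - a1^2 for G ~ N(0, 1)
# (Monte Carlo estimates)
g = rng.standard_normal(10**6)
a1 = np.mean(g * f(g))
nu = np.var(f(g)) - a1**2

# Surrogate: scaled centered Wishart for the linear part of the kernel,
# plus an independent Wigner matrix carrying the nonlinear variance
W = a1 * (X.T @ X - p * np.eye(n)) / np.sqrt(n * p)
B = rng.standard_normal((n, n))
H = (B + B.T) / np.sqrt(2.0 * n)   # Wigner matrix, semicircle support [-2, 2]
M = W + np.sqrt(nu) * H

eigs_A = np.linalg.eigvalsh(A)
eigs_M = np.linalg.eigvalsh(M)
print("median eigenvalue, kernel vs surrogate:",
      np.median(eigs_A), np.median(eigs_M))
```

If the asymptotic picture holds, the two empirical spectra should be close in the bulk; comparing sorted eigenvalues or histograms of `eigs_A` and `eigs_M` makes this visible at moderate n and p.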
“…We defer the study of the asymptotic behavior of the largest eigenvalue as in [15] to another article.…”
Section: Model and Results
confidence: 99%
“…of nonlinear random matrix models of the form f(XX*) have also been studied in [14] and [8] with different variance scalings for the entries of X. We also mention [15], where the question is studied further, including the behavior of extreme eigenvalues, with a view to statistical estimation of the population covariance.…”
Section: Introduction
confidence: 99%
“…In summary, under the manifold model, researchers show that the Graph Laplacian (GL) converges to the Laplace-Beltrami operator in various settings with a properly chosen bandwidth. On the other hand, the spectral properties have been investigated in [3,4,8,13,14,17,28] under the null setup. These works essentially show that when X contains pure high-dimensional noise, the affinity and transition matrices are governed by a low-rank perturbed Gram matrix when the bandwidth h ≍ p.…”
Section: Introduction
confidence: 99%