2019
DOI: 10.48550/arxiv.1901.05948
Preprint

Tail bounds for gaps between eigenvalues of sparse random matrices

Abstract: We prove the first eigenvalue repulsion bound for sparse random matrices. As a consequence, we show that these matrices have simple spectrum, improving the range of sparsity and error probability from work of the second author and Vu. We also show that for sparse Erdős-Rényi graphs, weak and strong nodal domains are the same, answering a question of Dekel, Lee, and Linial.
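
For reference (our notation, not taken from the abstract): a real symmetric matrix has simple spectrum when all of its eigenvalues are distinct, i.e. the minimum gap between consecutive eigenvalues is strictly positive. Schematically, in LaTeX:

\[
  \lambda_1(A) < \lambda_2(A) < \cdots < \lambda_n(A),
  \qquad
  \Delta_{\min}(A) \;=\; \min_{1 \le i \le n-1} \bigl(\lambda_{i+1}(A) - \lambda_i(A)\bigr) \;>\; 0 .
\]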

Cited by 5 publications (23 citation statements); references 40 publications (75 reference statements).

“…We note that since its first appearance in [25], the RLCD has been used in many works (see, e.g., [13,14,16,19,26]); the MRLCD (and median threshold, for discrete distributions) can replace these applications in a black-box manner, and likely lead to improved quantitative estimates. We also note that a related use of combinatorially incorporating arithmetic unstructure of different projections of a vector appeared in recent work of the authors [8]; however, the interaction with both the net and anticoncentration estimates is more delicate here.…”
Section: Introduction (mentioning)
confidence: 99%
“…Also, one can obtain tail-bounds for consecutive eigenvalue gaps of G(n, p), i.e. $\delta_i = \lambda_{i+1} - \lambda_i$ which in turn leads to a lower bound on $\Delta_{\min}$ for $\bar{A}_{G(n,p)}$ as $\Delta_{\min} \ge n^{-5/2+o(1)}\, p^{-1/2}$, almost surely [43,44]. This is the best known bound for this quantity for discrete random matrices.…”
Section: Mixing of Quantum Walks (mentioning)
confidence: 90%
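
A minimal numerical sketch of the quantities in the excerpt above (an illustration only, not the method of the cited paper): sample an Erdős–Rényi graph G(n, p), form its adjacency matrix, and compute the consecutive eigenvalue gaps $\delta_i = \lambda_{i+1} - \lambda_i$ and the empirical minimum gap. The function name and parameters below are ours; the normalization $\bar{A}_{G(n,p)}$ used in the excerpt is defined in the citing paper and is not reproduced here.

# Sketch: empirical consecutive eigenvalue gaps of an Erdos-Renyi adjacency matrix.
# Assumes numpy; illustration only, not the cited paper's construction.
import numpy as np

def min_eigenvalue_gap(n=500, p=0.1, seed=0):
    rng = np.random.default_rng(seed)
    # Symmetric 0/1 adjacency matrix of G(n, p) with zero diagonal.
    upper = np.triu(rng.random((n, n)) < p, k=1)
    A = (upper + upper.T).astype(float)
    # Eigenvalues of a real symmetric matrix, returned in ascending order.
    eigs = np.linalg.eigvalsh(A)
    gaps = np.diff(eigs)       # delta_i = lambda_{i+1} - lambda_i
    return float(gaps.min())   # empirical minimum gap for this single sample

if __name__ == "__main__":
    print("empirical minimum eigenvalue gap:", min_eigenvalue_gap())
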
“…However as we examine gaps $\lambda_{i+r} - \lambda_i$ for large enough $r$, rigidity provides a better estimate on the gap size than the tail bounds from Ref. [43]. Combining both estimates at the different scales of $r$ yields an improved estimate for $\Sigma_r$.…”
Section: Mixing of Quantum Walks (mentioning)
confidence: 97%