2020
DOI: 10.48550/arxiv.2010.06554
Preprint

Singularity of discrete random matrices

Abstract: Let ξ be a non-constant real-valued random variable with finite support, and let M_n(ξ) denote an n × n random matrix with entries that are independent copies of ξ. We show that, if ξ is not uniform on its support, then P[M_n(ξ) is singular] = (1 + o_n(1)) P[zero row or column, or two equal (up to sign) rows or columns]. For ξ which is uniform on its support, we show that P[M_n(ξ) is singular] = (1 + o_n(1)) n P[two rows or columns are equal]. Corresponding estimates on the least singular value are also provided.

Cited by 7 publications (16 citation statements)
References 11 publications
“…2. In the i.i.d. random selection scheme, the probability that the participation matrix after τ rounds, P^(τ), has full rank is lower-bounded as follows [28]: Pr[P^(τ) has full rank] ≥ 1 − 2n(1 − p)^τ − (1 + o(1)) n(n − 1)(p² + (1 − p)²)^τ, which converges to 1 exponentially fast if p = s/n ∈ (0, 1/2) is a fixed constant. Hence, it follows that the probability the server can reconstruct all individual models is lower-bounded by the same probability.…”
Section: G1 Theoretical Analysis of the Random Selection Strategies (mentioning)
confidence: 99%
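The full-rank bound cited above can be illustrated with a quick simulation. In the sketch below (all names — n, tau, p, the trial count — are illustrative and not taken from [28]), each of n clients participates independently with probability p in each of τ rounds, and we measure how often the resulting 0/1 participation matrix attains full column rank:

```python
import numpy as np

# Hedged sketch (parameters are illustrative, not from [28]): empirically
# check that a tau x n Bernoulli(p) participation matrix has full column
# rank with probability approaching 1 as the number of rounds tau grows.
rng = np.random.default_rng(1)

def full_rank_fraction(n: int, tau: int, p: float, trials: int = 500) -> float:
    """Fraction of trials in which the tau x n participation matrix has rank n."""
    hits = 0
    for _ in range(trials):
        # Row t records which of the n clients participated in round t.
        P = (rng.random((tau, n)) < p).astype(int)
        if np.linalg.matrix_rank(P) == n:
            hits += 1
    return hits / trials

# More rounds -> higher chance the server can reconstruct all models.
print(full_rank_fraction(n=10, tau=10, p=0.3))
print(full_rank_fraction(n=10, tau=30, p=0.3))
```

The dominant failure modes mirror the two terms in the bound: a client that never participates (a zero column, probability (1 − p)^τ each) and two clients with identical participation patterns (probability (p² + (1 − p)²)^τ per pair).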
“…The proof of Proposition 2.7 follows the usual format of taking a dyadic decomposition of possible values for the threshold function, performing randomized rounding on potential kernel vectors at the correct scale, and then tensorizing the resulting small ball probabilities. The difference in the statement above compared to the versions in [8,9,17] is that we are missing k rows as opposed to 1 row. Additionally, we are considering the independent threshold model rather than the "multislice" models considered in [9], which actually simplifies the proof.…”
Section: Case II (mentioning)
confidence: 99%
“…Remark. A modification of our proof, with Proposition 2.7 replaced by the corresponding versions in [8,9], shows that for any fixed ξ which is supported on finitely many points,…”
Section: Introduction (mentioning)
confidence: 98%
“…Another major advance on the problem was made recently by Jain, Sah and Sawhney [17,18], who (building on the recent work of Litvak and Tikhomirov [26]), proved the natural analogue of (1) for random matrices with independent entries chosen from a finite set S, for any non-uniform distribution on S. For the case of {−1, 1}-matrices, however, they were unable to improve on the bound of Tikhomirov.…”
Section: Introduction (mentioning)
confidence: 99%