2022
DOI: 10.48550/arxiv.2205.08532
Preprint

New Lower Bounds for Private Estimation and a Generalized Fingerprinting Lemma

Abstract: We prove new lower bounds for statistical estimation tasks under the constraint of (ε, δ)-differential privacy. First, we provide tight lower bounds for private covariance estimation of Gaussian distributions. We show that estimating the covariance matrix in Frobenius norm requires Ω(d²) samples, and in spectral norm requires Ω(d^{3/2}) samples, both matching upper bounds up to logarithmic factors. We prove these bounds via our main technical contribution, a broad generalization of the fingerprinting method [BU…

Cited by 3 publications (5 citation statements)
References 11 publications
“…Our lower bounds are loose in their dependence on log(1/δ). Recently, a promising lower bound technique has been introduced in [45] that might close this gap.…”
Section: Discussion
Confidence: 99%
“…The next lower bound shows that without such assumptions on the tail, the error due to privacy scales as Ω(d ∧ log(1/δ)/(εn)). We believe that the dependence on δ is loose, and it might be possible to get a tighter lower bound using [45]. We provide a proof and other lower bounds in Appendix C.…”
Section: Differentially Private Principal Component Analysis (DP-PCA)
Confidence: 99%
“…However, the sample complexity of these methods depends on the condition number of the covariance matrix, and requires a priori bounds on the range of the parameters. The first finite sample complexity bound for privately learning unbounded Gaussians appeared in [AAK21], nearly matching the sample complexity lower bound of [KMS22]. The work of [AAK21] relies on² …” (footnote: ² They assume there are known quantities R, σ_max, σ_min such that ∀i ∈ …)
Section: Related Work
Confidence: 99%
“…d/n) (follows from the lower bound of mean estimation [KLSU19, BS16]) and an upper bound Õ(d/n) (attained by the Gaussian mechanism). Very recently, Kamath et al. [KMS22] improve the lower bound to Ω̃(d/n) for d < O(√n), which means that the complexity of covariance estimation is the same as d²-dimensional mean estimation in the low-dimensional regime, so one cannot hope to beat the Gaussian mechanism for small d. However, in the high-dimensional regime, our result indicates that the problem is strictly easier, due to the correlations of the d² entries of Σ.…”
Section: Worst-Case Bounds
Confidence: 99%
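The Gaussian mechanism mentioned in the quote above can be sketched in a few lines: clip each sample's L2 norm, form the empirical second-moment matrix, and add symmetric Gaussian noise calibrated to the (ε, δ) budget. This is a hypothetical illustration of the generic mechanism, not the construction analyzed in [KMS22]; the function name, the clipping scheme, and the replacement-neighbor sensitivity bound are assumptions made for the sketch.

```python
import numpy as np

def dp_covariance_gaussian(X, eps, delta, clip_norm):
    """Hypothetical sketch of an (eps, delta)-DP covariance estimate.

    Rows of X are clipped to L2 norm `clip_norm`, so replacing one row
    changes the empirical second-moment matrix by at most
    2 * clip_norm**2 / n in Frobenius norm.
    """
    n, d = X.shape
    norms = np.linalg.norm(X, axis=1, keepdims=True)
    X = X * np.minimum(1.0, clip_norm / np.maximum(norms, 1e-12))
    cov = X.T @ X / n
    sensitivity = 2 * clip_norm**2 / n  # L2 sensitivity under replacement
    sigma = sensitivity * np.sqrt(2 * np.log(1.25 / delta)) / eps
    noise = np.random.normal(0.0, sigma, size=(d, d))
    noise = (noise + noise.T) / 2  # keep the released matrix symmetric
    return cov + noise
```

Each of the d² entries receives noise of standard deviation O(clip_norm² · √log(1/δ) / (εn)), which is the Õ(d/n) Frobenius-error baseline the quoted passage refers to.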
“…A. Extend the lower bound from the statistical setting to the empirical setting: [KMS22] considers the statistical setting where each X_i ∼ N(0, Σ_D) independently, with Ω(1)·I ⪯ Σ_D ⪯ O(1)·I. In this case, we have ‖X_i‖₂ = Θ(√…”
Section: B Approximate DP Error Bound for EMCov
Confidence: 99%