2020
DOI: 10.1080/01621459.2019.1700130
Privacy-Preserving Parametric Inference: A Case for Robust Statistics

Abstract: Differential privacy is a cryptographically-motivated approach to privacy that has become a very active field of research over the last decade in theoretical computer science and machine learning. In this paradigm one assumes there is a trusted curator who holds the data of individuals in a database and the goal of privacy is to simultaneously protect individual data while allowing the release of global characteristics of the database. In this setting we introduce a general framework for parametric inference w…

Cited by 30 publications (17 citation statements).
References 51 publications (95 reference statements).
“… Then we find that actually $\|W_t\|_2^2/\sigma^2$ follows a chi-square distribution $\chi^2(d)$. By the concentration of the chi-square distribution, there exist constants $c_0, c_1, c_2$ such that:…”
Section: Proof of Theorem 5.2 (mentioning)
confidence: 86%
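The excerpt invokes "concentration of the chi-square distribution" without stating the bound it uses. One standard form it could be appealing to is the Laurent–Massart inequality, reproduced below for context; the constants $c_0, c_1, c_2$ in the quoted proof would then follow from a bound of this type. This is an inference from the excerpt, not something it states.

```latex
% Laurent--Massart (2000) concentration bound for Z ~ chi^2(d), valid for all t > 0:
\Pr\!\left[\, Z \ge d + 2\sqrt{dt} + 2t \,\right] \le e^{-t},
\qquad
\Pr\!\left[\, Z \le d - 2\sqrt{dt} \,\right] \le e^{-t}.
```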
“…The trade-off between statistical accuracy and privacy is one of the fundamental topics in differential privacy. In the low-dimensional setting, there are various works focusing on this trade-off, including mean estimation [28,52,8,34,10,36], confidence intervals of Gaussian mean [37] and binomial mean [4], linear regression [55,10], generalized linear models [51,11,50], principal component analysis [30,12], convex empirical risk minimization [7], and robust M-estimators [2,3].…”
Section: Introduction (mentioning)
confidence: 99%
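The quoted passage surveys the accuracy–privacy trade-off across many estimation problems. As a point of reference for where that trade-off comes from, here is a minimal Python sketch of the simplest such setting: Gaussian-mechanism mean estimation with a clipping bound, where the added noise has standard deviation of order clip/(n·eps). The clipping range and the calibration constant are generic textbook choices, not taken from any of the works cited above.

```python
import numpy as np

def private_mean(x, eps, delta, clip=1.0, rng=None):
    """Gaussian-mechanism mean of records clipped to [-clip, clip].

    Illustrates the accuracy/privacy trade-off in the simplest setting:
    the noise standard deviation scales like clip / (n * eps), so the
    private estimate loses accuracy as eps shrinks.  Standard textbook
    calibration, intended only as a sketch.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.clip(np.asarray(x, dtype=float), -clip, clip)
    n = len(x)
    sensitivity = 2.0 * clip / n                       # sensitivity of the clipped mean
    sigma = sensitivity * np.sqrt(2.0 * np.log(1.25 / delta)) / eps
    return x.mean() + sigma * rng.normal()
```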
“…Note that, in general, computing smooth sensitivity is also computationally inefficient, with the exception of [AM21]. Using smooth sensitivity, [Lei11, Smi11, CH12, AM21] leverage robust M-estimators for differentially private estimation and inference.…”
Section: Test (mentioning)
confidence: 99%
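The excerpt contrasts the cost of computing smooth sensitivity with its use in calibrating noise for robust M-estimators. As a concrete illustration of what is being computed, the sketch below gives the smooth sensitivity of the sample median, the textbook example from Nissim, Raskhodnikova, and Smith; the data range [0, lam] and the smoothing parameter beta are choices the analyst must make. Noise calibrated to this quantity (with constants from that paper, omitted here) then yields a differentially private median; the code is a sketch, not the procedure of any paper quoted above.

```python
import numpy as np

def smooth_sensitivity_median(x, lam, beta):
    """Smooth sensitivity of the median for data assumed to lie in [0, lam],
    following the standard Nissim-Raskhodnikova-Smith construction:
        A_k(x)     = max_{0 <= t <= k+1} (x[m+t] - x[m+t-k-1]),
        S*_beta(x) = max_{k >= 0} exp(-beta * k) * A_k(x),
    with out-of-range order statistics clipped to 0 (left) and lam (right).
    """
    x = np.sort(np.clip(np.asarray(x, dtype=float), 0.0, lam))
    n = len(x)
    m = (n + 1) // 2 - 1                   # 0-based index of the (lower) median

    def order_stat(i):
        if i < 0:
            return 0.0
        if i >= n:
            return lam
        return x[i]

    s_star = 0.0
    for k in range(n + 1):
        a_k = max(order_stat(m + t) - order_stat(m + t - k - 1)
                  for t in range(k + 2))
        s_star = max(s_star, np.exp(-beta * k) * a_k)
    return s_star

# Example: unlike the global sensitivity of the median (which is lam no matter
# what the sample looks like), the smooth sensitivity adapts to the data.
data = np.array([0.1, 0.2, 0.25, 0.3, 0.9])
print(smooth_sensitivity_median(data, lam=1.0, beta=0.1))
```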
“…For example, after defining the above robustness measures, Chaudhuri and Hsu (2012) use the notion of GES to deliver convergence rates for differentially private statistical estimators, while Avella-Medina (2019) uses this measure to calibrate the additive noise to deliver differentially private M-estimators. In the next sections we explore another approach, suggested but not studied in Chaudhuri and Hsu (2012) and Avella-Medina (2019), where we investigate the use of bounded M-estimation for differentially private estimation and prediction using the OPM. While empirical risk minimization, on which objective perturbation is built, can be classified as M-estimation, it is not straightforward to integrate the standard bounded functions for M-estimation as is.…”
Section: Links with Robust Statistics (mentioning)
confidence: 99%
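The quoted passage describes calibrating additive noise through the gross-error sensitivity (GES) of a bounded M-estimator. The sketch below illustrates that general idea for a Huber location estimate: fit with a bounded psi function, then add Gaussian noise whose scale is driven by an estimate of the GES divided by the sample size. This is emphatically not the exact procedure of Avella-Medina (2019) or Chaudhuri and Hsu (2012); the GES plug-in, the non-private MAD scale, and the Gaussian calibration constant are all simplifying assumptions made for illustration.

```python
import numpy as np

def huber_psi(r, c=1.345):
    """Huber's bounded psi (influence) function."""
    return np.clip(r, -c, c)

def huber_location(x, c=1.345, tol=1e-8, max_iter=200):
    """Huber location M-estimator via iteratively reweighted means,
    with a fixed MAD-based scale estimate."""
    x = np.asarray(x, dtype=float)
    mu = np.median(x)
    s = np.median(np.abs(x - mu)) / 0.6745          # MAD scale, treated as fixed
    for _ in range(max_iter):
        r = (x - mu) / s
        w = np.ones_like(r)
        nz = np.abs(r) > 1e-12
        w[nz] = huber_psi(r[nz], c) / r[nz]         # Huber IRLS weights
        mu_new = np.sum(w * x) / np.sum(w)
        if abs(mu_new - mu) < tol:
            break
        mu = mu_new
    return mu, s

def noisy_huber_location(x, eps, delta, c=1.345, rng=None):
    """Bounded M-estimate plus Gaussian noise scaled by an estimated
    gross-error sensitivity over n.  Illustrative sketch only: the GES
    estimate, the non-private scale s, and the sqrt(2 log(1.25/delta))/eps
    factor are assumptions, not the calibration of any one cited paper.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x, dtype=float)
    n = len(x)
    mu, s = huber_location(x, c)
    r = (x - mu) / s
    # GES of the Huber location estimator ~ c * s / E[psi'(r)], with E[psi'(r)]
    # estimated by the fraction of non-truncated residuals.
    ges = c * s / max(np.mean(np.abs(r) < c), 1e-3)
    sigma = ges * np.sqrt(2.0 * np.log(1.25 / delta)) / (n * eps)
    return mu + sigma * rng.normal()
```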