2016
DOI: 10.1007/978-3-662-53641-4_23
Separating Computational and Statistical Differential Privacy in the Client-Server Model

Abstract. Differential privacy is a mathematical definition of privacy for statistical data analysis. It guarantees that any (possibly adversarial) data analyst is unable to learn too much information that is specific to an individual. Mironov et al. (CRYPTO 2009) proposed several computational relaxations of differential privacy (CDP), which relax this guarantee to hold only against computationally bounded adversaries. Their work and subsequent work showed that CDP can yield substantial accuracy improvements…
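As a concrete illustration of the information-theoretic guarantee the abstract describes (not part of this paper's construction), the classic Laplace mechanism releases a counting query with ε-differential privacy by adding noise with scale 1/ε; a minimal sketch:

```python
import math
import random

def laplace_mechanism(true_count: float, epsilon: float,
                      sensitivity: float = 1.0) -> float:
    """Release a statistic with epsilon-DP by adding Laplace(0, sensitivity/epsilon)
    noise — the standard Laplace mechanism."""
    scale = sensitivity / epsilon
    # Sample Laplace noise via the inverse-CDF transform of a uniform draw.
    u = random.random() - 0.5
    noise = -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
    return true_count + noise

# Example: privatize a count of 100 with epsilon = 1.
print(laplace_mechanism(100, 1.0))
```

Smaller ε means larger noise and hence stronger privacy but worse accuracy; CDP relaxes the guarantee to hold only against efficient distinguishers, which is what enables the accuracy improvements the abstract mentions.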

Cited by 5 publications (4 citation statements)
References 30 publications (26 reference statements)
“…The IND-CDP notion allows much more accurate functionalities in the two-party setting [12]; in the traditional client-server setup there is a natural class of functionalities where the gap between IND-CDP and (ε, δ)-DP is minimal [13], and there are (contrived) examples where the computational relaxation permits tasks that are infeasible under information-theoretic definitions [14].…”
Section: Differential Privacy and Its Flavors
confidence: 99%
“…Under the indistinguishability-based Computational Differential Privacy (IND-CDP) definition [3], the test of closeness between distributions on adjacent inputs is computationally bounded (all other definitions considered in this paper hold against an unbounded, information-theoretic adversary). The IND-CDP notion allows much more accurate functionalities in the two-party setting [12]; in the traditional client-server setup there is a natural class of functionalities where the gap between IND-CDP and (ε, δ)-DP is minimal [13], and there are (contrived) examples where the computational relaxation permits tasks that are infeasible under information-theoretic definitions [14].…”
Section: Definition 1 (ε-DP): A Randomized Mechanism
confidence: 99%
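The ε-DP closeness test that the quoted passage says IND-CDP relaxes can be stated precisely; the following is the standard textbook formulation, not a quote from this paper:

```latex
% A randomized mechanism M is \epsilon-differentially private if, for all
% neighboring datasets D, D' (differing in one individual's record) and
% every measurable set S of outputs:
\Pr[M(D) \in S] \;\le\; e^{\epsilon} \cdot \Pr[M(D') \in S].
% IND-CDP relaxes this test: the distributions M(D) and M(D') need only be
% computationally indistinguishable to polynomial-time adversaries.
```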
“…Indeed, Groce, Katz, and Yerukhimovich [18] showed that a wide range of CDP mechanisms can be converted into an (information-theoretic) DP mechanism. Bun, Chen, and Vadhan [4] showed that under (unnatural) cryptographic assumptions, there exists a (single-party) task that can be efficiently solved using CDP, but is infeasible (not impossible) for information-theoretic DP. Yet, the existence of a stronger separation (i.e., one that implies the impossibility for information-theoretic DP) remains open (in particular, under more standard cryptographic assumptions).…”
Section: Additional Related Work on Computational Differential Privacy
confidence: 99%
“…they showed that any two-party ε-differentially private protocol for the inner product must incur an additive error of Ω(√n/(e^ε · log n)). Computational Differential Privacy (CDP). Motivated by the above limitations on multiparty differential privacy, Beimel, Nissim, and Omri [2] and Mironov, Pandey, Reingold, and Vadhan [31] considered protocols that only guarantee a computational analog of differential privacy.…”
Section: Introduction
confidence: 99%