2013 IEEE 54th Annual Symposium on Foundations of Computer Science
DOI: 10.1109/focs.2013.54
Coupled-Worlds Privacy: Exploiting Adversarial Uncertainty in Statistical Data Privacy

Cited by 60 publications (93 citation statements). References 10 publications.
“…Several other works, most notably the Pufferfish and the coupled-worlds frameworks [15], [16], propose different stability constraints on the output distribution of privacy-preserving mechanisms. Although they differ in what distributions are compared, their notion of closeness is the same as in (ε, δ)-DP.…”
Section: Differential Privacy and Its Flavors (mentioning)
confidence: 99%
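For context, the (ε, δ)-closeness invoked in this statement is the standard indistinguishability condition of approximate differential privacy; a minimal sketch in standard notation (not taken verbatim from the cited papers): for a mechanism M, neighbouring datasets D and D′, and every measurable set S of outputs,
\[
\Pr[M(D) \in S] \;\le\; e^{\varepsilon}\,\Pr[M(D') \in S] + \delta .
\]
Frameworks such as Pufferfish and coupled-worlds keep this form of closeness but change which pairs of distributions are required to be close.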
“…Our definition can be seen as an instantiation of this general framework. This is in contrast to other kinds of relaxations of differential privacy, which relax the worst-case assumptions on the prior beliefs of an attacker as in Bassily et al. (12), or the worst-case collusion assumptions on collections of data analysts as in Kearns et al. (13). Several works have also proposed assigning different differential privacy parameters to different individuals (see, e.g., ref.…”
Section: Significance (mentioning)
confidence: 93%
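The coupled-worlds relaxation referred to here replaces the worst-case quantification over neighbouring datasets with a quantification over a family of data distributions. Roughly, and only as a hedged sketch of the distributional form (simplified notation; the paper's own definition is simulator-based and more general), a mechanism M is (ε, δ)-distributionally private if there is a simulator Sim that never sees record i such that, for every distribution in the allowed family, every index i, value v, and output set S,
\[
\Pr\big[M(X) \in S \mid X_i = v\big] \;\le\; e^{\varepsilon}\,\Pr\big[\mathrm{Sim}(X_{-i}) \in S \mid X_i = v\big] + \delta,
\]
and symmetrically with M and Sim exchanged; the adversary's residual uncertainty about the remaining records is what makes such a simulation possible.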
“…Previous works mainly focus on the unbounded-DP case, and thus are not directly applicable to situations where the size of the dataset is public. Furthermore, previously considered adversarial priors are either uniform [13,2] or only allow for a fixed number of known entities [8,15]. Finally, very few results are known on how to design general mechanisms satisfying distributional variants of DP.…”
Section: Relation To Prior Work (mentioning)
confidence: 99%
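As background for the bounded/unbounded distinction drawn in this statement (standard terminology, not a claim of the cited paper): unbounded DP compares datasets that differ by adding or removing one record, so the dataset size itself is protected, whereas bounded DP fixes a publicly known size n and compares datasets that differ by replacing one record,
\[
\text{unbounded: } D' = D \cup \{x\} \ \text{or}\ D' = D \setminus \{x\}, \qquad
\text{bounded: } |D| = |D'| = n \ \text{and}\ D, D' \ \text{differ in one entry.}
\]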
“…Works on Differential Privacy under Sampling [13], Crowd-Blending Privacy [8], Coupled-Worlds Privacy [2] or Outlier Privacy [15] have shown that if sufficiently many users are indistinguishable by a mechanism, and this mechanism operates on a dataset obtained through a robust sampling procedure, differential privacy can be satisfied with only little data perturbation. Our work differs in that we make no assumptions on the indistinguishability of different entities, and that our aim is to guarantee membership privacy rather than differential privacy.…”
Section: Relation To Prior Work (mentioning)
confidence: 99%
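The amplification effect of sampling alluded to above can be made quantitative by the standard subsampling lemma (a well-known result stated here only for context, not a claim of the cited works): if M is (ε, δ)-DP and each record is included in its input independently with probability q (Poisson subsampling), the subsampled mechanism satisfies (ε′, qδ)-DP with
\[
\varepsilon' = \ln\!\big(1 + q\,(e^{\varepsilon} - 1)\big) \approx q\,\varepsilon \quad \text{for small } \varepsilon,
\]
which is one way to see why a robust sampling step lets a mechanism with little added perturbation still meet a differential privacy guarantee.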