2010
DOI: 10.1198/jasa.2009.tm08651
A Statistical Framework for Differential Privacy

Abstract: One goal of statistical privacy research is to construct a data release mechanism that protects individual privacy while preserving information content. An example is a random mechanism that takes an input database X and outputs a random database Z according to a distribution Q_n(·|X). Differential privacy is a particular privacy requirement developed by computer scientists in which Q_n(·|X) is required to be insensitive to changes in one data point in X. This makes it difficult to infer from Z whether a giv…
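The abstract's requirement that Q_n(·|X) be insensitive to changes in one data point is commonly met with the Laplace mechanism. As a minimal illustrative sketch (not the paper's own construction; the function and parameter names here are assumptions), a query answer is released with Laplace noise whose scale is the query's sensitivity divided by the privacy parameter epsilon:

```python
import numpy as np

def laplace_mechanism(x, query, sensitivity, epsilon, rng=None):
    """Release query(x) plus Laplace noise of scale sensitivity/epsilon.

    `sensitivity` must bound how much query(x) can change when one
    record of x is modified; that bound is what makes the output
    distribution insensitive to any single data point.
    """
    if rng is None:
        rng = np.random.default_rng()
    return query(x) + rng.laplace(scale=sensitivity / epsilon)

# Example: a private mean of values known to lie in [0, 1].
# Changing one of n records moves the mean by at most 1/n.
x = np.array([0.2, 0.5, 0.9, 0.4])
sensitivity = 1.0 / len(x)
private_mean = laplace_mechanism(x, np.mean, sensitivity, epsilon=1.0)
```

Smaller epsilon means larger noise and stronger privacy; the released value is a random draw, so repeated calls give different outputs for the same database.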

Cited by 326 publications (313 citation statements); references 37 publications.
“…The argument we have just given can be interpreted in terms of hypothesis testing; see, for example, Wasserman and Zhou [62] for more discussion.…”
Section: Differential Privacy
Confidence: 99%
“…Even more recently, Wasserman and Zhou [62] considered differentially-private nonparametric techniques. That work came to our attention after the initial version of this survey was written; we do not discuss it in detail here, except to note that certain nonparametric techniques, such as histogram-based estimation, are directly amenable to the sensitivity-based noise addition discussed above.…”
Section: Differential Privacy and Statistics
Confidence: 99%
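The "sensitivity-based noise addition" this citation mentions is especially simple for histograms: changing one record alters exactly one bin count by 1, so the sensitivity is 1 regardless of the number of bins. A minimal NumPy sketch under that assumption (illustrative names, not the cited authors' implementation):

```python
import numpy as np

def dp_histogram(data, bins, epsilon, rng=None):
    """Release a histogram with Laplace(1/epsilon) noise on each count.

    Because one record affects exactly one bin count by at most 1,
    per-bin Laplace noise of scale 1/epsilon suffices for
    epsilon-differential privacy of the full count vector.
    """
    if rng is None:
        rng = np.random.default_rng()
    counts, edges = np.histogram(data, bins=bins)
    noisy_counts = counts + rng.laplace(scale=1.0 / epsilon, size=counts.shape)
    return noisy_counts, edges
```

Noisy counts can be negative or non-integer; a downstream estimator would typically clip or round them before normalizing into a density estimate.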
“…Because quantifying the level of protection and the utility a given DLM provides is difficult (Lambert 1993), comparing DLMs (and thus choosing a method) is not straightforward. Indeed, the level of protection offered by a DLM usually depends on characteristics of the data being published and is usually only quantified with certain restrictions on how the data can be accessed (see, for example, Wasserman and Zhou 2010). Measures of the utility, on the other hand, often depend on the intended purpose of the data.…”
Section: Introduction
Confidence: 99%