2014 IEEE 27th Computer Security Foundations Symposium 2014
DOI: 10.1109/csf.2014.35
Differential Privacy: An Economic Method for Choosing Epsilon

Abstract: Differential privacy is becoming a gold-standard notion of privacy; it offers a guaranteed bound on the loss of privacy due to the release of query results, even under worst-case assumptions. The theory of differential privacy is an active research area, and there are now differentially private algorithms for a wide range of problems. However, the question of when differential privacy works in practice has received relatively little attention. In particular, there is still no rigorous method for choosing the key parame…

Cited by 197 publications (131 citation statements)
References 39 publications
“…One of them assumes an online trusted third party, similar to our CSP-centered approach. The research in [20], [12] addresses the issue of how to choose the value of privacy budget ε in practice. The former study argues that even for the same value of ε, the privacy guarantees enforced by differential privacy are different based on the domain of the data attributes and supported query types.…”
Section: Related Work
confidence: 99%
“…In our system, we choose to use the Laplacian mechanism, and the parameters that determine the Laplace distribution will be computed according to the results published in [23].…”
Section: Inputs
confidence: 99%
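The Laplace mechanism referenced in the statement above can be sketched as follows. This is a minimal illustration of the standard mechanism, not the parameter-selection procedure of the cited work; the function name and arguments are illustrative:

```python
import numpy as np

def laplace_mechanism(true_value, sensitivity, epsilon, rng=None):
    """Return a differentially private answer by adding Laplace noise.

    Noise drawn from Laplace(0, sensitivity / epsilon) gives
    epsilon-differential privacy for a query with the given L1 sensitivity.
    """
    rng = np.random.default_rng() if rng is None else rng
    return true_value + rng.laplace(loc=0.0, scale=sensitivity / epsilon)

# Example: a counting query (sensitivity 1) with privacy budget epsilon = 0.5
noisy_count = laplace_mechanism(true_value=42, sensitivity=1.0, epsilon=0.5)
```

Smaller values of epsilon yield a larger noise scale, so the choice of epsilon directly trades accuracy against privacy, which is the tension the cited paper addresses.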
“…The reason is that the curator randomly selects the encryption functions with the index and sends them to the respective parties in temporal order; the curator itself is not permitted to remember these random numbers. Consequently, the curator can re-assemble the data blocks. For the NPDDP algorithm based on the principle of differential privacy, it can be easily proved, as in [23], that the privacy of the data is well preserved within the range of the required accuracy and perturbation error. Therefore, our design goals are achieved.…”
Section: Inputs
confidence: 99%
“…Differential privacy can be explained as follows: the mechanism adds noise to the query results or to the data itself to preserve the privacy of individuals within a given privacy budget [3]. The dataset is then mapped into a histogram as shown in Fig.…”
Section: Introduction
confidence: 99%
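The histogram-based approach described in the statement above can be sketched as below, assuming per-bin Laplace noise; since each record affects exactly one bin, the histogram query has sensitivity 1. The function name is illustrative:

```python
import numpy as np

def private_histogram(data, bins, epsilon, rng=None):
    """Noisy histogram: each record changes exactly one bin count by 1,
    so sensitivity is 1 and adding Laplace(1/epsilon) noise to every bin
    gives epsilon-differential privacy for the whole histogram."""
    rng = np.random.default_rng() if rng is None else rng
    counts, edges = np.histogram(data, bins=bins)
    noisy = counts + rng.laplace(scale=1.0 / epsilon, size=counts.shape)
    return noisy, edges

# Example: 100 uniform records in 10 bins with epsilon = 1.0
noisy, edges = private_histogram(np.arange(100), bins=10, epsilon=1.0)
```

Because the bins partition the data, a single noise draw per bin suffices; no composition over bins is needed, which keeps the budget spent at exactly epsilon.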