2006
DOI: 10.1007/11761679_29

Our Data, Ourselves: Privacy Via Distributed Noise Generation

Abstract: In this work we provide efficient distributed protocols for generating shares of random noise, secure against malicious participants. The purpose of the noise generation is to create a distributed implementation of the privacy-preserving statistical databases described in recent papers [14,4,13]. In these databases, privacy is obtained by perturbing the true answer to a database query by the addition of a small amount of Gaussian or exponentially distributed random noise. The computational power of e…
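As context for the abstract, here is a minimal sketch of the perturbation it describes: the true answer to a query is released with additive random noise. All names and parameters are illustrative, not taken from the paper; the Laplace (two-sided exponential) sampler can be swapped for Gaussian noise via rng.normal.

```python
import numpy as np

def noisy_count(rows, predicate, epsilon, rng=None):
    """Release a counting query's answer with additive Laplace noise.

    A counting query has sensitivity 1, so noise with scale 1/epsilon
    gives epsilon-differential privacy; rng.normal with a suitably
    calibrated scale gives the Gaussian variant mentioned in the abstract.
    """
    rng = rng or np.random.default_rng()
    true_answer = sum(1 for row in rows if predicate(row))
    return true_answer + rng.laplace(loc=0.0, scale=1.0 / epsilon)
```

For example, noisy_count(records, lambda r: r["age"] > 40, epsilon=0.5) answers "how many records have age over 40?" with noise of scale 2.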

Cited by 1,265 publications (1,193 citation statements)
References 26 publications
“…However, under the standard assumption of a polynomial-time adversary, there exist secure multiparty computation techniques (e.g. [42]) that allow noisy aggregates to be computed in a distributed setting. In such a setting, users maintain their own data and participate in a protocol in order to directly compute the noisy answer under differential privacy.…”
Section: Obtaining a Differentially Private Global Prior
confidence: 99%
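The distributed setting in the quote above can be sketched as follows. This is a minimal honest-but-curious illustration of a noisy sum computed via additive secret sharing, not the malicious-secure protocol of the paper; the modulus, fixed-point scale, and function names are assumptions made for the example.

```python
import secrets

import numpy as np

Q = 2**61 - 1   # modulus for additive secret sharing (illustrative choice)
SCALE = 10**6   # fixed-point scale for encoding real-valued contributions

def make_shares(value, n_parties):
    """Additively secret-share an integer mod Q among n parties."""
    shares = [secrets.randbelow(Q) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % Q)
    return shares

def noisy_distributed_sum(private_values, sigma, rng):
    """Compute sum(private_values) + N(0, sigma^2) without a trusted curator.

    Each party adds its own slice of the Gaussian noise (variance
    sigma^2 / n per party, so the slices sum to variance sigma^2),
    secret-shares the masked value, and only the aggregate is ever
    reconstructed.
    """
    n = len(private_values)
    all_shares = []
    for v in private_values:
        noise = rng.normal(0.0, sigma / np.sqrt(n))
        masked = int(round((v + noise) * SCALE)) % Q
        all_shares.append(make_shares(masked, n))
    # Party j publishes the sum of the j-th shares it received.
    published = [sum(column) % Q for column in zip(*all_shares)]
    total = sum(published) % Q
    if total > Q // 2:          # undo the modular wrap for negative sums
        total -= Q
    return total / SCALE
```

For instance, noisy_distributed_sum([3.0, 5.0, 2.0], sigma=1.0, rng=np.random.default_rng()) returns roughly 10 plus Gaussian noise of standard deviation 1, while no single party ever sees another party's raw value.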
“…The existence of such responses violates differential privacy, even if the probability of outputting one of these responses is small. To allow for this sort of situation one can consider a slightly weaker notion of differential privacy, called (ε, δ)-differential privacy, that allows a small additive factor in the inequality [4].…”
Section: Summary Of Our Results
confidence: 99%
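For reference, the standard statement of the relaxed guarantee mentioned in this quote (the notation is the usual one, not drawn from the citing paper): a randomized mechanism M is (ε, δ)-differentially private if, for all neighboring databases x, x′ and all sets S of outputs,

```latex
\Pr[\mathcal{M}(x) \in S] \;\le\; e^{\varepsilon} \cdot \Pr[\mathcal{M}(x') \in S] + \delta
```

Setting δ = 0 recovers pure ε-differential privacy; the additive δ is exactly what tolerates the low-probability "bad" responses described above.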
“…There are several reasons to consider such relaxations of differential privacy. In practice a computational notion of security suffices, yet the stringent notion of (statistical) differential privacy rules out some mechanisms that are intuitively secure: e.g., a differentially private mechanism implemented using pseudorandom noise in place of truly random noise, or a differentially private mechanism implemented using secure multi-party computation [4,11]. One might hope that by considering a relaxed definition we can circumvent limitations or impossibility results that arise in the information-theoretic setting, in the same way that computationally secure notions of encryption allow bypassing known bounds for perfectly secure encryption.…”
Section: Introduction
confidence: 99%
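A minimal sketch of the pseudorandom-noise substitution mentioned in this quote, with SHA-256 standing in as a toy PRG (a heuristic choice for illustration, not a construction from the cited works):

```python
import hashlib

import numpy as np

def prg_laplace(seed: bytes, scale: float) -> float:
    """Sample Laplace noise from pseudorandom bits via the inverse CDF.

    Using PRG output in place of truly random bits yields a mechanism
    that is private only against computationally bounded adversaries,
    which is the computational relaxation discussed above.
    """
    digest = hashlib.sha256(seed).digest()
    # Map 8 bytes of PRG output to a uniform value in (0, 1).
    u = (int.from_bytes(digest[:8], "big") + 0.5) / 2**64
    # Inverse CDF of the zero-centered Laplace distribution.
    return -scale * float(np.sign(u - 0.5)) * np.log(1.0 - 2.0 * abs(u - 0.5))
```

Statistically, the output is no longer a true Laplace sample, so the information-theoretic definition fails, even though no efficient adversary can tell the difference.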
“…Such weaker guarantees can still be useful for other definitions of privacy [4,8]. Another relaxation, specific to the Lipschitz property, is to allow the reconstructed function F to be b-Lipschitz, that is, to require only |F(x) − F(y)| ≤ b · ‖x − y‖₁ for all x, y ∈ {0,1}^d.…”
Section: Discussion
confidence: 99%
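The relaxed b-Lipschitz condition in this quote can be checked exhaustively on small hypercubes; a sketch with illustrative names:

```python
from itertools import product

def is_b_lipschitz(F, d, b):
    """Check |F(x) - F(y)| <= b * ||x - y||_1 for all x, y in {0,1}^d.

    Exhaustive over the hypercube, so only sensible for small d.
    """
    points = list(product((0, 1), repeat=d))
    for x in points:
        for y in points:
            l1_distance = sum(abs(a - c) for a, c in zip(x, y))
            if abs(F(x) - F(y)) > b * l1_distance:
                return False
    return True

# A function that stretches Hamming distance by a factor of 2
# is 2-Lipschitz but not 1-Lipschitz:
assert is_b_lipschitz(lambda x: 2 * sum(x), d=3, b=2)
assert not is_b_lipschitz(lambda x: 2 * sum(x), d=3, b=1)
```

Since ℓ1 distance on the hypercube decomposes along edges, checking only Hamming-adjacent pairs would suffice, but the all-pairs loop mirrors the quantifier in the definition.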