2020
DOI: 10.1109/tit.2019.2939472

On the Robustness of Information-Theoretic Privacy Measures and Mechanisms

Abstract: Consider a data publishing setting for a dataset composed of both private and non-private features. The publisher uses an empirical distribution, estimated from n i.i.d. samples, to design a privacy mechanism which is then applied to fresh samples. In this paper, we study the discrepancy between the privacy-utility guarantees for the empirical distribution, used to design the privacy mechanism, and those for the true distribution, experienced by the privacy mechanism in practice. We first show that, f…
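As a rough illustration of the setting described in the abstract, the sketch below (Python, entirely my own toy example: the joint distribution P_true, the mechanism W, and the sample size are arbitrary assumptions, not anything from the paper) estimates an empirical joint distribution of a private feature S and a non-private feature X from n i.i.d. samples, fixes a release mechanism P(Z | X), and compares the leakage I(S; Z) a designer would compute from the empirical distribution with the leakage the same mechanism has under the true distribution.

import numpy as np

rng = np.random.default_rng(0)

# Assumed true joint pmf P(S, X): rows are the private feature S in {0, 1},
# columns the non-private feature X in {0, 1, 2}.  Purely illustrative numbers.
P_true = np.array([[0.20, 0.15, 0.15],
                   [0.10, 0.25, 0.15]])

# A hypothetical release mechanism P(Z | X): each row is a distribution over Z.
W = np.array([[0.8, 0.2],
              [0.5, 0.5],
              [0.2, 0.8]])

def mutual_information(P):
    """Shannon mutual information (nats) of a 2-D joint pmf P."""
    Pa = P.sum(axis=1, keepdims=True)
    Pb = P.sum(axis=0, keepdims=True)
    m = P > 0
    return float(np.sum(P[m] * np.log(P[m] / (Pa @ Pb)[m])))

def leakage(P_SX, mechanism):
    """I(S; Z) when Z is produced from X through the channel `mechanism`."""
    return mutual_information(P_SX @ mechanism)   # P(S, Z) = sum_x P(S, x) W(x, z)

n = 500
counts = rng.multinomial(n, P_true.ravel()).reshape(P_true.shape)
P_hat = counts / n                                # empirical joint distribution
print("leakage believed under the empirical distribution:", leakage(P_hat, W))
print("leakage incurred under the true distribution:     ", leakage(P_true, W))

With n = 500 the two printed values typically differ noticeably, which is the kind of discrepancy the paper studies.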

Cited by 43 publications (29 citation statements)
References: 79 publications

“…Following this observation, we introduced a large family of optimization problems, which we call bottleneck problems, by replacing mutual information in IB and PF with Arimoto's mutual information [21] or f-information [22]. Invoking results from [31], [32], we also demonstrated that these information measures are in general easier to estimate from data than mutual information. Similar to IB and PF, the bottleneck problems were shown to be fully characterized by boundaries of a two-dimensional convex set parameterized by two real-valued non-negative functions Φ and Ψ.…”
Section: Summary and Concluding Remarks
Mentioning confidence: 94%
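For readers who want to see the measures named in the quote concretely, here is a minimal sketch (my own, not code from the cited works) of Arimoto's mutual information of order α and of an f-information, both evaluated on a discrete joint pmf; the example pmf, the order α = 2, and the choice f(t) = (t − 1)² (which gives χ²-information) are arbitrary assumptions.

import numpy as np

def renyi_entropy(p, alpha):
    """Renyi entropy of order alpha (nats) of a pmf p, for alpha != 1."""
    return float(np.log(np.sum(p ** alpha)) / (1.0 - alpha))

def arimoto_mutual_information(P, alpha):
    """Arimoto's mutual information I_alpha(X; Y) = H_alpha(X) - H_alpha^A(X|Y)
    for a joint pmf P with rows indexed by X and columns by Y (alpha != 1)."""
    h_x = renyi_entropy(P.sum(axis=1), alpha)
    h_x_given_y = (alpha / (1.0 - alpha)) * np.log(
        np.sum(np.sum(P ** alpha, axis=0) ** (1.0 / alpha)))
    return float(h_x - h_x_given_y)

def f_information(P, f):
    """I_f(X; Y) = sum_{x,y} P(x)P(y) * f(P(x, y) / (P(x)P(y)))."""
    Px = P.sum(axis=1, keepdims=True)
    Py = P.sum(axis=0, keepdims=True)
    Q = Px @ Py                                  # product of the marginals
    return float(np.sum(Q * f(P / Q)))

# Arbitrary example joint pmf (rows: X, columns: Y) -- an assumption of mine.
P = np.array([[0.20, 0.15, 0.15],
              [0.10, 0.25, 0.15]])
print("Arimoto I_2(X; Y):", arimoto_mutual_information(P, alpha=2.0))
print("chi^2-information:", f_information(P, lambda t: (t - 1.0) ** 2))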
“…These bounds can then be used to shed light on the de facto guarantee of the bottleneck problems. Relying on [34] (Theorem 1), one can obtain that the gaps between these measures computed on the empirical distribution and on the true one scale as O(1/√n), where n is the number of samples. This is in contrast with mutual information, for which the analogous upper bound scales as O(log n/√n), as shown in [33].…”
Section: Family of Bottleneck Problems
Mentioning confidence: 99%
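A quick Monte Carlo sketch (again my own, with an arbitrary joint pmf, not a computation from the cited works) of the 1/√n behaviour referred to above: if the gap between the empirical and the true value of Arimoto's mutual information of order 2 decays like 1/√n, then √n times the average gap should stay roughly constant as n grows.

import numpy as np

rng = np.random.default_rng(1)

# Arbitrary true joint pmf (an assumption of mine).
P = np.array([[0.20, 0.15, 0.15],
              [0.10, 0.25, 0.15]])

def arimoto_i2(P):
    """Arimoto's mutual information of order alpha = 2 (nats)."""
    h2_x = -np.log(np.sum(P.sum(axis=1) ** 2))
    h2_x_given_y = -2.0 * np.log(np.sum(np.sqrt(np.sum(P ** 2, axis=0))))
    return float(h2_x - h2_x_given_y)

for n in [1_000, 10_000, 100_000]:
    gaps = []
    for _ in range(200):                          # Monte Carlo repetitions
        P_hat = rng.multinomial(n, P.ravel()).reshape(P.shape) / n
        gaps.append(abs(arimoto_i2(P_hat) - arimoto_i2(P)))
    print(f"n = {n:>6d}   sqrt(n) * mean gap = {np.sqrt(n) * np.mean(gaps):.3f}")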
“…Quantifying this information leakage is important in order to limit it. Different notions of privacy leakage have been proposed to capture an adversary's capacity to estimate private information, for example Shannon's mutual information and differential privacy, among others [159], as well as various leakage measures. In that sense, privacy can, under careful control, tolerate some leakage in exchange for utility.…”
Section: Privacy
Mentioning confidence: 99%
“…To apply this test in practice, an upper bound on ε_α is needed, so to maximize the power of a provably correct finite-sample test we seek upper bounds on Pr[V ≥ ε] which are meaningful (less than 1) for ε as small as possible. Equivalently, tight control on ε reduces the number of samples needed to obtain a given level of significance, which is of importance in areas as disparate as high-dimensional statistics [4], combinatorial constructions in complexity theory [5], and private machine learning [6].…”
Section: Introduction
Mentioning confidence: 99%
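To make the last sentence of the quote concrete, the following small computation (with made-up tail bounds and constants, not bounds from the cited paper) compares two hypothetical bounds of the form Pr[V ≥ ε] ≤ C·exp(−n·ε²/c); the tighter bound certifies the same ε at the same significance level with fewer samples.

import math

alpha = 0.05                                      # target significance level
eps = 0.10                                        # the eps the bound must certify

def samples_needed(C, c):
    """Smallest n with C * exp(-n * eps**2 / c) <= alpha (assumed bound form)."""
    return math.ceil(c * math.log(C / alpha) / eps ** 2)

print("looser bound  (C=4, c=8):", samples_needed(C=4.0, c=8.0), "samples")
print("tighter bound (C=2, c=2):", samples_needed(C=2.0, c=2.0), "samples")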