2018 IEEE International Symposium on Information Theory (ISIT)
DOI: 10.1109/isit.2018.8437735

The Utility Cost of Robust Privacy Guarantees

Abstract: Consider a data publishing setting for a data set with public and private features. The objective of the publisher is to maximize the amount of information about the public features in a revealed data set, while keeping the information leaked about the private features bounded. The goal of this paper is to analyze the performance of privacy mechanisms that are constructed to match the distribution learned from the data set. Two distinct scenarios are considered: (i) mechanisms are designed to provide a privacy…
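The setting in the abstract (reveal a variable that is informative about the public feature while bounding leakage about the private feature) can be illustrated with a minimal numerical sketch. The joint distribution and the binary symmetric release channel below are invented for illustration and are not the paper's construction; utility and leakage are measured here with Shannon mutual information, one of the measures discussed in the citing works.

```python
import numpy as np

# Toy illustration (not the paper's mechanism): a private feature X and a
# public feature Y with joint pmf P(X, Y), and a release U produced from Y
# by a noisy channel P(U | Y). Utility = I(U; Y), leakage = I(U; X).

def mutual_information(p_joint):
    """Mutual information in bits for a joint pmf given as a 2-D array."""
    p_a = p_joint.sum(axis=1, keepdims=True)
    p_b = p_joint.sum(axis=0, keepdims=True)
    mask = p_joint > 0
    return float(np.sum(p_joint[mask] * np.log2(p_joint[mask] / (p_a @ p_b)[mask])))

# Correlated private/public features: P(X, Y), rows = X, columns = Y.
p_xy = np.array([[0.35, 0.10],
                 [0.05, 0.50]])

# Release mechanism P(U | Y): a binary symmetric channel with noise eps.
eps = 0.2
p_u_given_y = np.array([[1 - eps, eps],
                        [eps, 1 - eps]])

# Joint pmfs P(Y, U) and P(X, U); U depends on X only through Y.
p_y = p_xy.sum(axis=0)
p_yu = p_y[:, None] * p_u_given_y   # P(Y, U)
p_xu = p_xy @ p_u_given_y           # P(X, U), using the Markov chain X - Y - U

utility = mutual_information(p_yu)  # I(U; Y): information revealed about Y
leakage = mutual_information(p_xu)  # I(U; X): information leaked about X
print(f"utility I(U;Y) = {utility:.3f} bits, leakage I(U;X) = {leakage:.3f} bits")
```

Because X - Y - U forms a Markov chain, the data-processing inequality guarantees the leakage never exceeds the utility in this construction; tuning `eps` trades one against the other.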

Cited by 19 publications (14 citation statements)
References 14 publications
“…In this regard, [14], [16], [23] propose a training method based on the application of the Generative Adversarial Network (GAN) framework [24], which can be cast as a minimax game between two parties, as a data-driven approach to address this problem. As related work, [25] analyzes the performance of privacy-preserving release mechanisms under partial knowledge of the input distribution for different privacy measures. It is important to note that the proposed privacy measure, i.e., T(•; •), guarantees pointwise and uniform privacy according to [25, Theorems 1, 2].…”
Section: Evaluation of the Optimal Utility-Privacy Trade-off
Citation type: mentioning (confidence: 99%)
“…As related work, [25] analyzes the performance of privacy-preserving release mechanisms under partial knowledge of the input distribution for different privacy measures. It is important to note that the proposed privacy measure, i.e., T(•; •), guarantees pointwise and uniform privacy according to [25, Theorems 1, 2]. An extension of the current work is to address the utility-privacy trade-off under the privacy measure T(X; U) when only a limited number of observed data samples are available to the release mechanism.…”
Section: Evaluation of the Optimal Utility-Privacy Trade-off
Citation type: mentioning (confidence: 99%)
“…The feasible ball B_D(x) around x is defined in (9). For the distribution-independent PUT in (13), we have…”
Section: A Proof of Theorem
Citation type: mentioning (confidence: 99%)
“…The exact nature of the privacy-utility trade-off (PUT) will depend to varying degrees on the distribution of the underlying data, as well as the chosen metrics (e.g., differential privacy [1], mutual information (MI) [2], [3], f-divergence-based leakage measures [4], maximal leakage (MaxL) [5]). Furthermore, most information-theoretic PUTs capture utility as a statistical average of desired measures of fidelity [6]–[9]. This, in turn, simplifies the PUT to a single-letter optimization for independent and identically distributed (i.i.d.)…”
Section: Introduction
Citation type: mentioning (confidence: 99%)
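Two of the leakage measures named in the excerpt above can be compared numerically. The sketch below, with an illustrative channel P(U|X) and prior not taken from any of the cited works, computes Shannon mutual information I(X;U) and maximal leakage MaxL(X→U) = log₂ Σ_u max_x P(u|x) for the same mechanism; maximal leakage depends only on the channel and upper-bounds the mutual information for any prior.

```python
import numpy as np

def mutual_information_bits(p_x, p_u_given_x):
    """Shannon mutual information I(X;U) in bits."""
    p_xu = p_x[:, None] * p_u_given_x          # joint P(X, U)
    p_u = p_xu.sum(axis=0)                     # marginal P(U)
    mask = p_xu > 0
    return float(np.sum(p_xu[mask] *
                        np.log2(p_xu[mask] / (p_x[:, None] * p_u[None, :])[mask])))

def maximal_leakage_bits(p_u_given_x):
    """MaxL(X -> U) = log2 sum_u max_x P(u|x); prior-independent."""
    return float(np.log2(p_u_given_x.max(axis=0).sum()))

# Illustrative prior and channel (rows = X, columns = U).
p_x = np.array([0.5, 0.5])
p_u_given_x = np.array([[0.9, 0.1],
                        [0.3, 0.7]])

mi = mutual_information_bits(p_x, p_u_given_x)
maxl = maximal_leakage_bits(p_u_given_x)
print(f"I(X;U) = {mi:.3f} bits, MaxL = {maxl:.3f} bits")
```

The gap between the two values shows why the choice of metric shapes the resulting PUT: a mechanism tuned to bound average-case MI leakage can still have noticeably larger worst-case (maximal) leakage.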
“…Some examples are Arimoto's MI [6], Sibson's MI [7], and Csiszár's MI [8]. Recently, new information leakage measures, such as f-information [9] and f-leakage [10], have been proposed as privacy measures in the privacy-utility trade-off (PUT) problem. In addition to these measures, information leakage measures with operational meanings have been proposed by assuming a "guessing" adversary.…”
Section: Introduction
Citation type: mentioning (confidence: 99%)
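Sibson's MI, mentioned in the excerpt above, is a one-parameter family of order α that interpolates between leakage measures: it approaches Shannon MI as α → 1 and maximal leakage as α → ∞. A hedged sketch, reusing an illustrative prior and channel (not from the cited works):

```python
import numpy as np

def sibson_mi_bits(p_x, p_u_given_x, alpha):
    """Sibson mutual information of order alpha, in bits:
    I_alpha(X;U) = alpha/(alpha-1) * log2 sum_u (sum_x p(x) p(u|x)^alpha)^(1/alpha)."""
    inner = (p_x[:, None] * p_u_given_x ** alpha).sum(axis=0) ** (1.0 / alpha)
    return float(alpha / (alpha - 1) * np.log2(inner.sum()))

# Illustrative prior and channel (rows = X, columns = U).
p_x = np.array([0.5, 0.5])
p_u_given_x = np.array([[0.9, 0.1],
                        [0.3, 0.7]])

vals = {a: sibson_mi_bits(p_x, p_u_given_x, a) for a in (1.5, 2.0, 10.0, 100.0)}
for a, v in vals.items():
    print(f"alpha = {a:6.1f}: I_alpha(X;U) = {v:.3f} bits")
```

Sibson MI is nondecreasing in α, and for large α the value approaches the maximal leakage of the channel, log₂ Σ_u max_x P(u|x), which is log₂(1.6) for this example.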