2016
DOI: 10.3390/info7010015
Information Extraction Under Privacy Constraints

Abstract: A privacy-constrained information extraction problem is considered where for a pair of correlated discrete random variables (X, Y) governed by a given joint distribution, an agent observes Y and wants to convey to a potentially public user as much information about Y as possible while limiting the amount of information revealed about X. To this end, the so-called rate-privacy function is investigated to quantify the maximal amount of information (measured in terms of mutual information) that can be extracted f…
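The trade-off described in the abstract can be illustrated numerically: for a fixed joint distribution of (X, Y), a privacy mechanism p(z | y) yields a utility I(Y; Z) and a leakage I(X; Z). The following is a minimal sketch, not taken from the paper, using an assumed toy binary joint distribution and a randomized-response mechanism with flip probability delta as the (hypothetical) mechanism family:

```python
import numpy as np

def mutual_information(p_joint):
    """Mutual information I(A;B) in bits for a joint pmf matrix p_joint[a, b]."""
    pa = p_joint.sum(axis=1, keepdims=True)
    pb = p_joint.sum(axis=0, keepdims=True)
    mask = p_joint > 0
    return float((p_joint[mask] * np.log2(p_joint[mask] / (pa @ pb)[mask])).sum())

# Assumed toy joint pmf p(x, y): X is the private variable, Y is observed.
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])

def rate_privacy_point(p_xy, delta):
    """Return (leakage I(X;Z), utility I(Y;Z)) for a randomized-response Z."""
    flip = np.array([[1 - delta, delta],
                     [delta, 1 - delta]])   # mechanism p(z | y)
    p_y = p_xy.sum(axis=0)
    p_yz = p_y[:, None] * flip              # joint p(y, z)
    p_xz = p_xy @ flip                      # joint p(x, z)
    return mutual_information(p_xz), mutual_information(p_yz)

for delta in (0.0, 0.1, 0.3, 0.5):
    leak, util = rate_privacy_point(p_xy, delta)
    print(f"delta={delta:.1f}: I(X;Z)={leak:.3f} bits, I(Y;Z)={util:.3f} bits")
```

At delta = 0 the mechanism releases Y verbatim (maximal utility and maximal leakage); at delta = 0.5 both mutual informations vanish. Sweeping delta traces one achievable curve inside the region whose upper boundary the paper's rate-privacy function characterizes.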

Cited by 63 publications (97 citation statements)
References 50 publications
“…Other quantities from the information-theoretic literature have been used to quantify privacy and utility. For example, Asoodeh et al [14] and Calmon et al [16] used estimation-theoretic tools to characterize fundamental limits of privacy. Liao et al [37,38] explored the PUT within a hypothesis testing framework.…”
Section: Related Work (mentioning)
confidence: 99%
“…The feasibility of this goal depends on several factors, including the chosen privacy and utility metrics, as well as the topology and distribution of the data. The information-theoretic approach to privacy, notably the results by Sankar et al [10,11], Issa et al [12,13], Asoodeh et al [14,15], and Calmon et al [16,17], among others, seeks to quantify the best possible PUT for any privacy mechanism. In those works, information-theoretic quantities, such as mutual information and maximal leakage [12,13], have been used to characterize privacy, and bounds on the fundamental PUT were derived under assumptions on the distribution of the data.…”
Section: Introduction (mentioning)
confidence: 99%
“…$\| p_{Z|H}(\cdot \mid 1) - p_{Z|H}(\cdot \mid 0) \|_{TV} \le 2\,I(H;Z) \le 2\,I(H;X \mid G_\bullet) + \ldots$, and the lower bound in (8) follows from (28). We next prove the upper bound. Fix a subset $\Gamma \in \mathcal{Z}_s$, and consider a $p_{Z|X}$ such that
$$p_{Z|X}(z \mid x) = \begin{cases} A, & x \in I_+,\ z \in \Gamma, \\ B, & x \in I_-,\ z \in \Gamma, \end{cases} \qquad p_{Z|X}(z \mid x) = \begin{cases} B, & x \in I_+,\ z \in \Gamma^{c}, \\ A, & x \in I_-,\ z \in \Gamma^{c}, \end{cases}$$
for some $A, B \ge 0$.…”
(mentioning)
confidence: 86%
“…This specific type of privacy-utility trade-off (PUT), and the optimal privacy mechanisms associated with it, has been investigated for several measures of privacy and utility; see, for example, [14,17,20,22]. These investigations often rely on the implicit assumption that the data distribution is, for the most part, known.…”
Section: Problem Setup (mentioning)
confidence: 99%