2013 51st Annual Allerton Conference on Communication, Control, and Computing (Allerton)
DOI: 10.1109/allerton.2013.6736575

Bounds on inference

Abstract: Lower bounds for the average probability of error of estimating a hidden variable X given an observation of a correlated random variable Y, and Fano's inequality in particular, play a central role in information theory. In this paper, we present a lower bound for the average estimation error based on the marginal distribution of X and the principal inertias of the joint distribution matrix of X and Y. Furthermore, we discuss an information measure based on the sum of the largest principal inertias, …

Cited by 33 publications (46 citation statements, published 2013–2018). References 30 publications.

Citation statements:
“…We prove that the smallest principal inertia component [3], [4] of p_{S,X} plays a central role in achieving perfect privacy: If |X| ≤ |S|, then perfect privacy is achievable with I(X; Y) > 0 if and only if the smallest principal inertia component of p_{S,X} is 0. Since I(S; Y) = 0 (perfect privacy) if and only if S ⊥⊥ Y, this fundamental result holds for any privacy metric in which statistical independence implies perfect privacy.…”
Section: Introduction (mentioning)
confidence: 99%
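For context, the principal inertia components of a joint pmf p_{S,X} are the squared singular values, excluding the trivial value 1, of the matrix diag(p_S)^(-1/2) P diag(p_X)^(-1/2), where P is the joint distribution matrix. Below is a minimal NumPy sketch (not the authors' code) with a made-up joint pmf chosen to be rank-deficient, so that its smallest principal inertia component is 0, i.e. the perfect-privacy condition quoted above holds:

```python
import numpy as np

def principal_inertia_components(P):
    """Squared singular values of diag(p_S)^(-1/2) P diag(p_X)^(-1/2),
    excluding the trivial singular value 1."""
    p_s = P.sum(axis=1)                  # marginal of S (rows)
    p_x = P.sum(axis=0)                  # marginal of X (columns)
    Q = P / np.sqrt(np.outer(p_s, p_x))  # normalized joint matrix
    sigma = np.linalg.svd(Q, compute_uv=False)
    return sigma[1:] ** 2                # drop the trivial sigma_0 = 1

# Hypothetical joint pmf of (S, X): the third row is the average of the
# first two, so the joint matrix is rank-deficient and the smallest
# principal inertia component is 0.
P = np.array([[0.2, 0.1, 0.0],
              [0.0, 0.1, 0.2],
              [0.1, 0.1, 0.1]]) / 0.9

print(principal_inertia_components(P))   # last entry is (numerically) 0
```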
“…If M is uniformly distributed and f is a one-bit function, e.g., one of the bits of the message, then the relationship between maximal correlation and the advantage follows readily from the work of Witsenhausen [8]. By the result of Calmon et al. [9], the advantage for uniformly distributed M and general f is upper-bounded by ρ. A contribution of this paper is to extend this result on the relationship between maximal correlation and the advantage to scenarios in which the distribution of M is not fixed and Eve has access to some side information about the message.…”
Section: Introduction (mentioning)
confidence: 92%
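By Witsenhausen's characterization, the maximal correlation ρ of a finite joint pmf is computable as the second-largest singular value of the same normalized joint matrix as above. The sketch below (NumPy, with a hypothetical pmf) checks the defining property numerically: no pair of functions f(M), g(Y) has a correlation coefficient exceeding ρ.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical joint pmf of (M, Y); rows index M, columns index Y.
P = np.array([[0.30, 0.10],
              [0.05, 0.25],
              [0.10, 0.20]])
p_m, p_y = P.sum(axis=1), P.sum(axis=0)
Q = P / np.sqrt(np.outer(p_m, p_y))
rho = np.linalg.svd(Q, compute_uv=False)[1]   # maximal correlation

# Randomly drawn f(M), g(Y) never correlate more strongly than rho.
for _ in range(10_000):
    f, g = rng.normal(size=3), rng.normal(size=2)
    cov = f @ P @ g - (f @ p_m) * (g @ p_y)
    var_f = ((f - f @ p_m) ** 2) @ p_m
    var_g = ((g - g @ p_y) ** 2) @ p_y
    assert abs(cov) / np.sqrt(var_f * var_g) <= rho + 1e-9
```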
“…Next, we state a result relating maximal correlation and the χ²-divergence between the joint pmf and the product of the marginal pmfs, also known as the χ² measure of correlation, which follows directly from [9].…”
Section: A. Properties of Maximal Correlation (mentioning)
confidence: 99%
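The identity behind this statement, standard in correspondence analysis, is that the χ²-divergence χ²(p_{X,Y} ‖ p_X p_Y) equals the sum of all principal inertia components, and hence upper-bounds the largest one, ρ². A short numerical check (NumPy, same hypothetical pmf as above):

```python
import numpy as np

P = np.array([[0.30, 0.10],
              [0.05, 0.25],
              [0.10, 0.20]])
prod = np.outer(P.sum(axis=1), P.sum(axis=0))   # product of marginals

chi2 = ((P - prod) ** 2 / prod).sum()           # chi^2 divergence
sigma = np.linalg.svd(P / np.sqrt(prod), compute_uv=False)

print(chi2, (sigma[1:] ** 2).sum())             # the two values agree
print(sigma[1] ** 2)                            # rho^2 <= chi2
```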
“…In this context, information-theoretic metrics for privacy are naturally well suited. In fact, the adversarial model determines the appropriate information metric: an estimating adversary that minimizes mean square error is captured by χ² measures [40], a belief-refining adversary is captured by MI [39], an adversary that can make a hard MAP decision for a specific set of private features is captured by the Arimoto MI of order ∞ [58, 59], and an adversary that can guess any function of the private features is captured by the maximal (over all distributions of the dataset for a fixed support) Sibson information of order ∞ [55, 57].…”
Section: Related Work (mentioning)
confidence: 99%
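Of these metrics, the Arimoto MI of order ∞ has a particularly simple operational form: it is the logarithm of the multiplicative gain in a MAP adversary's probability of correctly guessing the private feature after observing the released data (also known as min-entropy leakage). A minimal sketch, again with a hypothetical joint pmf:

```python
import numpy as np

# Hypothetical joint pmf; rows = private feature X, columns = observation Y.
P = np.array([[0.30, 0.10],
              [0.05, 0.25],
              [0.10, 0.20]])

prior_map = P.sum(axis=1).max()      # blind MAP guess: max_x p(x)
posterior_map = P.max(axis=0).sum()  # after seeing Y: sum_y max_x p(x, y)

# Arimoto MI of order infinity, in bits
print(np.log2(posterior_map / prior_map))
```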