2011
DOI: 10.1109/tit.2011.2137353

On Pairs of $f$-Divergences and Their Joint Range

Abstract: We compare two f-divergences and prove that their joint range is the convex hull of the joint range for distributions supported on only two points. Some applications of this result are given.
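The theorem reduces the computation of a joint range to two-point (binary) distributions followed by a convex hull. Below is a minimal numerical sketch of that reduction for the pair (total variation, KL divergence); the helper functions kl_binary and tv_binary and the use of scipy's ConvexHull are illustrative assumptions, not code from the paper.

```python
import numpy as np
from scipy.spatial import ConvexHull  # assumption: scipy is available

def kl_binary(p, q):
    """D(P||Q) in nats for binary P=(p, 1-p), Q=(q, 1-q)."""
    out = 0.0
    for pi, qi in ((p, q), (1.0 - p, 1.0 - q)):
        if pi > 0.0:
            out += pi * np.log(pi / qi)
    return out

def tv_binary(p, q):
    """Total variation distance between binary P and Q."""
    return abs(p - q)

# Joint range over two-point distributions: sweep all binary pairs (P, Q),
# staying away from the boundary so the KL divergence stays finite.
grid = np.linspace(1e-3, 1.0 - 1e-3, 300)
pts = np.array([(tv_binary(p, q), kl_binary(p, q)) for p in grid for q in grid])

# Per the theorem, the joint range over *all* pairs of distributions is the
# convex hull of these two-point values.
hull = ConvexHull(pts)
print("number of hull vertices:", len(hull.vertices))
print("hull vertices describing the (TV, KL) joint range:\n", pts[hull.vertices])
```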

Cited by 39 publications (42 citation statements)
References 12 publications
“…[1, Lemma 6]. The improvement by a factor of 2 from (72) to (68) is also observed in [17, Remark 33], where the authors mention that our result [1, Theorem 9] (see (71)) in the conference version of this paper can be improved by a factor of 2 by using (68) instead of (72). We believe the authors of [17] may have missed our result [1, Theorem 10] (see Corollary 9) in the conference paper, which presents precisely this improvement by a factor of 2.…”
Section: B Proofs of Theorems 8 and 10 (supporting)
confidence: 78%
“…It is worth mentioning that a systematic method of deriving optimal bounds between any pair of f-divergences is given by the Harremoës-Vajda joint range [71]. However, we cannot use this technique to derive lower bounds on KL divergence using χ²-divergence, since no such general lower bound exists (when both input distributions vary) [21, Section 7.3].…”
Section: A Bounds on f-Divergences Using χ²-Divergence (mentioning)
confidence: 99%
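As an illustrative numerical check of the claim that no general lower bound on KL divergence in terms of χ²-divergence exists (the distributions below are my own example, not taken from the cited works): with P = (ε, 1-ε) and Q = (ε³, 1-ε³), the χ²-divergence grows without bound as ε → 0 while the KL divergence vanishes.

```python
import numpy as np

def kl_binary(p, q):
    """D(P||Q) in nats for binary P=(p, 1-p), Q=(q, 1-q)."""
    return p * np.log(p / q) + (1 - p) * np.log((1 - p) / (1 - q))

def chi2_binary(p, q):
    """chi^2-divergence for binary P, Q: sum_i (p_i - q_i)^2 / q_i."""
    return (p - q) ** 2 / q + (p - q) ** 2 / (1 - q)

# P = Bernoulli(eps), Q = Bernoulli(eps**3): chi^2 blows up while KL tends
# to 0, so KL cannot be bounded below by any increasing function of chi^2.
for eps in (1e-1, 1e-2, 1e-3):
    p, q = eps, eps ** 3
    print(f"eps={eps:.0e}  chi2={chi2_binary(p, q):.3g}  KL={kl_binary(p, q):.3g}")
```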
“…The most famous example is the Pinsker inequality [20], which shows that the KL divergence bounds the squared total deviation from above. More recently, the comprehensive studies of Sason and Verdú [8], Harremoës and Vajda [9], and Reid and Williamson [10] extended this result to a broader set of f-divergence inequalities. In addition, Zhang [21] introduced an important Bregman inequality in the context of statistical learning; he showed that the KL divergence bounds the squared excess risk associated with the 0-1 loss from above, and thereby controls this performance measure.…”
Section: Previous Work (mentioning)
confidence: 90%
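A quick sanity check of Pinsker's inequality, KL(P||Q) ≥ 2·TV(P,Q)², on random binary distributions (a minimal sketch; the helper function and the random pairs are illustrative, not drawn from [20] or the works cited above):

```python
import numpy as np

def kl_binary(p, q):
    """D(P||Q) in nats for binary P=(p, 1-p), Q=(q, 1-q)."""
    return p * np.log(p / q) + (1 - p) * np.log((1 - p) / (1 - q))

# Pinsker: KL(P||Q) >= 2 * TV(P, Q)^2, i.e. KL bounds the squared total
# variation from above (up to the factor 2).
rng = np.random.default_rng(0)
for _ in range(5):
    p, q = rng.uniform(0.05, 0.95, size=2)
    tv = abs(p - q)                      # total variation for binary P, Q
    d = kl_binary(p, q)
    print(f"2*TV^2 = {2 * tv**2:.4f}  <=  KL = {d:.4f}  ->  {2 * tv**2 <= d}")
```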
“…A more direct approach, in the spirit of the joint-range idea of Harremoës and Vajda [HV11], is to find (or bound) the best possible data-processing function F_I defined as follows.…”
Section: Introduction (mentioning)
confidence: 99%