2000
DOI: 10.1109/18.850703

Some inequalities for information divergence and related measures of discrimination

Abstract: Inequalities which connect information divergence with other measures of discrimination or distance between probability distributions are used in information theory and its applications to mathematical statistics, ergodic theory and other scientific fields. We suggest new inequalities of this type, often based on underlying identities. As a consequence we obtain certain improvements of the well known Pinsker inequality. Our study depends on two measures of discrimination, called capacitory discrimination and t…
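The abstract refers to improvements of the Pinsker inequality, which bounds the variational (L1) distance by the information divergence. As an illustrative sketch only, assuming the standard form D(P‖Q) ≥ ½‖P − Q‖₁² with D in nats (none of this code comes from the paper), a quick numerical check in Python:

```python
import numpy as np

def kl_divergence(p, q):
    """Information divergence D(P||Q) in nats; assumes q > 0 wherever p > 0."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def l1_distance(p, q):
    """Variational (L1) distance ||P - Q||_1 = sum_i |p_i - q_i|."""
    return float(np.sum(np.abs(np.asarray(p, float) - np.asarray(q, float))))

# Pinsker's inequality in its standard form: D(P||Q) >= (1/2) * ||P - Q||_1^2.
rng = np.random.default_rng(0)
for _ in range(5):
    p = rng.dirichlet(np.ones(4))
    q = rng.dirichlet(np.ones(4))
    assert kl_divergence(p, q) >= 0.5 * l1_distance(p, q) ** 2
print("Pinsker's bound holds on the sampled pairs.")
```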

Cited by 210 publications (167 citation statements) · References 25 publications
“…One way to proceed in this direction is to empirically show a constant factor relationship between these two measurements. In an earlier work by Topsøe [37] it was shown that Jensen-Shannon divergence (referred to as capacitory discrimination) behaves similarly with the triangle divergence (triangular discrimination):…”
Section: Transforming Divergences (mentioning)
confidence: 96%
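To make the quoted relationship concrete, here is a minimal sketch (not taken from either paper) that computes the capacitory discrimination C(P,Q) = D(P‖M) + D(Q‖M), with M = (P+Q)/2, and the triangular discrimination Δ(P,Q) = Σᵢ (pᵢ − qᵢ)²/(pᵢ + qᵢ) for random distributions and prints their ratio; the bracketing constants ½ and ln 2 mentioned in the comments are the bounds usually attributed to Topsøe's paper and are stated here as an assumption:

```python
import numpy as np

def capacitory_discrimination(p, q):
    """C(P,Q) = D(P||M) + D(Q||M) with M = (P+Q)/2; equals twice the
    Jensen-Shannon divergence (computed in nats)."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    m = 0.5 * (p + q)
    def kl(a, b):
        mask = a > 0
        return float(np.sum(a[mask] * np.log(a[mask] / b[mask])))
    return kl(p, m) + kl(q, m)

def triangular_discrimination(p, q):
    """Delta(P,Q) = sum_i (p_i - q_i)^2 / (p_i + q_i), with 0/0 taken as 0."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    s = p + q
    mask = s > 0
    return float(np.sum((p[mask] - q[mask]) ** 2 / s[mask]))

# Empirically, the ratio C/Delta stays within a narrow constant band
# (assuming the bounds Delta/2 <= C <= ln(2)*Delta, i.e. a ratio in
# roughly [0.5, 0.693]); this is an illustrative check, not a proof.
rng = np.random.default_rng(1)
for _ in range(5):
    p = rng.dirichlet(np.ones(5))
    q = rng.dirichlet(np.ones(5))
    c, d = capacitory_discrimination(p, q), triangular_discrimination(p, q)
    print(f"C = {c:.4f}, Delta = {d:.4f}, ratio = {c / d:.4f}")
```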
“…It has been shown that √JS satisfies the triangle inequality and that it is a Hilbertian metric [24], [25].…”
Section: A. The Jensen-Shannon Divergence (mentioning)
confidence: 99%
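A minimal sketch, not taken from the cited works, that spot-checks the triangle inequality for √JS on randomly drawn distributions (JS is computed in nats here; a different logarithm base only rescales the values):

```python
import numpy as np

def js_divergence(p, q):
    """Jensen-Shannon divergence JS(P,Q) = (1/2) D(P||M) + (1/2) D(Q||M),
    with M = (P+Q)/2, computed in nats."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    m = 0.5 * (p + q)
    def kl(a, b):
        mask = a > 0
        return float(np.sum(a[mask] * np.log(a[mask] / b[mask])))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

# sqrt(JS) is a metric, so d(P,R) <= d(P,Q) + d(Q,R) should hold for any
# triple of distributions; we spot-check this on random Dirichlet samples.
rng = np.random.default_rng(2)
for _ in range(100):
    p, q, r = (rng.dirichlet(np.ones(6)) for _ in range(3))
    d = lambda a, b: np.sqrt(js_divergence(a, b))
    assert d(p, r) <= d(p, q) + d(q, r) + 1e-12
print("Triangle inequality held on all sampled triples.")
```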
“…Of course, this includes the denormalized Shannon entropy (3) as a particular case (for q = 1). Although, for the Shannon entropy case, part of the proof is in [27], [25], [23], we present a general proof here.…”
Section: B. Jensen-Shannon and Tsallis Kernels (mentioning)
confidence: 99%
“…It can be shown that √JS satisfies the triangle inequality and is a Hilbertian metric (Endres & Schindelin, 2003; Topsøe, 2000), which has motivated its use in kernel-based machine learning.…”
Section: Jensen-Shannon (JS) Divergence (mentioning)
confidence: 99%
“…Of course, this includes the denormalized Shannon entropy as a particular case, corresponding to q = 1. Partial proofs are given by Berg et al. (1984), Topsøe (2000), and Cuturi et al. (2005); we present here a complete proof.…”
Section: Jensen-Shannon and Tsallis Kernels (mentioning)
confidence: 99%
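As a hedged illustration of the q = 1 special case mentioned above (assuming the standard Tsallis form S_q(P) = (1 − Σᵢ pᵢ^q)/(q − 1), which the snippet does not spell out), a short Python sketch showing the Tsallis entropy approaching the Shannon entropy as q → 1:

```python
import numpy as np

def tsallis_entropy(p, q):
    """Standard Tsallis entropy S_q(P) = (1 - sum_i p_i^q) / (q - 1)."""
    p = np.asarray(p, float)
    p = p[p > 0]
    return float((1.0 - np.sum(p ** q)) / (q - 1.0))

def shannon_entropy(p):
    """Shannon entropy in nats: -sum_i p_i * ln p_i."""
    p = np.asarray(p, float)
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))

# As q -> 1, the Tsallis entropy converges to the Shannon entropy.
p = np.array([0.5, 0.25, 0.125, 0.125])
for q in (1.5, 1.1, 1.01, 1.001):
    print(f"q = {q:<6} S_q = {tsallis_entropy(p, q):.6f}")
print(f"Shannon     H   = {shannon_entropy(p):.6f}")
```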