2000
DOI: 10.1016/s0898-1221(00)00089-4
Some upper bounds for relative entropy and applications

Abstract: In this paper we derive some upper bounds for the relative entropy D(p‖q) of two probability distributions and apply them to the mutual information and the entropy mapping. To achieve this we use an inequality for the logarithm function, (2.3) below, and some classical inequalities such as the Kantorovič inequality and the Diaz–Metcalf inequality.
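The flavour of such bounds can be illustrated with a classical consequence of the elementary inequality log x ≤ x − 1 (a simpler relative of the paper's inequality (2.3), which is not reproduced here): D(p‖q) ≤ Σᵢ pᵢ²/qᵢ − 1, i.e. relative entropy is bounded above by the χ²-divergence. The sketch below uses hypothetical helper names (`relative_entropy`, `chi_square_bound`) and assumes finite distributions with strictly positive q.

```python
import math

def relative_entropy(p, q):
    """Relative entropy D(p||q) = sum_i p_i * log(p_i / q_i), in nats.
    Terms with p_i = 0 contribute 0 by convention."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def chi_square_bound(p, q):
    """Upper bound D(p||q) <= sum_i p_i^2 / q_i - 1, obtained by applying
    log x <= x - 1 to each ratio p_i / q_i inside the sum."""
    return sum(pi * pi / qi for pi, qi in zip(p, q)) - 1.0

# Example: the bound holds and is reasonably tight for nearby distributions.
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
assert relative_entropy(p, q) <= chi_square_bound(p, q)
```

The bound is exact (both sides zero) when p = q, and degrades as mass of q approaches zero where p does not.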

Cited by 22 publications (10 citation statements)
References 6 publications
“…Our main contribution in Proposition 4 is the upper bound (the first term of the upper bound) for the KL-divergence of two Bernoulli random variables. The last term of the upper bound is a direct derivation from the upper bounds in [6]. Our result in Proposition 4 shows a factor of 2 improvement in the leading term of the upper bound.…”
Section: Algorithms Description and Analysis (mentioning)
confidence: 67%
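For two Bernoulli distributions the χ²-type bound D(p‖q) ≤ Σ pᵢ²/qᵢ − 1 specialises to the closed form (p − q)²/(q(1 − q)). This is a standard consequence of log x ≤ x − 1, shown here only as an illustrative sketch; it is not the sharper Proposition 4 bound of the citing work, and the function names below are hypothetical.

```python
import math

def kl_bernoulli(p, q):
    """KL divergence between Bernoulli(p) and Bernoulli(q), in nats,
    with the 0 * log 0 = 0 convention at the endpoints."""
    d = 0.0
    if p > 0:
        d += p * math.log(p / q)
    if p < 1:
        d += (1 - p) * math.log((1 - p) / (1 - q))
    return d

def chi_square_bound_bernoulli(p, q):
    """Two-point specialisation of sum_i p_i^2 / q_i - 1, which
    simplifies algebraically to (p - q)^2 / (q * (1 - q))."""
    return (p - q) ** 2 / (q * (1 - q))

# Example: the closed-form bound dominates the Bernoulli KL divergence.
assert kl_bernoulli(0.3, 0.5) <= chi_square_bound_bernoulli(0.3, 0.5)
```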
“…The contribution of entropy variability is a modern theory of knowledge that has been applied in several areas of science, such as in hydrology, according to , in mathematics, according to Dragomir et al. (2000), in economics, according to Kaberger et al. (2001), in ecology, according to Ricotta (2001), in climatology, by Kawachi et al. (2001), and in medicine, according to Montaño et al. (2001).…”
Section: Introduction (unclassified)
“…The Shannon entropy has since been employed in numerous areas (Singh and Rajagopal, 1987), such as mathematics (Dragomir et al, 2000), economics (Kaberger and Mansson, 2001), ecology (Ricotta, 2002), climatology (Kawachi et al, 2001), medicine (Montaño et al, 2001) and hydrology (Singh, 1997). One measure of uncertainty or disorder of a variable is entropy.…”
Section: Introduction (mentioning)
confidence: 99%