2020
DOI: 10.3390/e22050563

On Relations Between the Relative Entropy and χ²-Divergence, Generalizations and Applications

Abstract: The relative entropy and the chi-squared divergence are fundamental divergence measures in information theory and statistics. This paper is focused on a study of integral relations between the two divergences, the implications of these relations, their information-theoretic applications, and some generalizations pertaining to the rich class of f-divergences. Applications that are studied in this paper refer to lossless compression, the method of types and large deviations, strong data-processing inequalities, …
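For orientation, one integral relation of the kind announced in the abstract can be written out explicitly; the identity below is a standard mixture-path construction, given here as an illustration rather than as a verbatim statement from the paper:

$$ D(P\|Q) \;=\; \int_0^1 \frac{\chi^2\big(P \,\big\|\, (1-t)P + tQ\big)}{t}\, \mathrm{d}t , $$

which follows by differentiating $D\big(P \,\|\, (1-t)P + tQ\big)$ with respect to $t$ and checking that the derivative equals $\chi^2\big(P \,\|\, (1-t)P + tQ\big)/t$.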

Cited by 21 publications (24 citation statements). References 51 publications.

“…Here, we propose that comparing eigenmodes is a more generic and appropriate method for describing a difference between Hermitian and non-Hermitian systems. For this reason, we introduce the notion of relative entropy (also known as the Kullback–Leibler divergence) [26,27]. The relative entropy is a measure of the difference between two probability distributions on a common probability space.…”
Section: Relative Entropy for Hermitian and Non-Hermitian Systems
Mentioning confidence: 99%
“…Accordingly, we need to overcome this problem by considering a different disparity, not of eigenvalues but of eigenmodes in open systems. For this, we exploit the notion of relative entropy [26,27], which is typically used to measure the difference between two probability distribution functions, to quantify the difference between the Hermitian and non-Hermitian eigenmodes. We expect that adapting the relative entropy can be advantageous for future work on optical microcavities, as in information theory.…”
Section: Introduction
Mentioning confidence: 99%
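To make the eigenmode comparison concrete, here is a minimal sketch of quantifying the difference between two mode intensity profiles via the relative entropy. The mode shapes below are hypothetical placeholders, not the cavity modes studied in the citing work:

```python
import numpy as np
from scipy.stats import entropy  # entropy(p, q) returns the relative entropy D(p||q)

# Hypothetical eigenmode amplitudes sampled on a 1-D grid (illustrative only)
x = np.linspace(-3.0, 3.0, 200)
psi_hermitian = np.exp(-x**2)             # a Gaussian-like mode profile
psi_nonhermitian = np.exp(-(x - 0.4)**2)  # a slightly shifted mode profile

# Normalize |psi|^2 into probability distributions and compare them
p = np.abs(psi_hermitian)**2
q = np.abs(psi_nonhermitian)**2
p /= p.sum()
q /= q.sum()

print("relative entropy D(p||q) =", entropy(p, q))
```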
“…We need the following lemma, originally proved by Audenaert in the quantum setting [28]. It is based on a differential relationship between the skew divergence [12] and the [15] (see [29,30]).…”
Section: Skew Divergences
Mentioning confidence: 99%
“…Let P and Q be distributions defined on a common probability space that have densities p and q with respect to a dominating measure $\mu$. The relative entropy (or Kullback–Leibler divergence) is defined according to $D(P\|Q) = \int p \log\frac{p}{q}\,\mathrm{d}\mu$, and the chi-squared divergence is defined as $\chi^2(P\|Q) = \int \frac{(p-q)^2}{q}\,\mathrm{d}\mu$. Both of these divergences can be seen as special cases of the general class of f-divergence measures, and there exists a rich literature on comparisons between different divergences [8,26,27,28,29,30,31,32]. The chi-squared divergence can also be viewed as the squared $L^2(Q)$ distance between the density ratio $p/q$ and the constant function 1.…”
Section: Bounds on Mutual Information
Mentioning confidence: 99%
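A minimal numerical sketch of these two definitions for discrete distributions; the code and function names are ours, added for illustration and not taken from the cited paper:

```python
import numpy as np

def kl_divergence(p, q):
    """Relative entropy D(P||Q) = sum_i p_i * log(p_i / q_i), in nats.

    Assumes p and q are probability vectors with q_i > 0 wherever p_i > 0.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0  # terms with p_i = 0 contribute zero by convention
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def chi_squared_divergence(p, q):
    """Chi-squared divergence chi^2(P||Q) = sum_i (p_i - q_i)^2 / q_i."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return float(np.sum((p - q) ** 2 / q))

# Example on a three-letter alphabet
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
print(kl_divergence(p, q), chi_squared_divergence(p, q))
```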
“…The chi-squared divergence can also be interpreted as the first non-zero term in the power series expansion of the relative entropy [26, Lemma 4]. More generally, the chi-squared divergence provides an upper bound on the relative entropy via $D(P\|Q) \le \log\big(1 + \chi^2(P\|Q)\big)$. The proof of this inequality follows straightforwardly from Jensen’s inequality and the concavity of the logarithm; see [27,31,32] for further refinements.…”
Section: Bounds on Mutual Information
Mentioning confidence: 99%
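Spelling out the Jensen step referenced above (a standard derivation, reproduced here for convenience rather than quoted from the citing paper):

$$ D(P\|Q) \;=\; \mathbb{E}_P\!\left[\log \frac{dP}{dQ}\right] \;\le\; \log \mathbb{E}_P\!\left[\frac{dP}{dQ}\right] \;=\; \log \int \frac{p^2}{q}\,\mathrm{d}\mu \;=\; \log\big(1 + \chi^2(P\|Q)\big), $$

where the inequality is Jensen’s inequality applied to the concave logarithm, and the last equality uses $\int \frac{p^2}{q}\,\mathrm{d}\mu = \int \frac{(p-q)^2}{q}\,\mathrm{d}\mu + 2\int p\,\mathrm{d}\mu - \int q\,\mathrm{d}\mu = \chi^2(P\|Q) + 1$.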