Abstract: The relative entropy and the chi-squared divergence are fundamental divergence measures in information theory and statistics. This paper is focused on a study of integral relations between the two divergences, the implications of these relations, their information-theoretic applications, and some generalizations pertaining to the rich class of f-divergences. Applications that are studied in this paper refer to lossless compression, the method of types and large deviations, strong data-processing inequalities, …
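To make the idea of an integral relation between the two divergences concrete, the following minimal sketch numerically verifies one well-known identity of this kind, $D(P\|Q) = \int_0^1 \chi^2\bigl(P \,\|\, (1-t)P + tQ\bigr)\, \frac{dt}{t}$, on discrete distributions. It is offered only as an illustration of the type of relation the abstract refers to, not as the specific result derived in the paper; the distributions and function names below are made up for the example.

# Numerical check of an integral relation between relative entropy and
# chi-squared divergence: D(P||Q) = integral over t in (0,1] of
# chi2(P || (1-t)P + tQ) / t. Discrete distributions, counting measure.
import numpy as np

def kl(p, q):
    # Relative entropy D(P||Q) for discrete distributions with q > 0.
    return float(np.sum(p * np.log(p / q)))

def chi2(p, q):
    # Chi-squared divergence chi^2(P||Q) for discrete distributions with q > 0.
    return float(np.sum((p - q) ** 2 / q))

p = np.array([0.2, 0.5, 0.3])
q = np.array([0.4, 0.4, 0.2])

# The integrand behaves like (constant * t) near t = 0, so the 1/t factor
# introduces no actual singularity; starting the grid at 1e-6 is harmless.
ts = np.linspace(1e-6, 1.0, 20_001)
integrand = np.array([chi2(p, (1 - t) * p + t * q) / t for t in ts])
integral = np.trapz(integrand, ts)

print(f"D(P||Q)              = {kl(p, q):.6f}")
print(f"integral of chi2 / t = {integral:.6f}")  # should agree to several decimals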
“…Here, we propose that comparing eigenmodes is a more generic and appropriate method for describing a difference between Hermitian and non-Hermitian systems. For this reason, we introduce the notion of relative entropy (also known as the Kullback–Leibler divergence) [26, 27]. The relative entropy is a measure of the difference between two probability distributions on a common probability space.…”
Section: Relative Entropy for Hermitian and Non-Hermitian Systems
“…Accordingly, we need to overcome this problem by considering a different disparity, not of the eigenvalues but of the eigenmodes in open systems. For this, we exploit the notion of relative entropy [26, 27], which is typically used to measure the difference between two probability distribution functions, to quantify the difference between the Hermitian and non-Hermitian eigenmodes. We expect that adopting the relative entropy can be advantageous for future work on optical microcavities, as it has been in information theory.…”
We employ the relative entropy as a measure to quantify the difference in eigenmodes between Hermitian and non-Hermitian systems in elliptic optical microcavities. We find that the average value of the relative entropy is large in the range of the collective Lamb shift, while it is small in the range of the self-energy. Furthermore, the weak and strong interactions in the non-Hermitian system exhibit rather different behaviors in terms of the relative entropy, which thus reveals a clear exchange of eigenmodes in the elliptic microcavity.
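A minimal sketch of this use of relative entropy, assuming each eigenmode is available as a sampled intensity profile that can be normalized into a probability distribution; the grid, the Gaussian "modes", and the function names below are purely illustrative and are not taken from the cited work.

# Hedged sketch: quantify the difference between two eigenmodes via relative
# entropy, treating each sampled intensity profile as a probability vector.
import numpy as np

def to_distribution(intensity, eps=1e-12):
    # Normalize a nonnegative intensity profile into a probability vector;
    # eps guards against exact zeros inside the logarithm below.
    intensity = np.asarray(intensity, dtype=float) + eps
    return intensity / intensity.sum()

def relative_entropy(p, q):
    # Kullback-Leibler divergence D(P||Q) between two discrete distributions.
    return float(np.sum(p * np.log(p / q)))

x = np.linspace(-1.0, 1.0, 501)
hermitian_mode     = to_distribution(np.exp(-(x / 0.30) ** 2))            # toy stand-in
non_hermitian_mode = to_distribution(np.exp(-((x - 0.1) / 0.35) ** 2))    # toy stand-in

print(f"D(Hermitian || non-Hermitian) = {relative_entropy(hermitian_mode, non_hermitian_mode):.4f}")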
“…We need the following lemma, originally proved by Audenaert in the quantum setting [28]. It is based on a differential relationship between the skew divergence [12] and the [15] (see [29, 30]).…”
We consider a sub-class of the f-divergences satisfying a stronger convexity property, which we refer to as strongly convex, or κ-convex divergences. We derive new and old relationships, based on convexity arguments, between popular f-divergences.
“…Let P and Q be distributions defined on a common probability space that have densities p and q with respect to a dominating measure $\mu$. The relative entropy (or Kullback–Leibler divergence) is defined according to
$$D(P \,\|\, Q) = \int p \log\frac{p}{q} \, d\mu,$$
and the chi-squared divergence is defined as
$$\chi^2(P \,\|\, Q) = \int \frac{(p - q)^2}{q} \, d\mu.$$
Both of these divergences can be seen as special cases of the general class of f-divergence measures, and there exists a rich literature on comparisons between different divergences [8, 26, 27, 28, 29, 30, 31, 32]. The chi-squared divergence can also be viewed as the squared weighted $L^2$ distance between the densities p and q (with weight $1/q$ under $\mu$).…”
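As a concrete illustration of these definitions, the sketch below evaluates both divergences on discrete distributions (so the dominating measure is the counting measure), expressing each through a single f-divergence helper with its standard convex generator; the distributions are arbitrary examples chosen here, not taken from the cited work.

# Both divergences as f-divergences D_f(P||Q) = sum_i q_i * f(p_i / q_i):
# f(t) = t*log(t) gives the relative entropy, f(t) = (t-1)^2 gives chi-squared.
import numpy as np

def f_divergence(p, q, f):
    # Generic f-divergence for strictly positive discrete distributions.
    p, q = np.asarray(p, float), np.asarray(q, float)
    return float(np.sum(q * f(p / q)))

kl   = lambda p, q: f_divergence(p, q, lambda t: t * np.log(t))    # D(P||Q)
chi2 = lambda p, q: f_divergence(p, q, lambda t: (t - 1.0) ** 2)   # chi^2(P||Q)

P = np.array([0.1, 0.6, 0.3])
Q = np.array([0.3, 0.3, 0.4])
print(f"D(P||Q)    = {kl(P, Q):.4f}")
print(f"chi2(P||Q) = {chi2(P, Q):.4f}")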
Section: Bounds on Mutual Information
“…The chi-squared divergence can also be interpreted as the first non-zero term in the power series expansion of the relative entropy ([26], Lemma 4). More generally, the chi-squared divergence provides an upper bound on the relative entropy via
$$D(P \,\|\, Q) \le \log\bigl(1 + \chi^2(P \,\|\, Q)\bigr).$$
The proof of this inequality follows straightforwardly from Jensen’s inequality and the concavity of the logarithm; see [27, 31, 32] for further refinements.…”
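A quick numerical check of this bound on a few randomly drawn discrete distributions; this is illustrative only and does not reproduce the refinements cited above.

# Verify D(P||Q) <= log(1 + chi2(P||Q)) on random distributions.
import numpy as np

rng = np.random.default_rng(0)

def kl(p, q):
    return float(np.sum(p * np.log(p / q)))

def chi2(p, q):
    return float(np.sum((p - q) ** 2 / q))

for _ in range(5):
    p = rng.dirichlet(np.ones(4))
    q = rng.dirichlet(np.ones(4))
    d, bound = kl(p, q), np.log1p(chi2(p, q))
    print(f"D = {d:.4f}  <=  log(1 + chi2) = {bound:.4f}  ({d <= bound})")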
This paper explores some applications of a two-moment inequality for the integral of the r-th power of a function, where 0 < r < 1. The first contribution is an upper bound on the Rényi entropy of a random vector in terms of the two different moments. When one of the moments is the zeroth moment, these bounds recover previous results based on maximum entropy distributions under a single moment constraint. More generally, evaluation of the bound with two carefully chosen nonzero moments can lead to significant improvements with a modest increase in complexity. The second contribution is a method for upper bounding mutual information in terms of certain integrals with respect to the variance of the conditional density. The bounds have a number of useful properties arising from the connection with variance decompositions.
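For orientation, the sketch below numerically evaluates the quantity being bounded, the differential Rényi entropy $h_r(X) = \frac{1}{1-r}\log \int f(x)^r\,dx$ with $0 < r < 1$, for a standard Gaussian, and checks it against the known closed form. The paper's two-moment bound itself is not reproduced here; the function names and parameters are illustrative.

# Differential Renyi entropy of order r via numerical quadrature, compared
# against the closed form for a Gaussian N(0, sigma^2).
import numpy as np

def renyi_entropy_numeric(density, grid, r):
    # h_r = (1 / (1 - r)) * log( integral of density(x)^r dx ), trapezoidal rule.
    return np.log(np.trapz(density(grid) ** r, grid)) / (1.0 - r)

sigma, r = 1.0, 0.5
gauss = lambda x: np.exp(-x ** 2 / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))
x = np.linspace(-12.0, 12.0, 20_001)

numeric = renyi_entropy_numeric(gauss, x, r)
closed_form = np.log(sigma * np.sqrt(2 * np.pi)) + np.log(r) / (2 * (r - 1))
print(f"numeric h_r     = {numeric:.6f}")
print(f"closed form h_r = {closed_form:.6f}")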