2011
DOI: 10.1109/tit.2010.2090193
Information Theoretic Proofs of Entropy Power Inequalities

Abstract: While most useful information theoretic inequalities can be deduced from the basic properties of entropy or mutual information, up to now Shannon's entropy power inequality (EPI) is an exception: Existing information theoretic proofs of the EPI hinge on representations of differential entropy using either Fisher information or minimum mean-square error (MMSE), which are derived from de Bruijn's identity. In this paper, we first present a unified view of these proofs, showing that they share two essential ingr…

Cited by 124 publications (59 citation statements)
References 101 publications
“…Under this finite logarithmic constraint, the differential entropy of X is well defined and is such that −∞ ≤ h(X) < +∞ (Proposition 1, [5]) and that of Y exists and is finite (Lemma 1, [5]). Also, when X ∈ L, the identity I(X + Z; Z) = h(X + Z) − h(X) always holds (Lemma 1, [5]).…”
Section: Results
confidence: 99%
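The quoted identity I(X + Z; Z) = h(X + Z) − h(X) follows from a standard chain of differential-entropy steps. The sketch below assumes X and Z are independent (an assumption here; the cited Lemma 1 of [5] states the precise regularity conditions under which each term is well defined):

```latex
\begin{align*}
I(X+Z;\,Z) &= h(X+Z) - h(X+Z \mid Z) \\
           &= h(X+Z) - h(X \mid Z)   && \text{(translation invariance of $h$ given $Z$)} \\
           &= h(X+Z) - h(X)          && \text{(independence of $X$ and $Z$)}
\end{align*}
```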
“…The result of Theorem 1 is more powerful than the IIE in Equation (5). Indeed, using the fact that h(Z) ≤ h(X + Z), inequality Equation (10) gives the looser inequality:…”
Section: -
confidence: 95%