1965
DOI: 10.1109/tit.1965.1053768
The convolution inequality for entropy powers

Cited by 296 publications (358 citation statements). References 2 publications.
“…We first provide a direct proof of this relation, and then use it to unify and simplify existing proofs of the EPI via FI and via MMSE. In particular, two essential ingredients, namely, Fisher's information inequality [7], [12], and a related inequality for MMSE [9], [10], will be shown to be equivalent from (10).…”
Section: Proofs of the EPI via FI and MMSE Revisited
Mentioning confidence: 99%
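The quoted statement concerns the entropy power inequality (EPI), N(X+Y) ≥ N(X) + N(Y) for independent X and Y, where N(X) = exp(2h(X))/(2πe) and h is differential entropy. As an illustrative sketch (not drawn from the cited works), the following NumPy snippet checks the Gaussian equality case numerically; the grid bounds, resolution, and variances are arbitrary choices:

```python
import numpy as np

# Grid for numerical density estimates
x = np.linspace(-30, 30, 2**12)
dx = x[1] - x[0]

def gauss(sigma):
    """Centered Gaussian density with standard deviation sigma on the grid."""
    return np.exp(-x**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))

def diff_entropy(p):
    """Differential entropy (nats) of a density sampled on the grid."""
    p = p / (p.sum() * dx)          # renormalize against discretization error
    m = p > 0
    return -np.sum(p[m] * np.log(p[m])) * dx

def entropy_power(h):
    """Entropy power N = exp(2h) / (2*pi*e)."""
    return np.exp(2 * h) / (2 * np.pi * np.e)

pX, pY = gauss(1.0), gauss(2.0)
pZ = np.convolve(pX, pY, mode="same") * dx   # density of the sum X + Y

NX = entropy_power(diff_entropy(pX))   # ~ Var(X) = 1 for a Gaussian
NY = entropy_power(diff_entropy(pY))   # ~ Var(Y) = 4
NZ = entropy_power(diff_entropy(pZ))   # ~ 5: EPI holds with equality

# EPI: N(X+Y) >= N(X) + N(Y); Gaussians attain equality
assert NZ >= NX + NY - 1e-2
```

For Gaussians the entropy power equals the variance, so the sum's entropy power is (numerically) the sum of the inputs' entropy powers, which is exactly the EPI equality case.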
“…The FI representation (25) can be used similarly, yielding essentially Stam's proof of the EPI [6], [7]. These proofs are sketched below.…”
Section: Equivalent Integral Representations of Differential Entropy
Mentioning confidence: 99%
“…This concavity has a classical analogue, first proved by Costa in [8]. Later, Dembo [12,11] simplified the proof, by an argument based on the so-called Blachman-Stam inequality [4]. More recently, Villani [44] gave a direct proof of the same inequality.…”
Section: Proofs of Theorem and Theorem
Mentioning confidence: 92%
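The concavity this statement refers to is Costa's theorem: the entropy power N(X + √t Z) is concave in t, where Z is standard Gaussian noise independent of X. As a hedged numerical sketch (not any of the cited proofs), the snippet below checks the midpoint concavity condition for a uniform input; the grid, noise levels, and input distribution are illustrative choices:

```python
import numpy as np

x = np.linspace(-30, 30, 2**12)
dx = x[1] - x[0]

def gauss(sigma):
    """Centered Gaussian density with standard deviation sigma on the grid."""
    return np.exp(-x**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))

def diff_entropy(p):
    """Differential entropy (nats) of a density sampled on the grid."""
    p = p / (p.sum() * dx)
    m = p > 0
    return -np.sum(p[m] * np.log(p[m])) * dx

# Non-Gaussian input: X uniform on [-1, 1]
pX = np.where(np.abs(x) <= 1, 0.5, 0.0)

def N(t):
    """Entropy power of X + sqrt(t) * Z for standard Gaussian Z."""
    p = np.convolve(pX, gauss(np.sqrt(t)), mode="same") * dx
    h = diff_entropy(p)
    return np.exp(2 * h) / (2 * np.pi * np.e)

Ns = [N(t) for t in (0.2, 0.6, 1.0)]
# Costa's concavity in t: the midpoint value dominates the chord average
assert Ns[1] >= (Ns[0] + Ns[2]) / 2
```

As the noise variance t grows, the smoothed density approaches a Gaussian and N(t) approaches the linear Gaussian benchmark 1/3 + t from below, so the concavity gap shrinks with t.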
“…This technique can be traced back to Shannon (1948). There are some known relations that connect the two information concepts [29][30][31]. Shannon's entropy can be, but is not always, the thermodynamic, Boltzmann entropy [28].…”
Section: Introduction
Mentioning confidence: 99%