Entropy Power Inequality for the Rényi Entropy
2015
DOI: 10.1109/tit.2014.2383379

Cited by 74 publications (90 citation statements)
References 14 publications
“…Madiman [11] used basic information-theoretic relations to prove the submodularity of the entropy of independent sums and accordingly found upper bounds on the discrete and differential entropy of sums. Although, in its general form, upper bounding the differential entropy of independent sums is not always possible (Proposition 4, [2]), several results are known in particular settings. Cover et al. [12] solved the problem of maximizing the differential entropy of the sum of dependent RVs having the same marginal log-concave densities.…”
(mentioning; confidence: 99%)
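As a point of reference for the submodularity result quoted above (a standard formulation attributed to Madiman; the precise statement in [11] may differ): for independent real RVs X_1, X_2, X_3 with finite differential entropies,

    h(X_1 + X_2 + X_3) + h(X_3) \le h(X_1 + X_3) + h(X_2 + X_3),

which, rearranged, upper bounds the entropy of the full sum by h(X_1 + X_3) + h(X_2 + X_3) - h(X_3).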
“…The EPI states that, given two independent real RVs X, Z such that h(X), h(Z) and h(X + Z) exist, then (Corollary 3, [2])…”
(mentioning; confidence: 99%)
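For context, a standard statement of the Shannon EPI that the truncated quote leads into (not necessarily the exact form of Corollary 3 in [2]): for independent real-valued X and Z with h(X), h(Z) and h(X + Z) defined,

    N(X + Z) \ge N(X) + N(Z), \qquad N(X) := \frac{1}{2\pi e}\, e^{2 h(X)},

with equality when X and Z are Gaussian.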
“…Remark 4.6. Recent work of Bobkov and Chistyakov [4] and of Wang and Madiman [22] has studied formulations of Shannon's Entropy Power Inequality (EPI) for the Rényi entropy of the convolution of probability densities f_i in R^d. To be specific, Theorem 1 of [4] shows that for q > 1, the Rényi entropy power N_{R,q} satisfies…”
Section: For q > 1, if the Rényi entropy is concave, then so is the T… (mentioning; confidence: 99%)
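For readability of the truncated statement above, the Rényi entropy and the associated entropy power are commonly defined as follows (a standard normalization; [4] and [22] may use slightly different constants): for a probability density f on \mathbb{R}^d and q > 1,

    h_q(f) = \frac{1}{1-q} \log \int_{\mathbb{R}^d} f(x)^q \, dx, \qquad N_{R,q}(f) = e^{2 h_q(f)/d}.

Theorem 1 of [4], as described in the quote, gives a lower bound of this kind on N_{R,q} of the convolution f_1 * \cdots * f_n in terms of the individual N_{R,q}(f_i).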
“…Using this fact, the authors of [28] applied the entropy power inequality for the Shannon entropy. For the Rényi α-entropy, a version of the entropy power inequality was given in [57], but only for α ≥ 1. The problem of extending the entropy power inequality to orders 0 < α < 1 remains open.…”
Section: Entanglement Criteria for a Multipartite Quantum System (mentioning; confidence: 99%)
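To make the quoted status concrete, the α ≥ 1 result referred to has the following general shape (a sketch only; c(α) denotes an α-dependent constant whose exact value in [57] is not reproduced here): for independent X and Z, in the notation introduced after the previous quote (with α in place of q),

    N_{R,\alpha}(X + Z) \ge c(\alpha)\, \bigl( N_{R,\alpha}(X) + N_{R,\alpha}(Z) \bigr), \qquad \alpha \ge 1.

At α = 1 this reduces to the classical Shannon EPI with c(1) = 1, while, as the quote notes, no analogous bound is known for 0 < α < 1.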