2004
DOI: 10.1109/tit.2004.831784
Second-Order Asymptotics of Mutual Information

Cited by 127 publications (114 citation statements). References 12 publications.
“…As for the entropy formula (1), when and are not defined on but on some measurable space (or are not second order), the main formula still holds provided that in (55). In addition to (2), giving the mutual information between a random variable and its Gaussian-contaminated version, a relationship between mutual information and estimation had been found in [19] (65)…”
Section: Lemma
Mentioning confidence: 99%
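The "relationship between mutual information and estimation" cited as [19] in the statement above is presumably the I-MMSE identity of Guo, Shamai and Verdú; that attribution and the normalization below (real-valued channel, mutual information in nats, factor 1/2) are assumptions, offered only as a sketch of one well-known such relationship:

\[
\frac{\mathrm{d}}{\mathrm{d}\,\mathrm{snr}}\, I\!\left(X;\sqrt{\mathrm{snr}}\,X+N\right) \;=\; \frac{1}{2}\,\mathrm{mmse}(\mathrm{snr}),
\qquad
\mathrm{mmse}(\mathrm{snr}) \;=\; \mathbb{E}\!\left[\left(X-\mathbb{E}\!\left[X \,\middle|\, \sqrt{\mathrm{snr}}\,X+N\right]\right)^{2}\right],
\]

where \(N\sim\mathcal{N}(0,1)\) is independent of \(X\).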
“…where (113) is the (classical) relative entropy in (19). A representation for the free relative entropy (99) in terms of the R-transforms of and would be useful.…”
Section: Free Relative Entropy
Mentioning confidence: 99%
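For context, the "(classical) relative entropy" contrasted above with its free-probability analogue is the Kullback-Leibler divergence; the following is a standard textbook form, not a reproduction of the citing paper's equation (19):

\[
D(P \,\|\, Q) \;=\; \int \log \frac{\mathrm{d}P}{\mathrm{d}Q}\,\mathrm{d}P ,
\]

defined when \(P\) is absolutely continuous with respect to \(Q\), and set to \(+\infty\) otherwise.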
“…In [7], Prelov and Verdú determined the coefficients c_1 and c_2 for the so-called proper-complex constellations introduced by Neeser and Massey [8], which satisfy…”
Section: A Coded Modulation
Mentioning confidence: 99%
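The coefficients c_1 and c_2 mentioned here belong to a low-SNR expansion of mutual information. The generic form below is a sketch consistent with the title of the cited paper; the precise input conditions and coefficient values are not reproduced from the truncated quotation:

\[
I(\mathrm{snr}) \;=\; c_1\,\mathrm{snr} \;+\; c_2\,\mathrm{snr}^{2} \;+\; o\!\left(\mathrm{snr}^{2}\right), \qquad \mathrm{snr}\to 0 .
\]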
“…To name a few, this has been studied earlier in [28], which provides capacity expressions under "weak" input signals, in [29], which focuses on low-SNR second-order asymptotics of channel capacity, and in [30], which discusses low-SNR asymptotics using the relation between mutual information and minimum mean square error (MMSE) over Gaussian channels. In this context, the contribution of this work is characterizing the low-SNR asymptotic capacity of MIMO IM-DD channels and shedding light on the capacity-achieving input in this regime.…”
Section: Introduction
Mentioning confidence: 99%
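As a purely illustrative numerical check of such a low-SNR expansion (this script and its channel model are assumptions chosen for the example, not code from any of the cited works): for a real AWGN channel with a unit-variance Gaussian input, the exact mutual information is I(snr) = (1/2) ln(1 + snr), whose two-term expansion is snr/2 - snr^2/4.

import numpy as np

def exact_mi(snr):
    """Exact mutual information (nats) of a real AWGN channel with a
    unit-variance Gaussian input: I(snr) = 0.5 * ln(1 + snr)."""
    return 0.5 * np.log1p(snr)

def second_order_mi(snr):
    """Second-order low-SNR expansion of the same quantity:
    I(snr) = snr/2 - snr^2/4 + o(snr^2)."""
    return 0.5 * snr - 0.25 * snr ** 2

if __name__ == "__main__":
    for snr in [1e-3, 1e-2, 1e-1, 0.5]:
        exact = exact_mi(snr)
        approx = second_order_mi(snr)
        # The relative error shrinks as snr -> 0, reflecting the o(snr^2)
        # remainder in the second-order asymptotics.
        print(f"snr={snr:6.3f}  exact={exact:.6e}  "
              f"2nd-order={approx:.6e}  rel.err={(exact - approx) / exact:.2e}")

Running the script shows the relative error of the two-term approximation vanishing as snr decreases, which is exactly the regime the second-order asymptotics describe.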