2009 IEEE International Symposium on Information Theory
DOI: 10.1109/isit.2009.5205651
Mismatched estimation and relative entropy

Abstract: A random variable with distribution P is observed in Gaussian noise and is estimated by a mismatched minimum mean-square estimator that assumes that the distribution is Q, instead of P. This paper shows that the integral over all signal-to-noise ratios (SNRs) of the excess mean-square estimation error incurred by the mismatched estimator is twice the relative entropy D(P‖Q) (in nats). This representation of relative entropy can be generalized to nonreal-valued random variables, and can be particularized to give n…
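The abstract's central identity can be written out explicitly. The notation below is a reconstruction following the paper's standard setup, not a verbatim quotation: mmse_P(γ) denotes the mean-square error of the optimal estimator matched to the true distribution P at SNR γ, and mse_{P,Q}(γ) the mean-square error incurred when the estimator is designed for Q but the input actually has distribution P.

```latex
% Relative entropy as integrated excess mean-square error
% (mismatched estimation identity, in nats):
D(P \,\|\, Q) \;=\; \frac{1}{2} \int_{0}^{\infty}
  \bigl[\,\mathrm{mse}_{P,Q}(\gamma) \;-\; \mathrm{mmse}_{P}(\gamma)\,\bigr]\, d\gamma
```

Since the mismatched estimator can never beat the matched one, the integrand is nonnegative, which is consistent with D(P‖Q) ≥ 0.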

Cited by 35 publications (54 citation statements) | References 30 publications (53 reference statements)
“…This bias is a type of random error, as opposed to a systematic error. Any bias ensures that the minimum error is not achieved for the wrong model, and borrowing a term from financial decision theory 27 and signal processing, 28 we shall refer to the positive deviation from the minimum error as regret .…”
Section: Theory
confidence: 99%
“…Using the identity defined in (7) and the definition of entropy defined above, the mutual information I(X; Y ) becomes…”
Section: B. Vector Poisson Channel
confidence: 99%
“…Second, scaling the support of a Poisson random variable (integers) does not result in another Poisson random variable. This is in contrast to Gaussian channels, which are relatively well studied from both information-theoretic and estimation-theoretic aspects [5], [6], [7]. The Poisson channel has the advantage, in thought experiments such as ours, of simple conceptual models for data acquisition based on counting (photons, say) and switches that either allow counts to pass through or discard them.…”
Section: Introduction
confidence: 99%
“…This connection was made in the work of Guo, Shamai, and Verdú [3]. Extensions of the I-MMSE relation were investigated in [8][9][10][11][12][13][14][15][16], and applications have been established, e.g., in optimal power allocation [17] and monotonicity of non-Gaussianness [18]. Our work is situated within this literature.…”
Section: 2 Related Literature
confidence: 99%
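The I-MMSE relation cited above states that, for a Gaussian channel, the derivative of the mutual information with respect to SNR equals half the minimum mean-square error. A minimal numerical sketch of this, assuming a standard Gaussian input (for which both quantities have well-known closed forms; the function names here are illustrative, not from the paper):

```python
import math

# I-MMSE check for Y = sqrt(snr) * X + N with X ~ N(0, 1), N ~ N(0, 1).
# Closed forms for a Gaussian input:
#   I(snr)    = 0.5 * ln(1 + snr)   (mutual information, in nats)
#   mmse(snr) = 1 / (1 + snr)       (minimum mean-square error)
# The I-MMSE relation asserts dI/d(snr) = 0.5 * mmse(snr).

def mutual_information(snr: float) -> float:
    return 0.5 * math.log(1.0 + snr)

def mmse(snr: float) -> float:
    return 1.0 / (1.0 + snr)

def di_dsnr(snr: float, h: float = 1e-6) -> float:
    # Central finite difference of the mutual information.
    return (mutual_information(snr + h) - mutual_information(snr - h)) / (2.0 * h)

for snr in (0.5, 1.0, 4.0):
    lhs = di_dsnr(snr)
    rhs = 0.5 * mmse(snr)
    print(f"snr={snr}: dI/dsnr={lhs:.6f}, mmse/2={rhs:.6f}")
    assert abs(lhs - rhs) < 1e-5
```

The finite-difference derivative agrees with half the MMSE at every tested SNR, which is exactly the relation whose mismatched-estimation extension the present paper develops.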