2019
DOI: 10.1109/tsp.2019.2951221
MMSE Bounds for Additive Noise Channels Under Kullback–Leibler Divergence Constraints on the Input Distribution

Cited by 18 publications (28 citation statements). References 29 publications.
“…where Example 11.4]. This expression, together with the integral version of Jaffer's identity in (36), leads to the following simple relationship between the conditional expectation and the conditional cumulants.…”
Section: A. The Univariate Case
confidence: 99%
“…5. There exists a large collection of lower bounds on estimation error for continuous distributions [34]–[36], of which the Cramér–Rao bound is arguably the most popular. However, lower bounds on the MMSE of other distributions are rare.…”
Section: Representations of the MMSE
confidence: 99%
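For the canonical case the excerpt alludes to, the MMSE and the Bayesian Cramér-Rao bound can be compared in closed form: with a Gaussian input X ~ N(0, vx) observed through additive Gaussian noise N ~ N(0, vn), the MMSE equals vx·vn/(vx + vn), and the Bayesian Cramér-Rao bound is tight. A minimal sketch (variable names are illustrative, not from the paper):

```python
def gaussian_mmse(vx, vn):
    """Closed-form MMSE for estimating X ~ N(0, vx) from Y = X + N, N ~ N(0, vn)."""
    return vx * vn / (vx + vn)

def bayesian_crb(vx, vn):
    """Bayesian Cramér-Rao bound: inverse of total (prior + channel) Fisher information."""
    return 1.0 / (1.0 / vx + 1.0 / vn)

# For Gaussian prior and Gaussian noise the bound is achieved with equality.
vx, vn = 1.0, 0.5
print(gaussian_mmse(vx, vn))  # 1/3
print(bayesian_crb(vx, vn))   # 1/3
```

For non-Gaussian priors the two quantities generally differ, which is the gap the excerpt refers to when it notes that MMSE lower bounds beyond the continuous/Cramér-Rao setting are rare.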
“…This paper extends the bounds in [10] and [11] to a weighted sum of MMSEs. Similar to [10] and [11], the bounds derived in this paper are obtained by constraining the input distribution to be ε-close to a Gaussian reference distribution in terms of the KL divergence. The estimators that attain the upper bound are minimax robust against deviations from the assumed input distribution.…”
Section: Introduction
confidence: 75%
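The ε-closeness constraint described above can be checked in closed form when both the input and the reference are scalar Gaussians, since KL(N(m1, v1) ‖ N(m2, v2)) = ½(v1/v2 + (m2 − m1)²/v2 − 1 + ln(v2/v1)). A sketch of such an admissibility check, under the assumption of a standard-Gaussian reference (the function name and eps value are illustrative):

```python
import math

def kl_gaussian(m1, v1, m2, v2):
    """KL divergence KL(N(m1, v1) || N(m2, v2)) between scalar Gaussians."""
    return 0.5 * (v1 / v2 + (m2 - m1) ** 2 / v2 - 1.0 + math.log(v2 / v1))

# An input distribution satisfies the constraint if its KL divergence
# to the Gaussian reference N(0, 1) is at most eps.
eps = 0.1
print(kl_gaussian(0.0, 1.0, 0.0, 1.0))        # 0.0: the reference itself
print(kl_gaussian(0.2, 1.1, 0.0, 1.0) <= eps)  # True: this input is eps-close
```

For general (non-Gaussian) inputs the KL divergence has no closed form and must be computed by integration, but the constraint set is the same: all distributions within KL-radius ε of the Gaussian reference.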
“…The practical implication of this phenomenon is that our bounds hold for a larger set of distributions. This property of the proposed bounds has been discussed in detail in [11]. An example of such a prior distribution is a uniform distribution over a K-ball.…”
Section: B. Generalized Gaussian and Uniform Distributions
confidence: 93%