2008
DOI: 10.1109/tsp.2008.927075

A Fresh Look at the Bayesian Bounds of the Weiss-Weinstein Family

Abstract: Minimal bounds on the mean square error (MSE) are generally used in order to predict the best achievable performance of an estimator for a given observation model. In this paper, we are interested in the Bayesian bounds of the Weiss-Weinstein family. This family includes the Bayesian Cramér-Rao bound, the Bobrovsky-Mayer-Wolf-Zakaï bound, the Bayesian Bhattacharyya bound, the Bobrovsky-Zakaï bound, the Reuven-Messer bound, and the Weiss-Weinstein bound. We present a unification of all these minimal bo…
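As a rough illustration of two members of this family (not taken from the paper), the sketch below evaluates the Bayesian Cramér-Rao bound and the single-test-point Weiss-Weinstein bound for a scalar linear-Gaussian model, where both admit closed forms and coincide with the exact minimum MSE; the model, variable names, and test-point grid are assumptions made here for illustration only.

```python
# Illustrative sketch (not code from the paper): for the scalar linear-Gaussian
# model x = theta + n with prior theta ~ N(0, sig_th2) and noise n ~ N(0, sig_n2),
# the Bayesian Cramer-Rao bound (BCRB) and the single-test-point, s = 1/2
# Weiss-Weinstein bound (WWB) have closed forms and both equal the exact MMSE.
# All symbols (sig_th2, sig_n2, J, h, ...) are notation chosen for this sketch.
import numpy as np

sig_th2 = 1.0        # prior variance of theta
sig_n2 = 0.5         # noise variance

# Bayesian information = data term + prior term; the BCRB is its inverse.
J = 1.0 / sig_n2 + 1.0 / sig_th2
bcrb = 1.0 / J

# For this jointly Gaussian model,
#   eta(1/2, h) = ln \iint p^{1/2}(x, theta + h) p^{1/2}(x, theta) dx dtheta
#               = -h^2 J / 8,
# and the s = 1/2, single-test-point Weiss-Weinstein bound is
#   WWB(h) = h^2 exp(2 eta(1/2, h)) / ( 2 (1 - exp(eta(1/2, 2h))) ).
def wwb(h):
    eta_h = -(h ** 2) * J / 8.0
    eta_2h = -((2.0 * h) ** 2) * J / 8.0
    return h ** 2 * np.exp(2.0 * eta_h) / (2.0 * (1.0 - np.exp(eta_2h)))

h_grid = np.linspace(1e-3, 5.0, 2000)
wwb_sup = float(np.max(wwb(h_grid)))    # supremum over the test-point grid

# Exact minimum MSE of the conditional-mean estimator for this model.
mmse = sig_th2 * sig_n2 / (sig_th2 + sig_n2)
print(f"BCRB = {bcrb:.6f}   sup_h WWB(h) = {wwb_sup:.6f}   MMSE = {mmse:.6f}")
```

For this linear-Gaussian model the test-point expression is maximized as h → 0 and the three printed values agree; in nonlinear or threshold-prone problems the supremum is typically attained at a nonzero h, which is what makes the Weiss-Weinstein bound tighter than the Bayesian Cramér-Rao bound.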

Cited by 71 publications (46 citation statements)
References 62 publications
“…The UCRB is the lowest bound in both figures, which means that, obviously, if the model is correctly specified, estimators can provide better performance in terms of MSE than in the misspecified case. Note that the CRB, which takes into account the bias of the MLE, and the MCRB exhibit a threshold effect [14][15][16]. The origin of this phenomenon is solely the bias of the estimators, as can be seen from the difference between the UCRB and the CRB in both figures. Fig.…”
mentioning
confidence: 75%
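As background for the threshold effect mentioned in this excerpt, the following sketch (an illustrative scenario, not the misspecified setting of the citing paper) runs a small Monte Carlo of maximum-likelihood frequency estimation of a single complex tone and compares the empirical MSE with the Cramér-Rao bound; N, the SNR grid, and the number of trials are arbitrary choices made for this sketch.

```python
# Illustrative Monte Carlo (not the misspecified scenario of the citing paper):
# maximum-likelihood frequency estimation of a single complex tone in white
# Gaussian noise, whose MSE exhibits the classical threshold effect relative
# to the Cramer-Rao bound: near the bound at high SNR, far above it once
# periodogram outliers appear at low SNR.
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)
N = 64                      # samples per trial
n = np.arange(N)
f0 = 0.123                  # true frequency, cycles/sample
n_trials = 500
M = 8 * N                   # zero-padded FFT length for the coarse search

for snr_db in range(-10, 16, 5):
    snr = 10.0 ** (snr_db / 10.0)            # A^2 / sigma^2 with A = 1
    sigma = 1.0 / np.sqrt(snr)
    sq_err = np.empty(n_trials)
    for t in range(n_trials):
        w = sigma * (rng.standard_normal(N) + 1j * rng.standard_normal(N)) / np.sqrt(2.0)
        x = np.exp(2j * np.pi * f0 * n) + w
        # Coarse ML search: peak of the zero-padded periodogram ...
        k = int(np.argmax(np.abs(np.fft.fft(x, M))))
        # ... followed by a local refinement of the periodogram around that peak.
        cost = lambda f: -np.abs(np.exp(-2j * np.pi * f * n) @ x) ** 2
        res = minimize_scalar(cost, bounds=(k / M - 1.0 / M, k / M + 1.0 / M),
                              method="bounded")
        err = (res.x - f0 + 0.5) % 1.0 - 0.5   # frequency error, wrapped to [-0.5, 0.5)
        sq_err[t] = err ** 2
    # CRB for the frequency (cycles/sample) of a single complex exponential
    # with unknown amplitude and phase in complex white Gaussian noise.
    crb = 6.0 / ((2.0 * np.pi) ** 2 * snr * N * (N ** 2 - 1))
    print(f"SNR = {snr_db:4d} dB   empirical MSE = {sq_err.mean():.3e}   CRB = {crb:.3e}")
```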
“…This theorem applies to any viable inner product space. It is the basis for [7], and a proof is given in [8].…”
Section: B. Minimum Norm Theorem
mentioning
confidence: 99%
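For reference, one standard statement of the minimum norm theorem alluded to here is sketched below; the notation (g_i, Gram matrix G, constraint vector c) is chosen for this note and is not quoted from [7] or [8].

```latex
% Minimum norm theorem (one standard formulation; notation chosen for this note).
% Let H be a real inner product space, g_1, ..., g_K in H with nonsingular Gram
% matrix G, G_{ij} = <g_i, g_j>, and let c be in R^K.
\[
  \min_{u \in H}\;\bigl\{\, \|u\|^2 \;:\; \langle u, g_i\rangle = c_i,\ i = 1,\dots,K \,\bigr\}
  \;=\; c^{\mathsf T} G^{-1} c ,
\]
% with the minimum attained at
\[
  u^\star \;=\; \sum_{i=1}^{K} \bigl(G^{-1}c\bigr)_i\, g_i .
\]
```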
“…Several bounds derive from the covariance inequality and the minimum norm theorem via varied choices of score function and constraints [6], [8]. Let ζ(x) = θ_f(x) − μ_g and define the score function as…”
Section: Extensions To Other Misspecified Bounds
mentioning
confidence: 99%
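The generic covariance-inequality step behind such constructions can be stated as follows; ζ and ψ are placeholders here, not the specific error and score choices of the citing paper.

```latex
% Covariance inequality (standard form).  For random vectors zeta(x) and psi(x)
% with E[psi psi^T] nonsingular, the Schur complement of the joint second-moment
% matrix gives
\[
  \mathbb{E}\!\left[\zeta\,\zeta^{\mathsf T}\right]
  \;\succeq\;
  \mathbb{E}\!\left[\zeta\,\psi^{\mathsf T}\right]
  \left(\mathbb{E}\!\left[\psi\,\psi^{\mathsf T}\right]\right)^{-1}
  \mathbb{E}\!\left[\psi\,\zeta^{\mathsf T}\right].
\]
```

Different choices of the score ψ (the Fisher score, likelihood ratios at test points, and so on) then recover Cramér-Rao, Bhattacharyya, Bobrovsky-Zakaï, and Weiss-Weinstein type bounds on the error ζ.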
“…Our motivation comes from the fact that, among the Bayesian bounds, the Weiss-Weinstein bound is known to be one of the tightest [25]. So, one can expect that the combination of these bounds will lead to a bound tighter than the HBB.…”
Section: Introduction
mentioning
confidence: 99%
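For context, the single-test-point form of the Weiss-Weinstein bound referred to above can be written as follows; this is a standard form given here for orientation rather than quoted from [25].

```latex
% Weiss-Weinstein bound with a single scalar test point h and parameter s in (0,1)
% (standard form; eta denotes the log of the s-th moment of the likelihood ratio
% of the joint density p(x, theta)).
\[
  \eta(s,h) \;=\; \ln \int\!\!\int p^{\,s}(x,\theta+h)\, p^{\,1-s}(x,\theta)\, dx\, d\theta ,
\]
\[
  \mathrm{WWB} \;=\; \sup_{h,\; s\in(0,1)}
    \frac{h^{2}\, e^{2\eta(s,h)}}
         {e^{\eta(2s,h)} + e^{\eta(2-2s,-h)} - 2\, e^{\eta(s,2h)}} .
\]
% For s = 1/2 the denominator simplifies to 2 (1 - e^{eta(1/2, 2h)}).
```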