2019
DOI: 10.1109/tsp.2018.2883915

Cramér–Rao Bound for Constrained Parameter Estimation Using Lehmann-Unbiasedness

Abstract: The constrained Cramér-Rao bound (CCRB) is a lower bound on the mean-squared-error (MSE) of estimators that satisfy some unbiasedness conditions. Although the CCRB unbiasedness conditions are satisfied asymptotically by the constrained maximum likelihood (CML) estimator, in the nonasymptotic region these conditions are usually too strict and the commonly-used estimators, such as the CML estimator, do not satisfy them. Therefore, the CCRB may not be a lower bound on the MSE matrix of such estimators. In this pa…
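For context, the constrained bound discussed in the abstract has a classical closed form due to Stoica and Ng; a sketch of that standard form (the symbols J, F, and U are the conventions of that literature, not notation taken from this paper):

```latex
% Classical constrained CRB (Stoica & Ng, 1998), shown for context.
% Constraints f(\theta) = 0 with Jacobian F(\theta) = \partial f / \partial \theta^{T},
% and U(\theta) any matrix whose columns span the null space of F:
% F U = 0, \quad U^{T} U = I.
\mathrm{MSE}(\hat{\theta}) \succeq U \left( U^{T} J(\theta)\, U \right)^{-1} U^{T}
% where J(\theta) is the Fisher information matrix of the unconstrained model.
```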


Cited by 26 publications (31 citation statements)
References 61 publications
“…where the matrix ∂f(ω)/∂ω ∈ R^{P×K_ω} has full row rank (P), which is equivalent to requiring that the constraints are not redundant, this leads to the so-called constrained CRB [41][42][43][44]. In the case of mixed parameter vectors, it amounts to replacing the LCs (13) with [29] E_{y;θ}…”
Section: Generalizations and Outlooksmentioning
confidence: 99%
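The full-row-rank condition on the constraint Jacobian quoted above (non-redundant constraints) is easy to check numerically. A minimal sketch, using a hypothetical unit-norm constraint f(θ) = θᵀθ − 1 as the example; the helper name is ours, not from the cited work:

```python
import numpy as np

def jacobian_full_row_rank(jac, tol=1e-10):
    """Return True if the P x K constraint Jacobian has full row rank P."""
    P = jac.shape[0]
    return np.linalg.matrix_rank(jac, tol=tol) == P

theta = np.array([0.6, 0.8])          # a point satisfying ||theta|| = 1
jac = 2.0 * theta[np.newaxis, :]      # Jacobian of f(theta) = theta^T theta - 1
assert jacobian_full_row_rank(jac)    # one non-degenerate constraint: rank 1

redundant = np.vstack([jac, 3.0 * jac])   # a scaled copy of the same constraint
assert not jacobian_full_row_rank(redundant)  # redundant rows drop the rank
```

A rank-deficient Jacobian means some constraints are implied by the others, and the constrained CRB expressions quoted above are no longer well defined as written.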
“…thus enabling an approximate assessment of the minimum distance between the estimated and the true signal subspace. Another option could have been to start with the constrained parameterization G = U D^{1/2} and to directly handle the orthonormality constraints U^H U = I_k with the theory of constrained CRLBs [32]–[35] to obtain CRB(vec(U)), then deriving the same result as in (25) from π = vec(U U^H). This method is expected to yield the same result as in [7] from a different path of derivations.…”
Section: B Intrinsic Cramér-rao Boundsmentioning
confidence: 99%
“…(i) Assignment step: each θ_i is assigned to the cluster S_j whose center c_j is the closest using the distance d_{M_{p,k,n}}; (ii) Update step: each new class center c_j is computed using (34) and (35).…”
Section: B K-means++ On M Pknmentioning
confidence: 99%
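The two alternating steps quoted above are the standard Lloyd iteration. A minimal sketch, substituting the plain Euclidean distance and arithmetic mean for the cited paper's manifold metric d_{M_{p,k,n}} and its center updates (34)–(35), which we do not reproduce here:

```python
import numpy as np

def kmeans_step(points, centers):
    # (i) Assignment step: each point goes to the nearest center
    # (Euclidean stand-in for the manifold distance d_{M_{p,k,n}}).
    dists = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
    labels = np.argmin(dists, axis=1)
    # (ii) Update step: each center becomes the mean of its cluster
    # (arithmetic-mean stand-in for the manifold center update).
    new_centers = np.array([
        points[labels == j].mean(axis=0) if np.any(labels == j) else centers[j]
        for j in range(len(centers))
    ])
    return labels, new_centers

rng = np.random.default_rng(1)
pts = np.vstack([rng.normal(0, 0.1, (20, 2)), rng.normal(5, 0.1, (20, 2))])
centers = np.array([[0.5, 0.5], [4.5, 4.5]])
labels, centers = kmeans_step(pts, centers)
assert set(labels[:20]) == {0} and set(labels[20:]) == {1}
```

Iterating `kmeans_step` until the labels stop changing gives the full algorithm; the k-means++ variant cited above only changes how the initial centers are drawn.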
“…Hence, the probability of detection increases. If prior information is used, information theory tells us that the accuracy of estimation must improve [40][41][42]. In this case, we consider a series of distance constraints and velocity constraints for the multiple sources.…”
Section: E Inequality Constraints On Sourcementioning
confidence: 99%
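The claim quoted above, that exploiting prior constraints improves estimation accuracy, can be illustrated with a toy Monte Carlo: projecting a noisy estimate onto an interval known to contain the true parameter can only reduce the squared error. The interval and noise level here are made up for illustration, not taken from the cited work:

```python
import numpy as np

rng = np.random.default_rng(0)
theta = 3.0                        # true parameter, known a priori to lie in [2, 4]
unconstrained = theta + rng.normal(0.0, 1.0, size=100_000)  # raw noisy estimates
constrained = np.clip(unconstrained, 2.0, 4.0)  # project onto the constraint set

mse_u = np.mean((unconstrained - theta) ** 2)
mse_c = np.mean((constrained - theta) ** 2)
assert mse_c < mse_u   # projection onto a convex set containing theta cannot hurt
```

The same projection argument underlies inequality-constrained estimation in general: clipping moves every out-of-bounds estimate strictly closer to any point of the interval, including the true value.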