Proceedings. (ICASSP '05). IEEE International Conference on Acoustics, Speech, and Signal Processing, 2005.
DOI: 10.1109/icassp.2005.1416483
Intrinsic Variance Lower Bound (IVLB): An Extension of the Cramer-Rao Bound to Riemannian Manifolds

Abstract: We consider parametric statistical models in which the parameter space Θ is a connected Riemannian manifold. This mathematical structure on the parameter space subsumes, as special cases, submanifolds of Euclidean spaces appearing in parametric estimation scenarios with a priori smooth deterministic constraints, and quotient spaces (such as Grassmann manifolds) which arise in certain parametric estimation scenarios with ambiguities. The Riemannian structure on the parameter space Θ turns it into a metric space…
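The abstract is truncated before the bound itself. For orientation, here is a sketch of the classical Euclidean Cramér-Rao bound that the IVLB generalizes; the intrinsic statement in the paper involves curvature correction terms, so this is an illustration, not the paper's exact formula:

```latex
% Classical (Euclidean) Cramer-Rao bound for an unbiased estimator:
\operatorname{Cov}_\theta\!\bigl(\hat\theta\bigr) \succeq I(\theta)^{-1},
\qquad
I(\theta) = \mathbb{E}_\theta\!\left[
  \nabla_\theta \log p(x;\theta)\,
  \nabla_\theta \log p(x;\theta)^{\top}
\right].
% On a Riemannian parameter space, the covariance matrix is replaced by the
% intrinsic variance E[ d(\hat\theta, \theta)^2 ], where d is the geodesic
% distance induced by the Riemannian metric on Theta.
```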

Cited by 19 publications (16 citation statements); references 14 publications (25 reference statements).
“…The problem of simultaneous range-only attitude and position estimation was addressed in a maximum-likelihood framework in [9] by formulating a constrained optimization problem on the Special Euclidean group SE(3) and resorting to generalized intrinsic gradient and Newton algorithms. The performance of the derived estimator is very close to the theoretical bounds provided by the Intrinsic Variance Lower Bound (IVLB) [10]. However, no convergence guarantees were given, and simulations revealed the existence of local minima.…”
Section: Introduction (supporting)
confidence: 56%
“…Assuming a static rigid body (Ṙ = 0 and ṗ = 0), the error dynamics can be written from (5)–(6) as…” [47th IEEE CDC, Cancun, Mexico, Dec. 9–11, 2008, WeA13.3]
Section: Problem Formulation (mentioning)
confidence: 99%
“…Applications appear in various areas, including computer vision [MKS01], machine learning [NA05], maximum likelihood estimation [Smi05], [XB05], electronic structure computation [LE00], system balancing [HM94], model reduction [YL99], and robot manipulation [HHM02].…”
Section: Introduction (mentioning)
confidence: 99%
“…Applications appear in various areas, including computer vision [31], machine learning [34], maximum likelihood estimation [45,56], electronic structure computation [30], system balancing [19], model reduction [57], and robot manipulation [18]. Many algorithms, such as the steepest-descent, trust-region, and conjugate-gradient methods, have been extended to solve optimization problems on Riemannian manifolds (see, e.g., [1,23,46,47] and the references therein).…”
Section: Introduction (mentioning)
confidence: 99%
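The citation statement above mentions classical optimization algorithms extended to Riemannian manifolds. As a minimal sketch (not taken from any of the cited works), here is retraction-based Riemannian steepest descent on the unit sphere, minimizing a Rayleigh quotient; the function name, step size, and iteration count are illustrative assumptions:

```python
import numpy as np

def riemannian_gradient_descent(A, x0, step=0.1, iters=500):
    """Minimize f(x) = x^T A x over the unit sphere S^{n-1}
    using retraction-based Riemannian steepest descent."""
    x = x0 / np.linalg.norm(x0)
    for _ in range(iters):
        egrad = 2.0 * A @ x              # Euclidean gradient of the quadratic cost
        rgrad = egrad - (x @ egrad) * x  # project onto the tangent space at x
        x = x - step * rgrad             # step along the negative Riemannian gradient
        x = x / np.linalg.norm(x)        # retraction: renormalize back onto the sphere
    return x

A = np.diag([3.0, 2.0, 1.0])
x = riemannian_gradient_descent(A, np.ones(3))
# x converges to the eigenvector associated with the smallest eigenvalue of A
```

The projection step is what distinguishes the Riemannian gradient from the Euclidean one, and the renormalization plays the role of a retraction, keeping the iterates on the manifold without explicit constraints.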