2013
DOI: 10.1007/978-3-642-36279-8_34
Towards Consistent Vision-Aided Inertial Navigation

Abstract: In this paper, we study estimator inconsistency in Vision-aided Inertial Navigation Systems (VINS) from a standpoint of system observability. We postulate that a leading cause of inconsistency is the gain of spurious information along unobservable directions, resulting in smaller uncertainties, larger estimation errors, and possibly even divergence. We develop an Observability-Constrained VINS (OC-VINS), which explicitly enforces the unobservable directions of the system, hence preventing spurious information …



Cited by 59 publications (93 citation statements)
References 31 publications
“…The optimal A * , as shown in [9], can be determined by solving its KKT optimality condition [3], whose solution is:…”
Section: Observability-Constrained EKF (mentioning)
confidence: 99%
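The statement above refers to the closed-form KKT solution for the constrained Jacobian. A minimal sketch of that projection, assuming the standard least-squares formulation (find the matrix A* closest to A in Frobenius norm subject to A* annihilating the unobservable directions U); the names `A`, `U`, and `constrain_jacobian` are illustrative, not the paper's notation:

```python
import numpy as np

def constrain_jacobian(A: np.ndarray, U: np.ndarray) -> np.ndarray:
    """Return A* = argmin ||A* - A||_F subject to A* @ U = 0.

    Closed-form solution of the KKT conditions:
        A* = A - A U (U^T U)^{-1} U^T
    i.e., A projected onto the orthogonal complement of span(U).
    """
    return A - A @ U @ np.linalg.solve(U.T @ U, U.T)

# Illustrative dimensions: a 3x5 measurement Jacobian, 2 unobservable directions.
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 5))
U = rng.standard_normal((5, 2))

A_star = constrain_jacobian(A, U)

# The constrained Jacobian gains no information along the unobservable directions:
assert np.allclose(A_star @ U, 0.0)
```

Verifying the constraint directly: A* U = A U − A U (UᵀU)⁻¹ (UᵀU) = 0, which is why the filter accrues no spurious information along U.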
“…position and orientation (pose) of a sensing platform within GPS-denied environments, vision-aided inertial navigation is one of the most established, primarily due to its high precision and low cost. During the past decade, VINS have been successfully applied to spacecraft [20], automotive [17], and personal localization [9], demonstrating real-time performance.…”
Section: Introduction and Related Work (mentioning)
confidence: 99%
“…In this system, we perform the observability analysis and show that while the key results of the previous observability analyses (e.g., [8,13,15,16]) are valid (the robot's global position and its orientation around the normal of the plane are unobservable), by constraining visual observations to be on a horizontal plane, the orthogonal translation of the camera with respect to the plane becomes observable. More specifically, we prove that by observing unknown feature points on a horizontal plane, the navigation system has only three unobservable directions corresponding to the global translations parallel to the plane, and the rotation around the gravity vector.…”
Section: Introduction (mentioning)
confidence: 99%
“…The rich representation of a scene captured in an image, together with the accurate short-term estimates from the gyroscopes and accelerometers of a typical IMU, has long been acknowledged to offer complementary information, with great utility in airborne [6,20] and automotive [14] navigation. Moreover, with the availability of these sensors in most smartphones, there is great interest and research activity in effective solutions to visual-inertial SLAM.…”
Section: Introduction (mentioning)
confidence: 99%