2011 IEEE/RSJ International Conference on Intelligent Robots and Systems
DOI: 10.1109/iros.2011.6048855

Robust embedded egomotion estimation

Abstract: This work presents a method for estimating the egomotion of an aerial vehicle in challenging industrial environments. It combines binocular visual and inertial cues in a tightly-coupled fashion and operates in real time on an embedded platform. An extended Kalman filter fuses the measurements and makes the motion estimate rely more on inertial data if the visual feature constellation is degenerate. Errors in roll and pitch are bounded implicitly by the gravity vector. Inertial sensors are used for efficient out…
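The degeneracy-aware weighting summarized in the abstract can be pictured with a minimal sketch (not the authors' implementation; the function name and the degeneracy score below are hypothetical): an EKF visual update whose measurement covariance is inflated when the tracked feature constellation becomes ill-conditioned, so the estimate leans on the inertial prediction instead.

```python
import numpy as np

def ekf_visual_update(x, P, z, h, H, R_vis, feature_dirs, s_floor=1e-3):
    """Hedged sketch: EKF visual update with degeneracy-aware weighting.

    x, P           : state mean/covariance after the inertial prediction step
    z, h, H, R_vis : visual measurement, its prediction, Jacobian, nominal noise
    feature_dirs   : Nx3 unit bearing vectors of the currently tracked features
    """
    # Crude degeneracy score: smallest singular value of the feature geometry.
    # A near-collinear or near-planar constellation drives it towards zero.
    s_min = np.linalg.svd(feature_dirs, compute_uv=False)[-1]
    inflation = 1.0 / max(s_min, s_floor)       # grows as geometry degenerates

    R = R_vis * inflation                       # down-weight the visual cues
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - h)
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P
```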

Cited by 17 publications (19 citation statements)
References 11 publications
“…The rover's wheel encoders and an inertial navigation system are fused in an extended information filter. An EKF presented by Voigt et al. [34] fuses binocular visual measurements and inertial cues for egomotion estimation of an aerial vehicle. Their approach relies more on inertial data if the visual feature constellation is degenerate and enables pose estimation at frame rate on their platform.…”
Section: Egomotion (mentioning)
confidence: 99%
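The extended information filter mentioned in the first sentence of the excerpt can be sketched generically (this is not the cited rover system; the function below is illustrative only): in information form, fusing independent proprioceptive sources such as wheel encoders and an INS reduces to adding their contributions to the information matrix and vector.

```python
import numpy as np

def eif_measurement_update(eta, Omega, z, h, H, R):
    """One measurement update of an extended information filter.

    The state is carried as the information vector eta = Omega @ x and the
    information matrix Omega. Independent sensors (e.g. wheel encoders and
    an INS) are fused by calling this once per sensor: both terms simply
    add onto eta and Omega.
    """
    x_prior = np.linalg.solve(Omega, eta)       # mean used for linearisation
    R_inv = np.linalg.inv(R)
    Omega_new = Omega + H.T @ R_inv @ H
    eta_new = eta + H.T @ R_inv @ (z - h + H @ x_prior)
    return eta_new, Omega_new
```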
“…This may be done by using, e.g., SURF ([16], [8]), FAST ([20], [5], [8]) or Harris corners ([13], [5], [10]). These features then serve as measurements for determining the translation and rotation from consecutive image frames by applying the epipolar constraint, using either the eight-point algorithm [4], the five-point algorithm [12], or one of their variants.…”
Section: A. Related Work (mentioning)
confidence: 99%
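As a concrete illustration of this pipeline (OpenCV is used here merely as a stand-in for the detectors and epipolar-geometry solvers named in the excerpt, an assumption on tooling), a minimal two-frame relative-motion sketch might look as follows.

```python
import cv2
import numpy as np

def relative_motion(img0, img1, K):
    """Sketch: relative rotation/translation between two grayscale frames.

    Detects Harris corners, tracks them with KLT optical flow, then applies
    the epipolar constraint via the five-point algorithm (findEssentialMat)
    inside a RANSAC loop. K is the 3x3 intrinsic matrix; t is up to scale.
    """
    # Harris corner detection.
    p0 = cv2.goodFeaturesToTrack(img0, maxCorners=500, qualityLevel=0.01,
                                 minDistance=7, useHarrisDetector=True)
    # KLT optical flow gives correspondences in the second frame.
    p1, status, _ = cv2.calcOpticalFlowPyrLK(img0, img1, p0, None)
    good0 = p0[status.ravel() == 1]
    good1 = p1[status.ravel() == 1]

    # Five-point algorithm with RANSAC rejects outlier correspondences.
    E, mask = cv2.findEssentialMat(good0, good1, K,
                                   method=cv2.RANSAC, prob=0.999, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, good0, good1, K, mask=mask)
    return R, t
```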
“…These features then serve as measurements for determining the translation and rotation from consecutive image frames by applying the epipolar constraint, using either the eight-point algorithm [4], the five-point algorithm [12], or one of their variants. Good results can be achieved through combination with techniques like Kalman filtering [3] or robust estimation methods like RANSAC ([10], [13], [20], [3]).…”
Section: A. Related Work (mentioning)
confidence: 99%
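The combination with Kalman filtering can be illustrated by a toy smoother that filters the (already RANSAC-screened) visual-odometry positions with a constant-velocity model; this is a generic sketch, not the state model used in any of the cited works.

```python
import numpy as np

class ConstantVelocityKF:
    """Toy constant-velocity Kalman filter smoothing noisy visual-odometry
    positions (illustrative only; the cited works use richer state models)."""

    def __init__(self, dt, q=1e-2, r=1e-1):
        self.x = np.zeros(6)                       # [position, velocity]
        self.P = np.eye(6)
        self.F = np.eye(6)
        self.F[:3, 3:] = dt * np.eye(3)            # position integrates velocity
        self.Q = q * np.eye(6)
        self.H = np.hstack([np.eye(3), np.zeros((3, 3))])
        self.R = r * np.eye(3)

    def step(self, z_position):
        # Predict with the constant-velocity motion model.
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        # Update with the RANSAC-filtered visual-odometry position estimate.
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ (z_position - self.H @ self.x)
        self.P = (np.eye(6) - K @ self.H) @ self.P
        return self.x[:3]
```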
“…Voigt et al. [6] present a global estimation method that also uses an EKF to combine stereo visual odometry and IMU information. To make the relative visual odometry update global, additional variables that account for the position and orientation (pose) of the last image are added to the state.…”
Section: Introduction (mentioning)
confidence: 99%
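A minimal numpy sketch of this state augmentation (sometimes called stochastic cloning; the function and the state dimensions below are hypothetical, not taken from [6]): the pose at the last image is copied into the state and the covariance is augmented with the corresponding Jacobian, so a later relative visual-odometry measurement can be written against the cloned pose.

```python
import numpy as np

def clone_last_image_pose(x, P, pose_slice):
    """Append a copy of the current camera pose to the filter state.

    x          : current state vector
    P          : current covariance
    pose_slice : slice selecting the pose entries (position + orientation)
    Returns the augmented state and covariance; a subsequent relative VO
    measurement is then a function of (current pose, cloned pose).
    """
    n = x.size
    # Augmentation Jacobian: identity for the existing state, with a copy of
    # the pose rows appended at the bottom.
    J = np.vstack([np.eye(n), np.eye(n)[pose_slice]])
    x_aug = J @ x                    # [old state, cloned pose]
    P_aug = J @ P @ J.T              # correlations with the clone are kept
    return x_aug, P_aug

# Usage sketch: pose stored in the first 6 entries of a 15-dimensional state.
x = np.zeros(15); P = np.eye(15)
x_aug, P_aug = clone_last_image_pose(x, P, slice(0, 6))
assert x_aug.size == 21 and P_aug.shape == (21, 21)
```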