2016
DOI: 10.1177/1687814016640996
Computationally efficient visual–inertial sensor fusion for Global Positioning System–denied navigation on a small quadrotor

Abstract: Because of the complementary nature of visual and inertial sensors, the combination of both can provide fast and accurate 6 degree-of-freedom state estimation, the fundamental requirement for robotic (especially unmanned aerial vehicle) navigation tasks in Global Positioning System-denied environments. This article presents a computationally efficient visual-inertial fusion algorithm that separates orientation fusion from the position fusion process. The algorithm is designed to perform 6 degr…
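The paper's actual filter equations are not reproduced on this page; the following is only a minimal sketch of the decoupled idea the abstract describes, under assumed forms: a complementary filter for orientation (fusing fast, drifting gyro integration with a slow, drift-free accelerometer-derived angle) feeding a separate, simple constant-velocity Kalman filter that fuses IMU-propagated position with camera position fixes. All names, gains, and noise parameters here are illustrative assumptions.

```python
import numpy as np

def complementary_orientation(angle_prev, gyro_rate, accel_angle, dt, alpha=0.98):
    """Orientation step: blend gyro integration (high-pass, fast but drifting)
    with an accelerometer-derived angle (low-pass, drift-free)."""
    return alpha * (angle_prev + gyro_rate * dt) + (1.0 - alpha) * accel_angle

class PositionKF:
    """Position step: 1-D constant-velocity Kalman filter. IMU acceleration
    drives the prediction; visual (camera) position fixes arrive as
    measurements. Illustrative noise levels q (process) and r (measurement)."""
    def __init__(self, q=1e-2, r=1e-2):
        self.x = np.zeros(2)   # state: [position, velocity]
        self.P = np.eye(2)     # state covariance
        self.q, self.r = q, r

    def predict(self, accel, dt):
        F = np.array([[1.0, dt], [0.0, 1.0]])     # constant-velocity model
        B = np.array([0.5 * dt**2, dt])           # acceleration input
        self.x = F @ self.x + B * accel
        self.P = F @ self.P @ F.T + self.q * np.eye(2)

    def update(self, z_pos):
        H = np.array([[1.0, 0.0]])                # camera observes position only
        S = H @ self.P @ H.T + self.r             # innovation covariance
        K = (self.P @ H.T) / S                    # Kalman gain
        self.x = self.x + (K * (z_pos - H @ self.x)).ravel()
        self.P = (np.eye(2) - K @ H) @ self.P
```

Running the two filters in sequence, orientation first and position second, is what keeps the per-update cost low relative to a single coupled filter over the full state.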


Cited by 5 publications (2 citation statements)
References 20 publications
“…Although [27] proposed a modified linear Kalman filter to fuse inertial and visual data, its accurate orientation estimates rested on the assumption that gyroscope measurements can be trusted for up to several minutes. In [28], the authors proposed a novel fusion algorithm that separates the orientation fusion and position fusion processes, but the orientation estimation is robust only for static or slow movement without magnetic distortions, using the method proposed in [29]. In contrast, in this paper, the orientation is first estimated by our previously proposed orientation filter in [2], using only inertial measurements.…”
Section: Related Work
confidence: 99%
“…Micro cameras have been proposed for MAV applications, e.g., a PAL (Phase Alternating Line) camera with 720 × 576 pixels at 25 fps [16] or a CMOS (Complementary Metal-Oxide Semiconductor) camera with 752 × 480 pixels at 80 fps [19], together with visual simultaneous localization and mapping (SLAM) algorithms.…”
Section: Introduction
confidence: 99%