2012 IEEE International Conference on Robotics and Automation
DOI: 10.1109/icra.2012.6224983

Relative navigation and control of a hexacopter

Abstract: This paper discusses the progress made on developing a multi-rotor helicopter equipped with a vision-based ability to navigate through an a priori unknown, GPS-denied environment. We highlight the backbone of our system, the relative estimation and control. We depart from the common practice of using a globally referenced map, preferring instead to keep the position and yaw states in the EKF relative to the current map node. This relative navigation approach allows simple application of sensor updates, natural…
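The relative-to-current-node formulation described in the abstract can be summarized with a short sketch. The Python snippet below is a minimal illustration, not the authors' implementation: the class name, method names, and the (position, yaw) edge format are assumptions introduced here to show how relative states might be propagated and then reset whenever a new map node is declared.

```python
import numpy as np

# Minimal sketch (not the paper's implementation) of relative-navigation
# bookkeeping: the front-end filter keeps position and yaw *relative to the
# current map node*, and those states are reset when a new node is declared.
# Class and method names are illustrative assumptions.

class RelativeFrontEnd:
    def __init__(self):
        # Relative states: position (x, y, z) and yaw w.r.t. the current node.
        self.rel_pos = np.zeros(3)
        self.rel_yaw = 0.0
        self.edges = []  # relative transforms handed to the map back-end

    def propagate(self, dpos_body, dyaw):
        """Accumulate odometry, rotated into the current node frame."""
        c, s = np.cos(self.rel_yaw), np.sin(self.rel_yaw)
        R = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
        self.rel_pos += R @ dpos_body
        self.rel_yaw += dyaw

    def declare_new_node(self):
        """Store the relative transform as a graph edge and reset the states."""
        self.edges.append((self.rel_pos.copy(), self.rel_yaw))
        self.rel_pos = np.zeros(3)
        self.rel_yaw = 0.0
```

Because the relative states are zeroed at every new node, the front-end filter's position uncertainty stays bounded; stitching the stored edges into a globally consistent map is left to a back-end, as the citing works below describe.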


Cited by 20 publications (13 citation statements) | References 14 publications
“…The main contribution of this article is the development of a multiplicative extended Kalman filter (MEKF) that uses an improved rotorcraft model [14] to provide relative, rather than global, state estimates. Another contribution is a more detailed development of the relative navigation approach originally described in [19] and the relative filtering approach described in [18]. Additionally, we verify the results of the relative estimation approach with hardware flight tests accompanied by comparisons to motion capture truth data.…”
Section: Introduction
confidence: 89%
“…In [18], [19], the authors propose that a vehicle should navigate using a relative formulation of the vehicle state, rather than a global one. Similar to the general approach, they use a combination of graph SLAM and an EKF to provide mapping and sensor fusion.…”
Section: Introduction
confidence: 99%
“…In [20] and [21], the authors propose that a vehicle should navigate using a relative formulation of the vehicle state, rather than a global one. A combination of graph SLAM and an EKF is used to provide mapping and sensor fusion.…”
Section: III.B Relative Navigation Approach
confidence: 99%
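The two statements above describe pairing an EKF front-end (relative state estimation) with a graph-SLAM back-end (mapping over node poses). As a hedged illustration of that split, the sketch below chains relative (translation, yaw) edges, in the same hypothetical format as the earlier snippet, into an approximate global pose, roughly what a back-end would compute before any loop-closure optimization.

```python
import numpy as np

# Hedged sketch of composing the relative edges produced by a front-end into
# an approximate global pose. The (translation, yaw) edge format matches the
# earlier sketch and is an assumption, not the cited papers' representation.

def compose_edges(edges):
    pos = np.zeros(3)
    yaw = 0.0
    for dpos, dyaw in edges:
        c, s = np.cos(yaw), np.sin(yaw)
        R = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
        pos = pos + R @ dpos   # rotate the edge into the global frame
        yaw = yaw + dyaw       # accumulate heading
    return pos, yaw
```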
“…The resulting scheme utilizes IMU state and covariance propagation information to aid the feature matching of the stereo vision, leading to increased efficiency and robustness of MAV state estimation in complex industrial environments. In addition, EKF based methods have also proven useful for RGB-D vision-aided navigation of MAVs, and relevant examples can be found in [17,18]. Among the variants of the EKFs, multiplicative extended Kalman filters (MEKF) are especially useful for MAV attitude estimation applications.…”
Section: Related Work
confidence: 99%
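Several of the citing works use a multiplicative extended Kalman filter (MEKF) for attitude. Its defining step is that the Kalman correction is a small rotation vector applied to the quaternion estimate by quaternion multiplication rather than by additive correction. The sketch below illustrates only that step; the Hamilton, scalar-first quaternion convention and the function names are assumptions made here for illustration.

```python
import numpy as np

# Minimal sketch of the multiplicative attitude update that characterizes an
# MEKF: the correction delta_theta (a small rotation vector from the Kalman
# update) is turned into an error quaternion and applied multiplicatively.
# Hamilton, scalar-first convention assumed.

def quat_mul(q, r):
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return np.array([
        w1 * w2 - x1 * x2 - y1 * y2 - z1 * z2,
        w1 * x2 + x1 * w2 + y1 * z2 - z1 * y2,
        w1 * y2 - x1 * z2 + y1 * w2 + z1 * x2,
        w1 * z2 + x1 * y2 - y1 * x2 + z1 * w2,
    ])

def mekf_attitude_update(q_est, delta_theta):
    """Apply a small-angle error quaternion to the attitude estimate."""
    dq = np.concatenate(([1.0], 0.5 * delta_theta))  # small-angle approximation
    q_new = quat_mul(q_est, dq)
    return q_new / np.linalg.norm(q_new)             # renormalize
```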
“…RGB-D devices are based on structured light technologies and can provide depth data even in poorly textured environments. Taking advantage of RGB-D devices, many researchers have achieved successful results in the field of indoor MAV navigation, such as state estimation, control and indoor mapping [17,18]. Despite these advances achieved in this domain, there is still significant progress to be made in developing more robust and computationally efficient visual odometry approaches for MAVs in complex environments, using lightweight and low-cost RGB-D devices and MEMS sensors.…”
Section: Introduction
confidence: 99%