Proceedings 2006 IEEE International Conference on Robotics and Automation, 2006. ICRA 2006.
DOI: 10.1109/robot.2006.1642132
Omnidirectional vision on UAV for attitude computation

Cited by 37 publications (38 citation statements). References 13 publications.
“…They proposed to equip a Micro Air Vehicle (MAV) with a perspective camera to provide vision-guided flight stability and autonomy. Omnidirectional sensors for attitude estimation were first introduced by [2]. These omnidirectional sensors (the fisheye and catadioptric cameras shown in figure (2)) were used in different scenarios.…”
Section: Vision Sensors for Attitude Estimation (mentioning, confidence: 99%)
“…Using omnidirectional vision, some algorithms adopt a Markovian formulation of sky/ground segmentation based on color information [2]; alternatively, the sky/ground partition is computed on the spherical image by optimizing the Mahalanobis distance between the two regions, with the search for points of either region carried out in RGB space [11].…”
Section: Sky/Ground Segmentation (mentioning, confidence: 99%)
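The RGB-space Mahalanobis criterion mentioned in the citation above can be sketched as follows. This is only an illustrative per-pixel classifier, not the cited optimization procedure; the class statistics (`mu_sky`, `cov_sky`, `mu_gnd`, `cov_gnd`) are hypothetical names and are assumed to have been estimated beforehand:

```python
import numpy as np

def mahalanobis_label(pixels, mu_sky, cov_sky, mu_gnd, cov_gnd):
    """Label each RGB pixel as sky (0) or ground (1), choosing the class
    with the smaller Mahalanobis distance in RGB space.

    Sketch only: assumes class means/covariances were fit elsewhere.
    pixels: (N, 3) float array of RGB values.
    """
    inv_s = np.linalg.inv(cov_sky)
    inv_g = np.linalg.inv(cov_gnd)
    d = pixels - mu_sky
    dist_sky = np.einsum('ij,jk,ik->i', d, inv_s, d)  # squared distances
    d = pixels - mu_gnd
    dist_gnd = np.einsum('ij,jk,ik->i', d, inv_g, d)
    return (dist_gnd < dist_sky).astype(int)
```

In a segmentation pipeline this pixelwise rule would typically be combined with the spatial smoothness term of the Markovian formulation rather than applied in isolation.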
“…Ying and Hu (2004) demonstrate that the occluding contour of a sphere in space is projected onto a circle on the unit sphere or onto a conic in the catadioptric image plane. Considering the skyline as the occluding contour on the Earth's spherical surface, finding it requires searching for a small circle on the unit-sphere model or for a conic or ellipse on the image plane, as proposed by Demonceaux et al (2006) (see Fig. 3 for an illustration).…”
Section: Skyline and Catadioptric Image (mentioning, confidence: 99%)
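The small-circle idea in the citation above has a simple geometric core: skyline points on the unit sphere lie (approximately) on a plane, and the plane's unit normal encodes the camera's orientation relative to the horizon. A minimal least-squares sketch of that plane fit, assuming the skyline points have already been detected and lifted to the sphere (the function name is hypothetical):

```python
import numpy as np

def horizon_normal(points):
    """Fit the plane n.x = d through unit-sphere points on the skyline
    (a small circle) and return the unit normal n.

    Least-squares sketch of the geometric idea, not the cited fitting
    procedure. points: (N, 3) array of points on the unit sphere.
    """
    # Solve A w = 1 in the least-squares sense, where w = n / d.
    w, *_ = np.linalg.lstsq(points, np.ones(len(points)), rcond=None)
    return w / np.linalg.norm(w)
```

The attitude (roll/pitch) of the camera then follows from the orientation of this normal in the camera frame, under whichever axis convention the system uses.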
“…Conroy et al (2009) use spatial decompositions of the instantaneous optic flow to extract local proximity information from catadioptric images obtained onboard a micro air vehicle (MAV) for navigation in corridor-like environments. Demonceaux et al (2006) use an approach similar to the one presented in this paper, showing the advantages of omnidirectional over perspective images for attitude estimation. They detect the horizon line in the catadioptric image using Markov Random Field segmentation and then project it onto the equivalent-sphere projection model for a catadioptric system (Geyer and Daniilidis 2000) to obtain the attitude angles of the camera frame, which are related to the normal vector of the projected horizon line.…”
Section: Introduction (mentioning, confidence: 96%)
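The projection onto the equivalent sphere referenced in the citation above (the unified catadioptric model of Geyer and Daniilidis 2000) amounts to lifting each normalized image point to the unit sphere. A sketch of that lifting step, assuming the image point has already been undistorted and normalized by the camera intrinsics, with mirror parameter `xi`:

```python
import numpy as np

def lift_to_sphere(x, y, xi):
    """Lift a normalized image point (x, y) to the unit sphere of the
    unified catadioptric model with mirror parameter xi.

    Sketch only: intrinsic calibration and normalization are assumed
    to have been applied already. Returns a 3-vector of unit norm.
    """
    s = x * x + y * y
    eta = (xi + np.sqrt(1.0 + (1.0 - xi * xi) * s)) / (s + 1.0)
    return np.array([eta * x, eta * y, eta - xi])
```

With xi = 0 the model reduces to an ordinary perspective camera, and the lifted point is simply the normalized viewing ray; horizon points lifted this way are the inputs to the small-circle/plane fit that yields the attitude angles.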