2018
DOI: 10.1177/1550147718800655

A hierarchical vision-based localization of rotor unmanned aerial vehicles for autonomous landing

Abstract: The vision-based localization of rotor unmanned aerial vehicles for autonomous landing is challenging because of the limited detection range. In this article, to extend the vision detection and measurement range, a hierarchical vision-based localization method is proposed for unmanned aerial vehicle autonomous landing. In such a hierarchical framework, the landing is divided into three phases: "Approaching," "Adjustment," and "Touchdown," in which visual artificial features at different scales can be detected f…
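As a reading aid, the three-phase structure described in the abstract can be sketched as a simple state machine. The phase names follow the abstract; the altitude thresholds, marker scales, and the `detect_marker` helper are hypothetical placeholders, not details taken from the paper.

```python
# Minimal sketch of a hierarchical landing state machine, using the three phase
# names from the abstract. Thresholds, marker scales, and detect_marker() are
# hypothetical placeholders, not values from the paper.

APPROACHING, ADJUSTMENT, TOUCHDOWN = "Approaching", "Adjustment", "Touchdown"

# Assumed mapping: each phase looks for a marker scale that stays detectable
# at that range (large pattern far away, small pattern up close).
PHASE_MARKER = {APPROACHING: "large", ADJUSTMENT: "medium", TOUCHDOWN: "small"}
PHASE_SWITCH_ALTITUDE_M = {APPROACHING: 10.0, ADJUSTMENT: 2.0}  # assumed thresholds


def next_phase(phase: str, altitude_m: float) -> str:
    """Descend through the phases as the estimated altitude decreases."""
    if phase == APPROACHING and altitude_m < PHASE_SWITCH_ALTITUDE_M[APPROACHING]:
        return ADJUSTMENT
    if phase == ADJUSTMENT and altitude_m < PHASE_SWITCH_ALTITUDE_M[ADJUSTMENT]:
        return TOUCHDOWN
    return phase


def landing_step(phase: str, frame, detect_marker):
    """One control step: detect the phase-appropriate marker, then re-evaluate the phase.

    detect_marker(frame, scale) is a stand-in for any vision pipeline that
    returns a relative pose (or None) for the marker of the requested scale.
    """
    pose = detect_marker(frame, PHASE_MARKER[phase])
    if pose is None:
        return phase, None  # keep the current phase if the marker is lost
    return next_phase(phase, pose.altitude_m), pose


if __name__ == "__main__":
    # Tiny demo with a dummy detector that always reports 5 m altitude.
    class Pose:  # hypothetical pose container
        altitude_m = 5.0

    phase, pose = landing_step(APPROACHING, frame=None, detect_marker=lambda f, s: Pose())
    print(phase)  # -> "Adjustment", since 5 m is below the assumed 10 m threshold
```

A real controller would add hysteresis and failure handling (for example, climbing back to re-acquire a lost marker), but the sketch shows the core idea: each phase relies on features at the scale the camera can actually resolve at that range.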

Cited by 4 publications (3 citation statements) · References 34 publications (34 reference statements)
“…A plethora of different autonomous landing approaches is discussed in [17]. The existing solutions pertaining to outdoor landing, which we will focus upon, can mainly be classified into "known" [18][19][20][21] and "unknown" environments [22]. The landing task can be broken down into two main phases: (1) finding an appropriate landing spot and (2) executing the landing maneuver.…”
Section: Related Literature
confidence: 99%
“…Vision-based autonomous landing has used landing markers in an "H"-shape pattern; landing markers inspired by QR code [2]; ArUco markers [3], AprilTag markers [4]; and special-pattern black and white markers [5] and color markers [6]. An onboard implementation of the computer vision algorithms to detect a moving platform has been demonstrated using CPU [7] and GPU [8].…”
Section: State of the Art
confidence: 99%
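As a concrete illustration of the fiducial-marker detection these citing works rely on, the sketch below detects ArUco markers and recovers a relative pose with OpenCV. The dictionary, marker side length, camera intrinsics, and image path are assumptions made for the example (OpenCV ≥ 4.7 with the contrib aruco module is assumed); none of it is taken from the cited papers.

```python
# Sketch of ArUco marker detection and pose recovery with OpenCV
# (assumes opencv-contrib-python >= 4.7). Dictionary, marker size, and camera
# intrinsics below are illustrative assumptions, not values from the cited works.
import cv2
import numpy as np

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

frame = cv2.imread("landing_pad.png")  # placeholder image path
if frame is None:
    raise SystemExit("no input image found at the placeholder path")

gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
corners, ids, _ = detector.detectMarkers(gray)

if ids is not None:
    # Pose of the first marker, assuming a 0.2 m side length and rough intrinsics.
    K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
    dist = np.zeros(5)
    side = 0.2
    obj_pts = np.array([[-side / 2,  side / 2, 0],
                        [ side / 2,  side / 2, 0],
                        [ side / 2, -side / 2, 0],
                        [-side / 2, -side / 2, 0]], dtype=np.float32)
    ok, rvec, tvec = cv2.solvePnP(obj_pts, corners[0].reshape(4, 2), K, dist)
    if ok:
        print("marker", ids[0][0], "at", tvec.ravel(), "metres in the camera frame")
```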
“…Then, RANSAC is used to stitch the fine line segments to detect the whole horizon line. To solve this problem, some researchers attempt to utilize the unmanned aerial vehicle (UAV) to enhance the visual perception ability of the USV [25,26], which increases the complexity of the unmanned system.…”
Section: Related Work
confidence: 99%
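The RANSAC step mentioned in that excerpt, fitting one dominant line (the horizon) through noisy segment endpoints, can be sketched as follows. The iteration count and inlier tolerance are illustrative assumptions, not parameters from the cited work.

```python
# Minimal RANSAC line fit, illustrating how scattered edge/segment points can be
# stitched into one dominant line (e.g. a horizon). Iteration count and inlier
# tolerance are illustrative assumptions.
import numpy as np


def ransac_line(points: np.ndarray, n_iters: int = 200, inlier_tol: float = 2.0):
    """Fit a line y = m*x + b to Nx2 points; return (m, b, inlier mask)."""
    rng = np.random.default_rng(0)
    best_inliers = np.zeros(len(points), dtype=bool)
    for _ in range(n_iters):
        i, j = rng.choice(len(points), size=2, replace=False)
        (x1, y1), (x2, y2) = points[i], points[j]
        if np.isclose(x1, x2):  # skip near-vertical samples for this parameterisation
            continue
        m = (y2 - y1) / (x2 - x1)
        b = y1 - m * x1
        # Perpendicular distance of every point to the candidate line.
        dist = np.abs(m * points[:, 0] - points[:, 1] + b) / np.sqrt(m * m + 1)
        inliers = dist < inlier_tol
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    if not best_inliers.any():
        raise ValueError("no consensus line found")
    # Refit on the consensus set with least squares.
    m, b = np.polyfit(points[best_inliers, 0], points[best_inliers, 1], 1)
    return m, b, best_inliers
```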