2017 17th International Conference on Control, Automation and Systems (ICCAS)
DOI: 10.23919/iccas.2017.8204466
Vision based autonomous landing of an Unmanned Aerial Vehicle on a stationary target

Cited by 25 publications (9 citation statements)
References 10 publications
“…First, the following experiment is used to verify the IAEC algorithm. Five desired waypoints, [2,1,2], [3,5,3], [5,2,5], [10,8,4], and [12,2,5], are given in this experiment. In addition, the velocities and accelerations at the start and end points are zero.…”
Section: Landmarks
confidence: 99%
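The quoted experiment fixes waypoints with zero velocity and acceleration at the start and end points. The IAEC algorithm itself is not described in this excerpt; as a minimal illustrative sketch of those boundary conditions, a quintic (minimum-jerk) blend between two of the quoted waypoints satisfies them exactly. The function name and segment duration below are hypothetical, not from the paper.

```python
import numpy as np

def quintic_segment(p0, p1, T, n=50):
    """Interpolate from waypoint p0 to p1 over duration T with a quintic
    blend, giving zero velocity and zero acceleration at both endpoints
    (the boundary conditions stated in the quoted experiment)."""
    p0, p1 = np.asarray(p0, float), np.asarray(p1, float)
    s = np.linspace(0.0, T, n) / T          # normalized time in [0, 1]
    blend = 10 * s**3 - 15 * s**4 + 6 * s**5  # quintic minimum-jerk profile
    return p0 + np.outer(blend, p1 - p0)      # shape (n, 3) trajectory

# Two of the waypoints quoted above, with an assumed 2-second duration.
traj = quintic_segment([2, 1, 2], [3, 5, 3], T=2.0)
```

The blend polynomial's first and second derivatives vanish at s = 0 and s = 1, which is what makes the segment start and stop at rest; chaining such segments through all five quoted waypoints would yield a full rest-to-rest trajectory.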
“…Generally, the 2D landmark is arranged on the top of the moving platform, so its size is often strictly limited. To provide a large range of localization data under this limitation, researchers have adopted landmarks such as 2D codes, ring landmarks, or character landmarks [8][9][10]. Although interesting results have been achieved, they are not necessarily applicable to dynamically moving targets in an open outdoor environment…”
Section: Introduction
confidence: 99%
“…The relative pose between the current and the previous frames could be tracked by observing a structured-unknown object [8][9][10]. Based on speeded-up robust features (SURF) feature descriptors and a fast approximate nearest neighbor search (FLANN) matcher, a template matching method [11] was presented to determine the relative position of the landing target. Also, artificial neural networks (ANN) have been employed to estimate the state of the landing UAV in Moriarty et al. [12]. Similar to our work, Araar et al. [13] have designed an adequate pad to extend the detection range.…”
Section: Previous Work
confidence: 99%
“…Ye et al. [24] proposed a novel field scene recognition algorithm based on the combination of local pyramid feature and convolutional neural network (CNN) learning feature to identify the complex and autonomous landing scene for the low-small-slow UAV. For identifying the autonomous landing of a UAV on a stationary target, Sudevan et al. [25] used the speeded up robust features (SURF) method to detect and compute the key-point descriptors. The fast-approximate nearest neighbor search library (FLANN) was utilized to achieve the accurate matching of the template image with the captured images to determine the target position.…”
Section: Introduction
confidence: 99%
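The citing papers describe the core idea as template matching: locate a known landing-pad template in each captured frame. The paper's actual pipeline uses SURF key-point descriptors with FLANN matching; as a self-contained stand-in, the sketch below uses plain normalized cross-correlation on a synthetic scene. All names and the synthetic data here are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def locate_template(image, template):
    """Slide `template` over `image` and return the (row, col) of the best
    normalized cross-correlation score -- a minimal stand-in for the
    SURF + FLANN template matching described in the quoted statements."""
    ih, iw = image.shape
    th, tw = template.shape
    t = template - template.mean()
    t_norm = np.sqrt((t ** 2).sum())
    best, best_pos = -np.inf, (0, 0)
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            patch = image[r:r + th, c:c + tw]
            p = patch - patch.mean()
            denom = np.sqrt((p ** 2).sum()) * t_norm
            if denom == 0:
                continue  # flat patch: correlation undefined, skip it
            score = (p * t).sum() / denom
            if score > best:
                best, best_pos = score, (r, c)
    return best_pos, best

# Synthetic check: embed the "landing pad" template at a known offset.
rng = np.random.default_rng(0)
scene = rng.random((40, 40))
pad = scene[10:18, 22:30].copy()
pos, score = locate_template(scene, pad)
```

A feature-based pipeline like the paper's trades this exhaustive scan for sparse key-point matches, which is what makes it robust to the scale and rotation changes a descending UAV camera sees.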