2022
DOI: 10.3390/s22010404

Proactive Guidance for Accurate UAV Landing on a Dynamic Platform: A Visual–Inertial Approach

Abstract: This work aimed to develop an autonomous system for unmanned aerial vehicles (UAVs) to land on moving platforms such as an automobile or a marine vessel, providing a promising solution for a long-endurance flight operation, a large mission coverage range, and a convenient recharging ground station. Unlike most state-of-the-art UAV landing frameworks that rely on UAV onboard computers and sensors, the proposed system fully depends on the computation unit situated on the ground vehicle/marine vessel to serve as …


Cited by 27 publications (18 citation statements)
References 26 publications (25 reference statements)
“…This is one of the examples that highlights the advantage of our approach, where a common log polynomial speed controller function (albeit with different parameter values) for horizontal and vertical speed control provides explicit control over the speed profile characteristics resulting in a time-efficient landing. From an evaluation perspective, most approaches in literature reported the results in the form of aggregate landing accuracy from a limited number of trial runs [ 7 , 27 , 52 ]. We improve on the existing work through a more extensive evaluation.…”
Section: Discussion
Citation type: mentioning | Confidence: 99%
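The quoted passage refers to a shared log-polynomial speed controller used for both horizontal and vertical speed commands, with different parameter values per axis. The exact functional form and coefficients are not given here, so the following Python sketch is purely illustrative: `log_poly_speed`, its coefficients, and the speed limits are hypothetical placeholders, not the cited authors' controller.

```python
import numpy as np

def log_poly_speed(distance, coeffs=(0.8, 0.4), v_min=0.1, v_max=2.0):
    """Hypothetical log-polynomial speed profile: commanded speed is a
    polynomial in log(1 + distance), clipped to assumed speed limits.
    All parameter values are illustrative, not taken from the cited paper."""
    x = np.log1p(distance)  # log of remaining distance to the landing pad
    v = sum(c * x**i for i, c in enumerate(coeffs, start=1))
    return float(np.clip(v, v_min, v_max))

# Same functional form, different (assumed) parameters for each axis,
# mirroring the idea of one controller shape reused horizontally and vertically.
v_horizontal = log_poly_speed(5.0, coeffs=(1.0, 0.3))  # approach speed toward the pad
v_vertical   = log_poly_speed(2.0, coeffs=(0.5, 0.2))  # descent speed above the pad
print(v_horizontal, v_vertical)
```

The design intent suggested by the quote is that a single, explicitly parameterized profile makes the speed-versus-distance behavior easy to tune for a time-efficient landing; the sketch above only illustrates that general shape.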
“…Chang [16] aimed to develop an autonomous system for UAVs to land on moving platforms such as an automobile or a marine vessel, providing a promising solution for a long-endurance flight operation, a large mission coverage range, and a convenient recharging ground station. Their system fully depends on the computation unit situated on the ground vehicle/marine vessel to serve as a landing guidance system.…”
Section: A. Existing Algorithms
Citation type: mentioning | Confidence: 99%
“…We will elaborate on these related technologies in the "Related work" section. In the relevant state-of-the-art literature that we have collected, most of them [3][4][5][6][7][8][9][10][11][12][13][14][15][16][17] studied the landing technology of rotorcraft UAVs, and only [1][2] investigated the technology related to the fixed-wing UAV landing. However, Kong [1] stated that his method relies on the ground-based stereo guidance system, and Zhang [2] focused on a method for high-precision altitude estimation.…”
Section: Introduction
Citation type: mentioning | Confidence: 99%
“…Nevertheless, this technology is not always available or it is sometimes incapable of giving an acceptable level of accuracy, as can happen when the inside of a tank of a petrochemical plant is to be inspected. For this reason, in the literature, complementary landing assistance systems (LASs) are proposed based on computer vision techniques [ 9 , 10 , 11 , 12 , 13 , 14 , 15 , 16 , 17 , 18 , 19 ], a fusion between computer vision techniques and inertial measurement units (IMUs) [ 20 , 21 , 22 , 23 , 24 , 25 ], computer vision, IMU and ultrasonic sensors [ 26 ], computer vision and a Time-of-Flight-based height sensor [ 27 ], computer vision and GNSS [ 28 , 29 ] and even an approach fusing onboard cameras and a robotic total station [ 30 ]. The main setback of traditional vision-based systems is their strong dependency on weather or lighting conditions.…”
Section: Introduction
Citation type: mentioning | Confidence: 99%