2019
DOI: 10.3390/s19020408

Infrared-Inertial Navigation for Commercial Aircraft Precision Landing in Low Visibility and GPS-Denied Environments

Abstract: This paper proposes a novel infrared-inertial navigation method for the precise landing of commercial aircraft in low-visibility and Global Positioning System (GPS)-denied environments. Within a square-root unscented Kalman filter (SR-UKF), inertial measurement unit (IMU) data, forward-looking infrared (FLIR) images, and airport geo-information are integrated to estimate the position, velocity, and attitude of the aircraft during landing. Homography between the synthetic image and the real image, which implicates th…
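
As a minimal, hypothetical sketch of the homography idea mentioned in the abstract (not the paper's implementation): runway points rendered into a synthetic image from the inertial pose and airport geo-information are matched against the same points detected in the real FLIR frame, and the synthetic-to-real homography, or the pixel residuals behind it, can serve as the measurement that corrects the navigation state. All coordinates and the four-point matching below are illustrative assumptions.

```python
# Illustrative sketch with assumed values: estimate the homography that maps
# runway points rendered from the inertial pose ("synthetic" image) onto the
# same points detected in the real FLIR frame.
import numpy as np
import cv2

# Runway outline points as rendered in the synthetic image (pixels, made up).
synthetic_pts = np.array([[300.0, 430.0], [340.0, 430.0],
                          [328.0, 260.0], [312.0, 260.0]], dtype=np.float64)

# The same points as detected in the real FLIR image (pixels, made up).
# The offset between the two sets reflects accumulated inertial drift.
real_pts = np.array([[306.0, 436.0], [347.0, 435.0],
                     [331.0, 263.0], [316.0, 262.0]], dtype=np.float64)

# Homography mapping synthetic pixels to real pixels; with exactly four
# correspondences the fit is exact, with denser line features RANSAC helps.
H, _ = cv2.findHomography(synthetic_pts, real_pts)
print("synthetic-to-real homography:\n", H)

# A navigation filter can then use the pixel residuals (or H itself) as the
# measurement that corrects the position, velocity and attitude estimates.
residuals = real_pts - synthetic_pts
print("pixel residuals:\n", residuals)
```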

Cited by 19 publications (16 citation statements) · References 46 publications
“…The use of an active sensor on those scenarios can tackle some of those limitations and provide alternatives for autonomous navigation, as they have in other domains of study [31]. Infrared-inertial precision landing [32] and thermal-inertial localization [33,34] are examples of active sensor solutions utilizing infrared wavelengths. Additionally, advancements concerning navigation, mapping and geolocalization have been made with Light Detection and Ranging (LiDAR) [35] and Synthetic Aperture Radar (SAR) sensors [36,37].…”
Section: Introduction
Confidence: 99%
“…Vision-based navigation in general has three main components: sensor type (ultraviolet [10], infrared [11,12], visible RGB/mono [11,13], stereo [14]), a priori data (terrain model [10], satellite images [15], UAV images [16], 3D position of key features [11,13,14]), and data accumulation (single frame [11,13,14] and optic flow/SLAM [6,16,17,18,19,20]). Runway relative navigation for autonomous landing is a special case.…”
Section: Introduction
Confidence: 99%
“…Binary images of a complete polygon model of the runway with taxiway exits is generated in [21] from a neighborhood of the IMU/GPS pose, and these images are compared with the shifted hue channel of the HSV version of an RGB camera image to get the best fit. From IMU pose and runway geoinformation, four edges are rendered in [12] and line features are fitted in the real image which defined the homography from the synthetic image to the real image. Four corner points of the runway (parallel lines with four points) are also enough for 6D pose calculations.…”
Section: Introduction
Confidence: 99%
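
The four-corner pose claim in the excerpt above can be made concrete with a small, hypothetical sketch: given the runway's four corner coordinates in a local runway frame and their detections in the image, a planar PnP solver recovers the full 6-DoF camera pose. The runway dimensions, pixel coordinates, camera intrinsics, and corner ordering below are made-up placeholders, not values from the cited papers.

```python
# Illustrative sketch with assumed values: 6-DoF camera pose from the four
# runway corner points, which are coplanar in the runway frame.
import numpy as np
import cv2

# Runway corners in a local runway frame (metres): an assumed 3000 m x 45 m
# strip, ordered near-left, near-right, far-right, far-left; Z = 0 (coplanar).
runway_corners_3d = np.array([[0.0,    -22.5, 0.0],
                              [0.0,     22.5, 0.0],
                              [3000.0,  22.5, 0.0],
                              [3000.0, -22.5, 0.0]], dtype=np.float64)

# Corresponding corner detections in the image (pixels, made up). In practice
# they would come from the fitted runway edge lines.
corners_2d = np.array([[280.0, 420.0],
                       [360.0, 420.0],
                       [330.0, 250.0],
                       [310.0, 250.0]], dtype=np.float64)

# Placeholder pinhole intrinsics and zero distortion.
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
dist = np.zeros(5)

# IPPE handles four coplanar points; it returns the runway-to-camera rotation
# (as a Rodrigues vector) and translation.
ok, rvec, tvec = cv2.solvePnP(runway_corners_3d, corners_2d, K, dist,
                              flags=cv2.SOLVEPNP_IPPE)
if ok:
    R, _ = cv2.Rodrigues(rvec)
    cam_pos = -R.T @ tvec  # camera position expressed in the runway frame
    print("camera position in runway frame (m):", cam_pos.ravel())
```

This only illustrates the geometric claim; with noisy detections the closed-form pose would normally be refined by minimizing reprojection error.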
“…The landing trajectory is calculated and sent to the UAV via radio to perform Command and Control (C2). A vision system is more reliable in environments where we can have a Global Positioning System (GPS) or radar jamming (Acuna, Zhang, & Willert, 2018; L. Zhang, Zhai, He, Wen, & Niu, 2019).…”
Section: Introduction
Confidence: 99%