AIAA SCITECH 2022 Forum, 2022
DOI: 10.2514/6.2022-1214

Mars 2020 Lander Vision System Flight Performance

Cited by 20 publications (10 citation statements)
References 8 publications
“…Recently, Downes et al. [9] presented a deep learning method for lunar crater detection to improve TRN landmark tracking. The Lander Vision System (LVS) [10] used for the Mars 2020 mission performs vision-based landmark matching starting at an altitude of 4200 m above the Martian surface, with the objective of achieving less than 40 m of error with respect to the landing site. Our analysis focuses on higher altitudes and on a larger span of altitudes (4.5 km to 33 km for the balloon dataset).…”
Section: Related Work
confidence: 99%
“…However, given weight and size constraints, range sensors are not suitable for a weight-restricted UAV. The Mars 2020 mission deploys LVS, a lander vision system, to detect landing hazards [14]. Given an on-board map with predetermined hazard locations, LVS uses a monocular camera to estimate the spacecraft's position during descent and triggers an avoidance maneuver if necessary.…”
Section: Related Work
confidence: 99%
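The statement above describes map-relative localization from a single descent camera. As a rough illustration only (not the flight LVS algorithm), the sketch below recovers a camera position from known 3D landmark positions in an onboard map and their matched 2D image detections using a PnP solver; the landmark coordinates, camera intrinsics, and nadir-pointing pose are all illustrative assumptions.

```python
# Minimal map-relative position estimation sketch: known 3D map landmarks
# matched to 2D image detections, camera pose recovered with RANSAC PnP.
import numpy as np
import cv2

# Hypothetical 3D landmark positions in the map (landing-site) frame, metres.
landmarks_map = np.array([
    [-120.0,  340.0, 0.0],
    [ 410.0, -220.0, 5.0],
    [ -35.0, -510.0, 2.0],
    [ 600.0,  150.0, 8.0],
    [-480.0,  -60.0, 1.0],
    [ 250.0,  470.0, 3.0],
], dtype=np.float64)

# Assumed pinhole camera intrinsics (focal length and principal point, pixels).
K = np.array([[1400.0,    0.0, 640.0],
              [   0.0, 1400.0, 512.0],
              [   0.0,    0.0,   1.0]])

# Synthesize detections by projecting the landmarks through a ground-truth
# pose (camera 4200 m above the site, nadir-pointing) plus pixel noise, so the
# example is self-consistent.
rvec_true = np.array([[np.pi], [0.0], [0.0]])   # illustrative attitude
tvec_true = np.array([[0.0], [0.0], [4200.0]])
proj, _ = cv2.projectPoints(landmarks_map, rvec_true, tvec_true, K, None)
detections_px = proj.reshape(-1, 2) + np.random.normal(0.0, 0.5, (len(landmarks_map), 2))

# Solve for the camera pose; RANSAC rejects bad landmark matches.
ok, rvec, tvec, inliers = cv2.solvePnPRansac(
    landmarks_map, detections_px, K, None, reprojectionError=3.0)

if ok:
    R, _ = cv2.Rodrigues(rvec)            # rotation: map frame -> camera frame
    cam_pos_map = (-R.T @ tvec).ravel()   # camera position in the map frame
    print("Estimated camera position (m):", cam_pos_map)  # ~ (0, 0, 4200)
```

With consistent inputs the recovered position is within a few metres of the synthetic truth; a real system would feed such a fix into a navigation filter rather than use it directly.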
“…Recent decades have witnessed great technological advancement and expanding acceptance of autonomous vision-based navigation for the exploration of other celestial bodies (e.g., Moon, Mars, asteroids). These advancements in real-time, on-board OPNAV are exemplified by the technological progression from simple estimation of lander velocity with the Mars Exploration Rover's DIMES system in 2004 [18] to autonomous feature tracking that will soon be demonstrated on the OSIRIS-REx mission to asteroid Bennu [101] and during landing of the Mars Perseverance (Mars 2020) rover [61].…”
Section: Introduction
confidence: 99%
“…Many landmark-based OPNAV algorithms rely on the real-time rendering of an onboard digital elevation map (DEM) [101, 3, 41, 61]. The standard approach is to render the expected appearance of landmark patches and then compare these to regions of the navigation image, usually by means of a 2D cross-correlation.…”
Section: Introduction
confidence: 99%
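To make the patch-comparison step concrete, here is a minimal sketch, under assumed synthetic data, of matching a rendered landmark template against a search window of the navigation image by normalized 2D cross-correlation; it does not reproduce any flight software, and the image sizes and search-window parameters are placeholders.

```python
# Normalized 2D cross-correlation of a landmark template against a search
# window of the navigation image; the correlation peak gives the landmark's
# measured pixel location.
import numpy as np
import cv2

rng = np.random.default_rng(0)

# Synthetic 8-bit navigation image (stand-in for a descent camera frame).
nav_image = rng.integers(0, 256, size=(1024, 1024), dtype=np.uint8)

# Pretend the rendered landmark template is a 64x64 patch that actually
# appears in the image at (row, col) = (400, 620).
template = nav_image[400:464, 620:684].copy()

# Search only a window around the predicted landmark location (from the
# propagated state), rather than the whole image.
r0, c0, half = 380, 600, 80
search = nav_image[r0 - half:r0 + 64 + half, c0 - half:c0 + 64 + half]

# Normalized cross-correlation; the response peak is the best match.
response = cv2.matchTemplate(search, template, cv2.TM_CCOEFF_NORMED)
_, peak_val, _, peak_loc = cv2.minMaxLoc(response)  # peak_loc is (col, row)

# Convert the peak back to full-image coordinates.
match_col = c0 - half + peak_loc[0]
match_row = r0 - half + peak_loc[1]
print(f"peak correlation {peak_val:.2f} at (row={match_row}, col={match_col})")
```

In this toy setup the peak lands back at (400, 620) with correlation near 1.0; with a rendered (rather than copied) template the peak value also serves as a match-quality gate before the measurement is accepted.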