2016
DOI: 10.1109/tcsvt.2015.2452781
HW/SW Codesign and FPGA Acceleration of Visual Odometry Algorithms for Rover Navigation on Mars

Cited by 43 publications (25 citation statements); references 34 publications.
“…Despite the fact that the scope of this review focuses on the ADR scenario, it is pointed out that other applications of vision-based navigation in space rely on analogous principles and employ similar visual primitives. Such applications, for example, include obstacle avoidance, mapping, and visual odometry for planetary rovers [17,18] or approach velocity estimation and localization for precision landing [18][19][20]. As a result, the choice of a processing platform suitable for ADR is also highly relevant to visual navigation applications for planetary exploration rovers and landers.…”
Section: Algorithms for Vision-Based Navigation in Orbit
confidence: 99%
“…As a result, the choice of a processing platform suitable for ADR is also highly relevant to visual navigation applications for planetary exploration rovers and landers. Figure 2 depicts some indicative applications of VBN for rovers (top left) and ADR (bottom right); an image acquired by a Mars rover (top left) is processed by 3-D reconstruction (shown to its right, "mapping" via stereo matching) and visual odometry estimation (shown later, "localization" via feature detection and matching between successive images, annotated with circles and lines, respectively) [17], whereas edges detected on a satellite and matched against its geometric model (bottom right, red outline) serve as features for estimating the satellite's pose relative to the camera [21].…”
Section: Algorithms for Vision-Based Navigation in Orbit
confidence: 99%
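The "localization" step described in the citation above — feature matching between successive images followed by a motion estimate — can be sketched as follows. This is a minimal stdlib-only illustration with toy binary descriptors and hand-made keypoints (all hypothetical data, not the paper's actual pipeline, which uses a real detector and full stereo/egomotion estimation):

```python
# Sketch of frame-to-frame feature matching for visual odometry.
# Descriptors are toy integer-coded binary strings; keypoints are (x, y) pixels.

def hamming(a: int, b: int) -> int:
    """Hamming distance between two integer-coded binary descriptors."""
    return bin(a ^ b).count("1")

def match_features(desc_prev, desc_curr, ratio=0.8):
    """Nearest-neighbour matching with a Lowe-style ratio test.
    Returns a list of (index_in_prev, index_in_curr) pairs."""
    matches = []
    for i, d in enumerate(desc_prev):
        order = sorted(range(len(desc_curr)), key=lambda j: hamming(d, desc_curr[j]))
        best, second = order[0], order[1]
        # Accept only if the best match is clearly better than the runner-up.
        if hamming(d, desc_curr[best]) < ratio * hamming(d, desc_curr[second]):
            matches.append((i, best))
    return matches

def median_motion(kp_prev, kp_curr, matches):
    """Robust 2-D translation estimate: median of matched keypoint offsets."""
    dx = sorted(kp_curr[j][0] - kp_prev[i][0] for i, j in matches)
    dy = sorted(kp_curr[j][1] - kp_prev[i][1] for i, j in matches)
    m = len(matches) // 2
    return dx[m], dy[m]

# Toy example: three features shift by (5, -2) between frames,
# appearance (descriptors) unchanged.
kp_prev = [(10, 20), (40, 35), (70, 80)]
kp_curr = [(15, 18), (45, 33), (75, 78)]
desc_prev = [0b1010_1100, 0b0111_0001, 0b1100_0011]
desc_curr = [0b1010_1100, 0b0111_0001, 0b1100_0011]

matches = match_features(desc_prev, desc_curr)
print(median_motion(kp_prev, kp_curr, matches))  # → (5, -2)
```

A real rover pipeline would replace the toy descriptors with detected features (e.g., Harris corners with binary descriptors), triangulate matches through the stereo pair, and estimate full 6-DoF motion robustly (e.g., with RANSAC); the median offset here stands in for that robust estimation step.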
“…However, this approach requires an RGB-D camera, which is unavailable and not preferred on many low-power robotic agents, such as drones and planetary rovers. Lentaris et al [16] proposed using a space-worthy Field Programmable Gate Array (FPGA) co-processor to accelerate visual odometry by one order of magnitude over the software used by contemporary Mars rovers. This approach requires an FPGA co-processor to be added to the robot itself, which increases its weight and cost.…”
Section: Literature Review
confidence: 99%
“…Accurately localizing moving cameras from monocular videos [63] can provide scene context for high-level video understanding tasks [30], e.g., behavior analysis and action recognition, and has wide potential in many intelligent systems, e.g., robotics, autonomous vehicles, and intelligent helicopters. While GPS devices are popular in these applications, they provide camera positions of only moderate accuracy [55] and are sensitive to environment changes.…”
Section: Introduction
confidence: 99%