2023
DOI: 10.1002/rob.22151
A benchmark analysis of data‐driven and geometric approaches for robot ego‐motion estimation

Abstract: In the last decades, ego-motion estimation or visual odometry (VO) has received a considerable amount of attention from the robotic research community, mainly due to its central importance in achieving robust localization and, as a consequence, autonomy. Different solutions have been explored, leading to a wide variety of approaches, mostly grounded on geometric methodologies and, more recently, on data-driven paradigms. To guide researchers and practitioners in choosing the best VO method, different benchmark…

Cited by 2 publications (2 citation statements) · References 61 publications (119 reference statements)
“…This section discusses the results obtained comparing different pose estimation algorithms. For this purpose, we selected a set of approaches among the most popular for both vision-based (see Legittimo et al, 2023) and LIDAR-based (see Huang et al, 2022;Jonnavithula et al, 2021) pose estimation. In particular, we considered:…”
Section: Robot Pose Estimation
confidence: 99%
“…This section discusses the results obtained comparing different pose estimation algorithms. For this purpose, we selected a set of approaches among the most popular for both vision‐based (see Legittimo et al, 2023) and LIDAR‐based (see Huang et al, 2022; Jonnavithula et al, 2021) pose estimation. In particular, we considered: Direct Sparse Odometry (DSO) (Engel et al, 2017) and ORB‐SLAM2 Mono (Mur‐Artal & Tardós, 2017) for monocular camera setups; VINS‐Mono (Qin et al, 2018) for monocular‐inertial setups; ORB‐SLAM2 Stereo (Mur‐Artal & Tardós, 2017) for stereo‐camera setups; Open‐VINS (Geneva et al, 2020) for stereo‐inertial setups; FLOAM (Wang et al, 2020) and LeGO‐LOAM (Shan & Englot, 2018) for LIDAR‐based setups. We performed comparative experiments on a selection of eight sequences, considering two sequences for each cultivation field.…”
Section: Applications
confidence: 99%
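The comparisons quoted above rate each odometry method against a reference trajectory. As an illustration only (the excerpt does not state which error metric was used), the sketch below shows a common way such benchmarks are scored: the absolute trajectory error (ATE) after a closed-form Umeyama similarity alignment between estimated and ground-truth positions. All function names and the toy data are assumptions for this example, not part of the paper.

```python
# Illustrative sketch (assumed metric, not taken from the paper):
# RMSE of the absolute trajectory error after Umeyama alignment.
import numpy as np


def umeyama_alignment(est: np.ndarray, gt: np.ndarray):
    """Least-squares similarity transform (R, t, s) mapping est onto gt.

    Both inputs are (N, 3) arrays of time-associated positions.
    """
    mu_e, mu_g = est.mean(axis=0), gt.mean(axis=0)
    e, g = est - mu_e, gt - mu_g
    cov = g.T @ e / est.shape[0]           # cross-covariance, 3x3
    U, D, Vt = np.linalg.svd(cov)
    S = np.eye(3)
    if np.linalg.det(U) * np.linalg.det(Vt) < 0:
        S[2, 2] = -1                        # guard against a reflection
    R = U @ S @ Vt
    var_e = (e ** 2).sum() / est.shape[0]
    s = np.trace(np.diag(D) @ S) / var_e    # optimal scale
    t = mu_g - s * R @ mu_e
    return R, t, s


def ate_rmse(est: np.ndarray, gt: np.ndarray) -> float:
    """Root-mean-square ATE of an estimated trajectory after alignment."""
    R, t, s = umeyama_alignment(est, gt)
    aligned = (s * (R @ est.T)).T + t
    return float(np.sqrt(np.mean(np.sum((aligned - gt) ** 2, axis=1))))


if __name__ == "__main__":
    # Toy example: a noisy, rotated copy of a short ground-truth path.
    rng = np.random.default_rng(0)
    gt = np.cumsum(rng.normal(size=(100, 3)), axis=0)
    theta = 0.3
    Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
    est = (Rz @ gt.T).T + rng.normal(scale=0.05, size=gt.shape)
    print(f"ATE RMSE: {ate_rmse(est, gt):.3f} m")
```

Scale is estimated here because monocular methods such as DSO and ORB‐SLAM2 Mono recover trajectories only up to scale; for stereo, inertial, or LIDAR pipelines the scale factor can simply be fixed to 1.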