2021 20th International Conference on Advanced Robotics (ICAR)
DOI: 10.1109/icar53236.2021.9659474
ICP Localization and Walking Experiments on a TALOS Humanoid Robot

Abstract: This system paper describes the integration and evaluation of an ICP-based localization system on the TALOS humanoid robot. The new generation of flash LiDAR systems, here an Ouster OS1-64, has made it possible to obtain 3D point clouds at 10 Hz. Coupled with an Intel RealSense T265 providing visual-inertial odometry, it is possible to localize the robot and use this information to generate footsteps in real time to reach specific points. The approach is validated with a Qualisys motion capture system. It is al…
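The paper itself is not reproduced here, so the following is only a minimal sketch of the general idea described in the abstract: aligning an incoming LiDAR scan against a prior map with point-to-plane ICP, seeded by a visual-inertial odometry estimate. It assumes the Open3D library; the function and variable names (`localize_scan`, `map_cloud`, `scan_cloud`, `T_odom_prior`) are illustrative, not from the authors' implementation.

```python
# Illustrative sketch only: ICP localization of a LiDAR scan against a prior
# map, seeded with a visual-inertial odometry prior. Hypothetical code, not
# the authors' implementation.
import numpy as np
import open3d as o3d

def localize_scan(map_cloud: o3d.geometry.PointCloud,
                  scan_cloud: o3d.geometry.PointCloud,
                  T_odom_prior: np.ndarray,
                  max_corr_dist: float = 0.3) -> np.ndarray:
    """Return the estimated map-frame pose of the scan as a 4x4 matrix."""
    # Downsample both clouds to keep per-scan runtime compatible with 10 Hz.
    scan_ds = scan_cloud.voxel_down_sample(voxel_size=0.05)
    map_ds = map_cloud.voxel_down_sample(voxel_size=0.05)
    # Point-to-plane ICP requires normals on the target (map) cloud.
    map_ds.estimate_normals(
        o3d.geometry.KDTreeSearchParamHybrid(radius=0.5, max_nn=30))
    # Refine the odometry prior by registering the scan to the map.
    result = o3d.pipelines.registration.registration_icp(
        scan_ds, map_ds, max_corr_dist, T_odom_prior,
        o3d.pipelines.registration.TransformationEstimationPointToPlane())
    return result.transformation
```

The returned map-frame pose would then feed the footstep generator; how that coupling is actually done is specific to the paper and not shown here.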

Cited by 4 publications (5 citation statements). References 15 publications.
“…Fig. 2: Reconstructed intensity image; the data is from an experiment of [1] and has been divided into two parts for visibility purposes.…”
Section: Intensity-based Methods (mentioning)
confidence: 99%
“…A case is considered successful if the error in translation is below 10 cm and the error in rotation is below 5°. These thresholds were fixed relative to the problem at hand: initializing the localization system presented in [1]. We experimentally determined that the localization mechanism needs an initialization within 50 cm of translation tolerance and 30° of rotation about the z-axis.…”
Section: B. RANSAC Coarse Estimation (mentioning)
confidence: 99%
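For reference, the success criterion quoted above (translation error below 10 cm, rotation error below 5°) can be evaluated with a few lines of pose arithmetic. This sketch is ours, not taken from either paper, and assumes the estimated and ground-truth poses are available as 4x4 homogeneous matrices.

```python
# Hypothetical helper: check the quoted success criterion
# (translation error < 10 cm, rotation error < 5 degrees).
import numpy as np

def pose_within_tolerance(T_est: np.ndarray, T_gt: np.ndarray,
                          trans_tol: float = 0.10,
                          rot_tol_deg: float = 5.0) -> bool:
    # Relative error transform between ground truth and estimate.
    T_err = np.linalg.inv(T_gt) @ T_est
    trans_err = np.linalg.norm(T_err[:3, 3])
    # Geodesic rotation error from the trace of the rotation part.
    cos_angle = np.clip((np.trace(T_err[:3, :3]) - 1.0) / 2.0, -1.0, 1.0)
    rot_err_deg = np.degrees(np.arccos(cos_angle))
    return trans_err < trans_tol and rot_err_deg < rot_tol_deg
```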
“…Very often, MoCap is used as precise ground truth to determine the position and orientation of the floating base of a humanoid robot. For example, MoCap has been used to evaluate different state estimation approaches based on proprioceptive sensors [5], LiDAR and kinematic-inertial data fusion [6], as well as LiDAR fused with visual-inertial odometry [7]. Less frequently, external motion capture systems have been used to provide state feedback to the control loops of legged robots.…”
Section: Introduction (mentioning)
confidence: 99%