2022
DOI: 10.1109/lra.2022.3177846
RO-LOAM: 3D Reference Object-based Trajectory and Map Optimization in LiDAR Odometry and Mapping

Cited by 5 publications (3 citation statements)
References 25 publications
“…Nonetheless, the method is not restricted to Manhattan-world environments with enclosed rooms, like in [22], nor does it require manual input of the robot's initial position in the map like the one proposed in [19].…”
Section: Discussion
confidence: 99%
“…[18] propose Reference-LOAM (R-LOAM), a method that leverages a joint optimization incorporating point and mesh features for 6 degrees of freedom (DoF) Unmanned Aerial Vehicle (UAV) localization. Subsequently, in [19], they improved their method with pose-graph optimization to reduce drift even when the reference object is not visible.…”
Section: BIM-based 3D LiDAR Localization and Mapping
confidence: 99%
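The mesh features mentioned in the statement above are, at their core, distances from LiDAR points to triangles of the reference object's mesh. A minimal, hypothetical sketch of such a residual (not the actual R-LOAM implementation; it assumes the closest mesh point lies in the triangle's interior, whereas a full implementation would also handle edges and vertices):

```python
# Illustrative mesh-feature residual: unsigned distance from a LiDAR
# point to the supporting plane of a mesh triangle. All names are
# hypothetical, for explanation only.
import math

def _sub(a, b):
    return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

def _cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def point_to_triangle_plane(p, v0, v1, v2):
    """Unsigned distance from point p to the plane of triangle (v0, v1, v2).

    Assumes the closest point on the mesh lies inside this triangle.
    """
    n = _cross(_sub(v1, v0), _sub(v2, v0))       # plane normal (unnormalized)
    num = abs(sum(ni * di for ni, di in zip(n, _sub(p, v0))))
    return num / math.sqrt(sum(ni * ni for ni in n))
```

In a joint optimization, residuals of this form would be minimized over the robot pose together with the usual point-feature residuals.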
“…It leverages planar features from ground points and edge features from segmented points to incrementally determine a 6 degree-of-freedom (DOF) transformation. R-LOAM [101] and RO-LOAM [102] optimize the robot's trajectory by incorporating mesh features derived from the 3D triangular mesh of a reference object with a known global coordinate location. Plane features, prevalent in everyday environments, have garnered significant attention as they can be easily extracted from the LiDAR point cloud.…”
Section: Feature-based Matching
confidence: 99%
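The edge features described above are typically scored by the distance from a point to the line through two corresponding edge points from the previous scan. A minimal sketch of that point-to-line residual, with hypothetical names (a generic LOAM-style formula, not code from any of the cited systems):

```python
# Illustrative edge-feature residual: distance from LiDAR point p to
# the infinite line through edge points a and b. Names are hypothetical.
import math

def point_to_edge_distance(p, a, b):
    """Distance from point p to the line through a and b (all 3D tuples)."""
    ab = tuple(bi - ai for ai, bi in zip(a, b))      # line direction
    ap = tuple(pi - ai for ai, pi in zip(a, p))      # point offset
    cx = (ab[1] * ap[2] - ab[2] * ap[1],             # |ab x ap| gives the
          ab[2] * ap[0] - ab[0] * ap[2],             # parallelogram area;
          ab[0] * ap[1] - ab[1] * ap[0])             # divide by |ab|
    return (math.sqrt(sum(c * c for c in cx))
            / math.sqrt(sum(v * v for v in ab)))
```

Planar features use the analogous point-to-plane distance; both residual types are stacked and minimized to recover the 6-DOF transformation.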