2018
DOI: 10.3390/s18061948

Scale Estimation and Correction of the Monocular Simultaneous Localization and Mapping (SLAM) Based on Fusion of 1D Laser Range Finder and Vision Data

Abstract: This article presents a new sensor fusion method for visual simultaneous localization and mapping (SLAM) through the integration of a monocular camera and a 1D laser range finder. Such a fusion method provides scale estimation and drift correction; it is not subject to volume constraints the way a stereo camera is constrained by its baseline, and it overcomes the limited depth range associated with RGB-D SLAM. We first present the analytical feasibility for estimating the absolute scale through the f…
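
The abstract's central idea, recovering the metric scale of an up-to-scale monocular map from 1D range measurements, reduces to a ratio of metric to monocular depths along rays matched between the laser and the SLAM map. The sketch below is a minimal illustration under that assumption; the function and sample values are hypothetical rather than the authors' implementation, which must additionally associate each laser ray with map geometry and correct drift over time.

```python
import numpy as np

def estimate_scale(d_laser: np.ndarray, d_slam: np.ndarray) -> float:
    """Least-squares scale s minimizing ||d_laser - s * d_slam||^2.

    d_laser: metric distances from the 1D laser range finder (meters).
    d_slam:  up-to-scale depths from monocular SLAM along the same rays.
    """
    return float(np.dot(d_slam, d_laser) / np.dot(d_slam, d_slam))

# Hypothetical example: three laser readings paired with SLAM depths.
d_laser = np.array([2.10, 3.55, 1.42])
d_slam = np.array([0.70, 1.18, 0.48])
s = estimate_scale(d_laser, d_slam)
print(f"estimated scale: {s:.3f}")  # metric depth = s * d_slam
```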

Cited by 43 publications (31 citation statements) · References 41 publications (44 reference statements)
“…Zhang et al [70] proposed a monocular SLAM associated with a 1D laser range finder. As monocular SLAM often suffers from scale drift, this solution gave an efficient drift correction for a very low hardware cost.…”
Section: Improved Visual SLAM
confidence: 99%
“…In recent years, building on work in LiDAR SLAM and visual SLAM, some researchers have started to investigate the integration of these two main sensors [21][22][23][24][25]. In [21], the authors applied visual odometry to provide initial values for two-dimensional laser Iterative Closest Point (ICP) on a small UAV, and achieved good results in both runtime and accuracy.…”
Section: Multi-sensor Fusion
confidence: 99%
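
The initialization pattern described in [21], using a visual-odometry pose to seed laser-scan ICP, can be sketched as follows. This is a generic point-to-point 2D ICP under assumed inputs, not the cited implementation, and the names are illustrative. A good initial transform from visual odometry keeps the nearest-neighbour association from settling into a wrong local minimum, which an identity initialization often does under large inter-scan motion.

```python
import numpy as np

def icp_2d(src: np.ndarray, dst: np.ndarray, T_init: np.ndarray,
           iters: int = 20) -> np.ndarray:
    """Align 2D scan src (N,2) to dst (M,2); T_init is a 3x3 homogeneous pose."""
    T = T_init.copy()
    for _ in range(iters):
        # Transform source points with the current pose estimate.
        p = src @ T[:2, :2].T + T[:2, 2]
        # Brute-force nearest-neighbour data association.
        idx = np.argmin(((p[:, None, :] - dst[None, :, :]) ** 2).sum(-1), axis=1)
        q = dst[idx]
        # Closed-form rigid alignment of the matched pairs (Kabsch / SVD).
        pc, qc = p - p.mean(0), q - q.mean(0)
        U, _, Vt = np.linalg.svd(pc.T @ qc)
        d = np.sign(np.linalg.det(Vt.T @ U.T))
        R = Vt.T @ np.diag([1.0, d]) @ U.T
        t = q.mean(0) - R @ p.mean(0)
        dT = np.eye(3)
        dT[:2, :2], dT[:2, 2] = R, t
        T = dT @ T  # compose the incremental correction
    return T

# T_init would come from visual odometry; np.eye(3) (no prior) risks a
# wrong local minimum when the motion between scans is large.
```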
“…[24] presents a localization method based on cooperation between aerial and ground robots in an indoor environment; a 2.5D elevation map is built from an RGB-D sensor and a 2D LiDAR mounted on the UAV. [25] provides a scale estimation and drift correction method for monocular SLAM by combining a 1D laser range finder with a camera. In [26], a visual SLAM system using the direct method is proposed that combines images acquired from a camera with sparse depth information obtained from a 3D LiDAR.…”
Section: Multi-sensor Fusion
confidence: 99%
“…When it comes to monocular SLAM, the absolute scale of the system is ambiguous [31]. Monocular SLAM can be fused with other sensors, including GNSS, IMU, and a 1D laser range finder, to resolve the scale ambiguity [32]-[34]. ORB-SLAM2 and VINS-Mono are representative monocular SLAM algorithms [35], [36].…”
Section: Introduction
confidence: 99%