2019 International Conference on Robotics and Automation (ICRA)
DOI: 10.1109/icra.2019.8794023
Real Time Dense Depth Estimation by Fusing Stereo with Sparse Depth Measurements

Abstract: We present an approach to depth estimation that fuses information from a stereo pair with sparse range measurements derived from a LIDAR sensor or a range camera. The goal of this work is to exploit the complementary strengths of the two sensor modalities: the accurate but sparse range measurements and the ambiguous but dense stereo information. These two sources are effectively and efficiently fused by combining ideas from anisotropic diffusion and semi-global matching. We evaluate our approach on the KITTI 20…
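The abstract describes seeding a dense estimate with sparse range measurements and spreading them by diffusion. As a loose illustration of that half of the idea only, here is a minimal sketch using plain isotropic diffusion with the measurements clamped each iteration; the function name, parameters, wrap-around boundary handling, and the isotropic simplification are all assumptions for illustration, not the paper's actual anisotropic scheme:

```python
import numpy as np

def diffuse_sparse_depth(sparse, mask, iters=200, lam=0.2):
    """Propagate sparse depth seeds into a dense map by isotropic
    diffusion, re-clamping the known measurements each iteration.
    sparse: (H, W) depth values; mask: (H, W) bool, True at measurements."""
    # Initialise unknown pixels with the mean of the measurements.
    d = np.where(mask, sparse, sparse[mask].mean())
    for _ in range(iters):
        # 4-neighbour Laplacian via shifted copies (edges wrap around).
        lap = (np.roll(d, 1, 0) + np.roll(d, -1, 0)
               + np.roll(d, 1, 1) + np.roll(d, -1, 1) - 4 * d)
        d = d + lam * lap
        d[mask] = sparse[mask]  # keep measurements fixed
    return d
```

With `lam <= 0.25` each update is a convex combination of a pixel and its neighbours, so the result stays within the range of the measurements.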

Cited by 15 publications (11 citation statements) · References 22 publications
“…Thanks to the complementary characteristics across different sensors, several works [1], [2] have studied how to fuse multiple modalities in order to provide more accurate and denser depth estimation. In this paper, we consider the fusion of a passive stereo camera and an active 3D LiDAR sensor, which is a practical and popular choice.…”
Section: Introduction (mentioning)
confidence: 99%
“…A Gaussian enhancement function was used to optimize the cost volume at each LiDAR projection point, which improved the accuracy of dense matching (Poggi et al., 2019). Furthermore, expanding the LiDAR projection points by predicting the disparities of the neighboring pixels around them yielded much improved dense matching results (Shivakumar et al., 2019); however, the matching accuracy relies on the accuracy of the predicted disparities.…”
Section: Dense Image Matching Constrained by LiDAR Data (mentioning)
confidence: 99%
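The Gaussian cost-volume enhancement mentioned in the excerpt above can be sketched as follows. All names and parameter values here are illustrative assumptions, and the actual formulation in Poggi et al. (2019) differs in detail; the sketch only shows the core idea of making disparity candidates near a LiDAR measurement cheaper than the rest:

```python
import numpy as np

def modulate_cost_volume(cost, lidar_disp, k=10.0, width=2.0):
    """Raise matching costs away from each LiDAR-measured disparity,
    so the winner-takes-all minimum is pulled toward the measurement.
    cost: (H, W, D) cost volume; lidar_disp: (H, W), NaN where no point."""
    d = np.arange(cost.shape[2], dtype=float)
    out = cost.copy()
    for y, x in zip(*np.where(~np.isnan(lidar_disp))):
        # Factor is 1 at the measured disparity, growing to 1 + k away from it.
        g = 1.0 + k * (1.0 - np.exp(-(d - lidar_disp[y, x]) ** 2
                                    / (2.0 * width ** 2)))
        out[y, x, :] *= g
    return out
```

Pixels without a LiDAR point keep their original costs, so the modulation acts only where a measurement provides guidance.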
“…Expanding the number of LiDAR projection points by predicting the disparities of the neighboring pixels around the LiDAR projection points can overcome the above problems (Shivakumar et al., 2019). However, this approach also brings a new challenge: the dense matching results rely on the accuracy of the predicted disparities.…”
Section: Introduction (mentioning)
confidence: 99%
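The expansion step described in the excerpt above can be illustrated crudely. The cited work predicts the neighbours' disparities; as a stand-in, this sketch simply copies each measured value to its local neighbourhood (the function name, radius, and nearest-neighbour copy are my assumptions, not the cited method):

```python
import numpy as np

def expand_lidar_points(disp, mask, radius=1):
    """Copy each LiDAR disparity to its (2*radius+1)^2 neighbourhood,
    a crude stand-in for predicting neighbouring pixels' disparities.
    disp: (H, W) disparities; mask: (H, W) bool, True at LiDAR points."""
    H, W = disp.shape
    out = np.full_like(disp, np.nan, dtype=float)
    for y, x in zip(*np.where(mask)):
        y0, y1 = max(0, y - radius), min(H, y + radius + 1)
        x0, x1 = max(0, x - radius), min(W, x + radius + 1)
        out[y0:y1, x0:x1] = disp[y, x]
    return out
```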
“…For example, measurements from the IMU can be aligned with image feedback to recover metric information, as in Qin et al. (2018). The vision system can also be combined with LiDAR sensors, where sparse laser points help to achieve dense perception (Shivakumar et al., 2019).…”
Section: Multiple Modules Integration (mentioning)
confidence: 99%