2019
DOI: 10.1016/j.inffus.2018.11.006
Stereo and ToF data fusion by learning from synthetic data

Abstract: Time-of-Flight (ToF) sensors and stereo vision systems are both capable of acquiring depth information but they have complementary characteristics and issues. A more accurate representation of the scene geometry can be obtained by fusing the two depth sources. In this paper we present a novel framework for data fusion where the contribution of the two depth sources is controlled by confidence measures that are jointly estimated using a Convolutional Neural Network. The two depth sources are fused enforcing the…
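For intuition only, the confidence-controlled blending of two depth sources described in the abstract can be sketched as a pixel-wise weighted average. This is a minimal NumPy sketch with hypothetical names, not the paper's method: the actual framework estimates the confidence maps jointly with a CNN and applies further constraints during fusion.

```python
import numpy as np

def fuse_depth(depth_tof, depth_stereo, conf_tof, conf_stereo, eps=1e-6):
    """Blend two depth maps pixel-wise, weighting each source by its
    confidence map. Hypothetical helper for illustration only."""
    weight_sum = conf_tof + conf_stereo + eps  # eps avoids division by zero
    return (conf_tof * depth_tof + conf_stereo * depth_stereo) / weight_sum

# Toy example: ToF reports 1.0 m everywhere, stereo reports 3.0 m;
# equal confidences yield (approximately) the midpoint, 2.0 m.
tof = np.full((2, 2), 1.0)
stereo = np.full((2, 2), 3.0)
fused = fuse_depth(tof, stereo, np.full((2, 2), 0.5), np.full((2, 2), 0.5))
```

Where one sensor's confidence dominates (e.g. stereo failing on textureless regions, ToF failing on low-reflectivity surfaces), the fused value falls back toward the more reliable source at that pixel.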

Cited by 11 publications (15 citation statements). References: 39 publications.
“…ToF and RGB Fusion. Existing data-driven approaches [1,2,36] heavily rely on synthetic data, creating a domain gap. This is exacerbated when using imperfect low-power sensors such as those on mobile phones.…”
Section: Related Work
Mentioning confidence: 99%
“…In addition, current stereo-ToF fusion [14,10,15] typically estimates disparity from stereo and ToF separately before fusion. One approach is to estimate stereo and ToF confidence to merge the disparity maps [33,1,2,37]. In contrast, our ToF estimates are directly incorporated into our disparity pipeline before depth selection.…”
Section: Related Work
Mentioning confidence: 99%
“…Light scattering errors [5], multipath errors [6], and object boundary ambiguity [7] have been identified as non-systematic errors. Compression may also result in distortion [8]. Figure 1 represents the problems in depth maps captured using ToF depth cameras.…”
Section: Introduction
Mentioning confidence: 99%
“…systematic errors. Compression may also result in distortion [8]. Figure 1 represents the problems in depth maps captured using ToF depth cameras.…”
Section: Introduction
Mentioning confidence: 99%