2017
DOI: 10.1145/3072959.3073686

Epipolar time-of-flight imaging

Abstract: Consumer time-of-flight depth cameras like Kinect and PMD are cheap, compact and produce video-rate depth maps in short-range applications. In this paper we apply energy-efficient epipolar imaging to the ToF domain to significantly expand the versatility of these sensors: we demonstrate live 3D imaging at over 15 m range outdoors in bright sunlight; robustness to global transport effects such as specular and diffuse inter-reflections---the first live demonstration for this ToF technology; interference-free 3D …
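The paper builds on correlation-based ToF sensors (PMD-style). As background only, here is a minimal sketch of how a generic four-bucket correlation ToF sensor maps its phase samples to depth; the 20 MHz modulation frequency, the ideal cosine sampling model, and all numbers are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch (illustrative, not the paper's pipeline): depth from the four
# phase samples of a generic correlation time-of-flight (C-ToF) sensor.
import numpy as np

C = 3.0e8       # speed of light (m/s)
F_MOD = 20e6    # assumed modulation frequency (Hz); unambiguous range = C / (2 * F_MOD) = 7.5 m

def ctof_depth(q0, q1, q2, q3):
    """Depth from correlation samples taken at 0, 90, 180 and 270 degree offsets."""
    phase = np.arctan2(q1 - q3, q0 - q2)     # phase shift of the returned modulation
    phase = np.mod(phase, 2 * np.pi)         # wrap into [0, 2*pi)
    return C * phase / (4 * np.pi * F_MOD)   # phase -> round-trip delay -> distance

# Example: a synthetic target 5 m away, ideal noise-free samples.
true_depth = 5.0
true_phase = 4 * np.pi * F_MOD * true_depth / C
q = [np.cos(true_phase - k * np.pi / 2) for k in range(4)]
print(ctof_depth(*q))   # ~5.0 m
```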

Cited by 81 publications (40 citation statements)
References 15 publications (2 reference statements)
“…The theoretical criteria derived here can be used in conjunction with these hardware architectures for optimal LiDAR design. Active 3D imaging in sunlight: Prior work in the structured light and time-of-flight literature proposes various coding and illumination schemes to address the problem of low signal-to-noise ratios (SNR) due to strong ambient light [19,12,23,1]. The present work deals with a different problem of optimal photon detection for SPAD-based pulsed time-of-flight.…”
Section: Related Work (mentioning; confidence: 99%)
“…Synchronization with camera exposure has allowed for reconstruction in the face of strong ambient light [1]. Transient imaging is possible using ultra-fast lasers [42], and has recently been demonstrated using mobile off-the-shelf devices [14].…”
Section: Related Work (mentioning; confidence: 99%)
“…Recent work has addressed some aspects of TOF energy efficiency with novel illumination encodings. For example, by synchronizing illumination patterns to match sensor exposures [1], low-power reconstruction can occur for scenes with significant ambient light. Additionally, spatio-temporal encodings have been shown to be efficient for both structured light illumination [30] and TOF illumination as well [29].…”
Mentioning; confidence: 99%
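The statement above cites the paper for synchronizing illumination with sensor exposure to cope with ambient light. A rough back-of-the-envelope sketch of why row-synchronized (epipolar) exposure helps is given below; the photon rates, row count, and frame time are made-up assumptions used purely to illustrate the shot-noise argument, not figures from the paper.

```python
# Back-of-the-envelope sketch (assumed numbers): concentrating the light source on
# one row at a time, with the sensor exposing only that row, keeps the signal energy
# per pixel fixed while cutting the collected ambient light by the number of rows.
import math

N_ROWS   = 240        # rows scanned per frame (assumed)
T_FRAME  = 1 / 30     # frame time in seconds (assumed)
SIG_RATE = 2.0e5      # signal photons/s/pixel with FULL laser power on one row (assumed)
AMB_RATE = 2.0e6      # ambient photons/s/pixel in bright sunlight (assumed)

def snr(signal, ambient):
    """Shot-noise-limited SNR: signal over sqrt(total collected photons)."""
    return signal / math.sqrt(signal + ambient)

# Conventional ToF: laser power spread over all rows, every pixel exposed all frame.
sig_full = (SIG_RATE / N_ROWS) * T_FRAME
amb_full = AMB_RATE * T_FRAME

# Epipolar ToF: full laser power on one row, each pixel exposed only T_FRAME / N_ROWS.
sig_epi = SIG_RATE * (T_FRAME / N_ROWS)   # same signal energy per pixel
amb_epi = AMB_RATE * (T_FRAME / N_ROWS)   # ambient reduced by a factor of N_ROWS

print(snr(sig_full, amb_full), snr(sig_epi, amb_epi))
# When ambient light dominates, the epipolar case wins by roughly sqrt(N_ROWS).
```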
“…Active depth cameras, such as scanning lidar systems, have not only become a cornerstone imaging modality for autonomous driving and robotics, but are emerging in applications across disciplines, including autonomous drones, remote sensing, human-computer interaction, and augmented or virtual reality. Depth cameras that provide dense range allow for dense scene reconstructions [26] when combined with color cameras, including correlation time-of-flight cameras (C-ToF) [19,30,33] such as Microsoft's Kinect One, or structured light cameras [1,42,43,49]. These acquisition systems facilitate the collection of large-scale RGB-D data sets that fuel research on core computer vision problems, including scene understanding [23,53] and action recognition [40].…”
Section: Introduction (mentioning; confidence: 99%)