2022
DOI: 10.1145/3517260
InDepth: Real-Time Depth Inpainting for Mobile Augmented Reality

Abstract: Mobile Augmented Reality (AR) demands realistic rendering of virtual content that seamlessly blends into the physical environment. For this reason, AR headsets and recent smartphones are increasingly equipped with Time-of-Flight (ToF) cameras to acquire depth maps of a scene in real time. ToF cameras are cheap and fast; however, they suffer from several issues that affect the quality of depth data, ultimately hampering their use for mobile AR. Among them are scale errors of virtual objects, appearing much bigger…

Cited by 13 publications (2 citation statements). References 66 publications (129 reference statements).
“…In our evaluation of depth estimates obtained by a Microsoft HoloLens 2 (in the long throw mode) across a range of representative indoor environments, we found that on average 30% of depth pixels in a frame were missing [220]. We also collected an indoor dataset of 18.6K depth maps on a Samsung Galaxy Note 10+ smartphone, of which 58% had greater than 40% missing pixels [220].…”
Section: Collaborative Depth Mapping
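The statistics quoted above (e.g. "on average 30% of depth pixels in a frame were missing") rest on counting invalid pixels per depth map. A minimal sketch of that computation, assuming a numpy array in which the sensor encodes missing returns as 0 or NaN (the function name and encoding are illustrative, not from the cited work):

```python
import numpy as np

def missing_fraction(depth, invalid_value=0.0):
    """Fraction of pixels in a depth map with no valid reading.

    Assumes missing returns are encoded as `invalid_value` or NaN,
    which is a common convention in ToF depth pipelines.
    """
    depth = np.asarray(depth, dtype=np.float32)
    invalid = np.isnan(depth) | (depth == invalid_value)
    return float(invalid.mean())

# Synthetic 4x4 depth map (meters) with 4 missing pixels -> 25% missing
d = np.array([[1.2, 0.0, 2.1, 1.8],
              [0.0, 1.5, 1.6, 0.0],
              [2.0, 1.9, 1.7, 1.4],
              [1.3, 0.0, 1.1, 1.0]])
print(missing_fraction(d))  # 0.25
```

Applied per frame over a dataset, this yields both the average missing fraction and the share of maps exceeding a threshold such as 40%.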
“…Even some challenging reflections may be avoided by sensing from different viewpoints. We envision this collaborative sensing approach being combined with existing techniques for depth map completion such as [126,160,219,220], with the more complete depth data from multiple sensors combined on the edge server for a less challenging depth inpainting task.…”
Section: Collaborative Depth Mapping
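The collaborative completion idea described in this statement can be sketched as a merge step run before inpainting. The sketch below is a hypothetical simplification, not the cited method: it assumes all depth maps have already been warped into one common camera frame, and simply keeps the closest valid reading per pixel, leaving pixels missing in every map for a downstream inpainting stage.

```python
import numpy as np

def merge_depth_maps(maps, invalid_value=0.0):
    """Merge co-registered depth maps from multiple viewpoints.

    Assumes each map indexes the same 3D points at the same pixels
    (i.e. registration has already happened). Where several sensors
    report valid depth, the nearest (smallest) reading wins; pixels
    invalid in all maps remain `invalid_value` for later inpainting.
    """
    stack = np.stack([np.asarray(m, dtype=np.float32) for m in maps])
    stack[stack == invalid_value] = np.nan   # mark holes
    with np.errstate(all="ignore"):
        merged = np.nanmin(stack, axis=0)    # closest valid reading
    return np.nan_to_num(merged, nan=invalid_value)

# One sensor's hole is filled by the other; the shared hole survives.
a = np.array([[1.0, 0.0], [0.0, 0.0]])
b = np.array([[2.0, 3.0], [0.0, 1.5]])
print(merge_depth_maps([a, b]))  # [[1.  3. ] [0.  1.5]]
```

Taking the per-pixel minimum is one plausible conflict policy (it favors the nearest surface); a real system might instead weight readings by sensor confidence.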