2014
DOI: 10.1016/j.jvcir.2013.02.011
Calibrated depth and color cameras for accurate 3D interaction in a stereoscopic augmented reality environment

Cited by 48 publications (26 citation statements)
References 22 publications
“…Calibration of AR devices has been dealt with by Canessa et al (2014) for the case of a color camera, Liu et al (2016) for an AR guiding system, Kellner et al (2012) for an HMD, and Eck et al (2015), Itoh et al (2015) and Moser et al (2015) for OSTHDMs. Ergonomic issues were instead examined by Schega et al (2014), who evaluated the effect of different HMDs on visual performance of the users, and Tuma et al (2016) who used AR to evaluate the ergonomic state of a workplace.…”
Section: Technical Papers
confidence: 99%
“…Based on this initial undistortion map, the authors calibrate the intrinsics and extrinsic parameters of the depth-RGB pair by means of a checkerboard. Similarly, Canessa et al [13] propose a method for calibrating a depth and RGB pair by using a checkerboard with the RGB and IR camera of the sensor. To this end they disable the IR projector and use a light source that is visible in both images.…”
Section: Related Work
confidence: 99%
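The statement above describes recovering the extrinsic parameters (rotation and translation) between the depth/IR camera and the RGB camera from checkerboard correspondences. As an illustration of the underlying geometry only, and not the authors' implementation, the rigid transform between the two camera frames can be estimated in closed form from matched 3D points via the Kabsch/Procrustes SVD method; the point set, rotation, and baseline below are synthetic:

```python
import numpy as np

def estimate_rigid_transform(pts_ir, pts_rgb):
    """Least-squares rotation R and translation t mapping points from the
    IR-camera frame to the RGB-camera frame (Kabsch/Procrustes via SVD)."""
    c_ir = pts_ir.mean(axis=0)
    c_rgb = pts_rgb.mean(axis=0)
    # Cross-covariance of the centered point sets
    H = (pts_ir - c_ir).T @ (pts_rgb - c_rgb)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection (det = -1) in the recovered rotation
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = c_rgb - R @ c_ir
    return R, t

# Synthetic check: "checkerboard corner" points in the IR frame, mapped into
# a hypothetical RGB frame by a known 5-degree rotation and ~2.5 cm baseline.
rng = np.random.default_rng(0)
pts_ir = rng.uniform(-0.2, 0.2, size=(35, 3)) + np.array([0.0, 0.0, 1.0])
a = np.deg2rad(5.0)
R_true = np.array([[np.cos(a), 0.0, np.sin(a)],
                   [0.0,       1.0, 0.0      ],
                   [-np.sin(a), 0.0, np.cos(a)]])
t_true = np.array([0.025, 0.0, 0.0])
pts_rgb = pts_ir @ R_true.T + t_true

R_est, t_est = estimate_rigid_transform(pts_ir, pts_rgb)
print(np.allclose(R_est, R_true, atol=1e-8), np.allclose(t_est, t_true, atol=1e-8))
```

In practice the 3D correspondences would come from checkerboard corners detected in both images and back-projected with each camera's calibrated intrinsics; the closed-form estimate is then typically refined by minimizing reprojection error.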
“…The stereo image pairs are acquired at a frame rate of ≈14 fps, and the vergence command is applied to the azimuth motors at the same frequency. The depth of the stimulus with respect to the stereo head is measured with a Microsoft Kinect sensor device, precisely calibrated for the task at hand [49]. The actual depth of the fixation point (red solid line) with respect to the ground truth of the stimulus (black dotted line), is estimated by the position of the azimuth axes provided by the magnetic encoders of the motors, returned by the robot head.…”
Section: Learning on Real Stereo Image Pairs
confidence: 99%
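The statement above estimates the fixation depth of a verging stereo head from the azimuth (vergence) angles reported by the motor encoders. As a sketch of the standard geometric relation only (the paper's exact computation is not given here), for a fixation point symmetric between the two cameras the depth follows from the baseline and the vergence angle as Z = (b/2) / tan(theta/2); the 6.5 cm baseline below is an assumed example value:

```python
import math

def depth_from_vergence(baseline_m, vergence_rad):
    """Depth of a symmetric fixation point, given the camera baseline and the
    vergence angle between the two optical axes: Z = (b/2) / tan(theta/2)."""
    return (baseline_m / 2.0) / math.tan(vergence_rad / 2.0)

# A target at 1 m with a 6.5 cm baseline subtends a vergence angle of
# 2 * atan(0.0325 / 1.0); inverting that angle recovers the 1 m depth.
theta = 2.0 * math.atan(0.0325 / 1.0)
print(round(depth_from_vergence(0.065, theta), 6))  # 1.0
```

Comparing this encoder-derived depth against the Kinect's calibrated depth measurement is what allows the fixation error to be evaluated against ground truth.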