Proceedings of the Ninth Biennial ACM Symposium on Eye Tracking Research & Applications 2016
DOI: 10.1145/2857491.2857530
3D gaze estimation from 2D pupil positions on monocular head-mounted eye trackers

Abstract: 3D gaze information is important for scene-centric attention analysis, but accurate estimation and analysis of 3D gaze in real-world environments remains challenging. We present a novel 3D gaze estimation method for monocular head-mounted eye trackers. In contrast to previous work, our method does not aim to infer 3D eyeball poses, but directly maps 2D pupil positions to 3D gaze directions in scene camera coordinate space. We first provide a detailed discussion of the 3D gaze estimation task and summar…
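The direct mapping the abstract describes — from 2D pupil positions to 3D gaze directions in scene-camera coordinates — can be sketched as a calibrated regression. This is a minimal illustration, assuming a least-squares polynomial mapping; the authors' exact formulation may differ.

```python
# Sketch: map 2D pupil positions directly to 3D gaze directions via a
# polynomial least-squares fit (an assumption; not necessarily the
# paper's exact model).
import numpy as np

def polynomial_features(pupil_xy, degree=2):
    """Expand (x, y) pupil positions into polynomial features up to `degree`."""
    x, y = pupil_xy[:, 0], pupil_xy[:, 1]
    feats = [np.ones_like(x)]
    for d in range(1, degree + 1):
        for i in range(d + 1):
            feats.append(x ** (d - i) * y ** i)
    return np.stack(feats, axis=1)

def fit_gaze_mapping(pupil_xy, gaze_dirs, degree=2):
    """Fit a least-squares map from pupil features to 3D gaze vectors
    expressed in scene-camera coordinates (calibration step)."""
    X = polynomial_features(pupil_xy, degree)
    W, *_ = np.linalg.lstsq(X, gaze_dirs, rcond=None)  # (n_features, 3)
    return W

def predict_gaze(W, pupil_xy, degree=2):
    """Map new pupil positions to unit 3D gaze directions."""
    g = polynomial_features(pupil_xy, degree) @ W
    return g / np.linalg.norm(g, axis=1, keepdims=True)
```

Calibration data (pupil positions paired with known 3D gaze targets) would come from a user-facing calibration procedure; the degree-2 expansion is an illustrative default.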

Cited by 44 publications (29 citation statements)
References 12 publications
“…Mapping gaze onto a 3D target (e.g. a sculpture) is a more computationally demanding challenge, but could be achieved by combining 3D reconstruction algorithms (Moons, Van Gool, and Vergauwen 2010; Lepetit and Fua 2005) with methods for estimating participants' gaze depth (Mansouryar et al. 2016; Lee et al. 2012). Furthermore, ideal targets should contain non-symmetric, unique elements with relatively high spatial-frequency information, in order to ensure a sufficient set of keypoints can be identified on the reference image.…”
Section: Dynamic Gaze Mapping
confidence: 99%
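The gaze-depth estimation mentioned in the quote above is often done by triangulating two gaze rays: the 3D point of regard is taken as the midpoint of the shortest segment between them. The sketch below shows that geometric step only; it is an illustrative assumption, not the cited papers' specific pipelines.

```python
# Sketch: estimate a 3D point of regard (and hence gaze depth) as the
# closest point between two gaze rays — one common triangulation approach,
# shown here as an illustration rather than any cited method.
import numpy as np

def closest_point_between_rays(o1, d1, o2, d2):
    """Midpoint of the shortest segment between rays o1 + t1*d1 and o2 + t2*d2.

    Returns None when the rays are near-parallel (depth ill-conditioned).
    """
    d1 = np.asarray(d1, float) / np.linalg.norm(d1)
    d2 = np.asarray(d2, float) / np.linalg.norm(d2)
    w = np.asarray(o1, float) - np.asarray(o2, float)
    b = np.dot(d1, d2)
    denom = 1.0 - b * b
    if denom < 1e-9:                       # rays almost parallel
        return None
    t1 = (b * np.dot(d2, w) - np.dot(d1, w)) / denom
    t2 = (np.dot(d2, w) - b * np.dot(d1, w)) / denom
    return 0.5 * ((o1 + t1 * d1) + (o2 + t2 * d2))
```

Intersecting the resulting 3D point with a reconstructed mesh of the target would then assign gaze to a surface location.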
“…Then gaze directions are represented as the eyeball pseudo-center and gaze vectors. Based on a simplified eye model of a perfect sphere whose optical axis coincides with the visual axis, back-projection methods [14], [23] estimate the eyeball center and gaze vectors in the coordinate system of the eye camera by back-projecting pupil image ellipses. Transforming from the eye-camera to the scene-camera coordinate system introduces the six parameters of a homogeneous transformation.…”
Section: A. Gaze Direction Estimation Using the Eyeball Pseudo-center
confidence: 99%
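The six-parameter transformation mentioned in the quote above is a rigid motion: three rotation parameters (here an axis-angle vector) plus three translation parameters. A minimal sketch of applying it to an eyeball center and a gaze vector, with illustrative parameter values:

```python
# Sketch: the 6-parameter rigid transform (rotation + translation) taking
# an eyeball center and gaze vector from eye-camera to scene-camera
# coordinates. Parameter values are illustrative, not calibrated.
import numpy as np

def rotation_from_axis_angle(rvec):
    """Rodrigues formula: axis-angle 3-vector -> 3x3 rotation matrix."""
    rvec = np.asarray(rvec, float)
    theta = np.linalg.norm(rvec)
    if theta < 1e-12:
        return np.eye(3)
    k = rvec / theta
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

def eye_to_scene(center_eye, gaze_eye, rvec, tvec):
    """Apply the rigid transform: points map as R p + t, directions as R d."""
    R = rotation_from_axis_angle(rvec)
    t = np.asarray(tvec, float)
    center_scene = R @ np.asarray(center_eye, float) + t
    gaze_scene = R @ np.asarray(gaze_eye, float)   # directions are not translated
    return center_scene, gaze_scene
```

Note the asymmetry the code encodes: the eyeball center (a point) receives the translation, while the gaze vector (a direction) is only rotated.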
“…To avoid confusion, we adopt the eyeball pseudo-center to represent the intersection of all gaze directions. Gaze directions are represented as the eyeball pseudo-center and gaze vectors [13], [14] or 2D mapping points on the imaginary mapping plane [10]. The existing regression-based methods have two limitations that affect the accuracy of 3D gaze estimation.…”
Section: Introduction
confidence: 99%
“…The literature on using eye and/or head gaze for attention estimation is vast and spans many years. Recently, standard methods to detect gaze direction commonly involve eye trackers [8] or determining head orientation [9], [10]. Information about both head orientation and eye gaze has been linked to a person's focus of attention in the past [11], [12].…”
Section: Behaviour Cues For Attention Estimation
confidence: 99%