Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications
DOI: 10.1145/3204493.3204592
Wearable eye tracker calibration at your fingertips

Cited by 8 publications (4 citation statements)
References 13 publications
“…The authors achieved an average accuracy of 2.9°. In one work [14], users’ hands and fingertips were used as calibration samples. Users only had to point with their finger to various locations in the scene.…”
Section: Introduction
confidence: 99%
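As a concrete illustration of the sample-collection step described in the statement above, the sketch below pairs fingertip positions detected in the scene camera with synchronised pupil positions from the eye camera. MediaPipe Hands is used only as an assumed stand-in for the fingertip detector; the cited work [14] has its own detection pipeline, and all names and values here are illustrative.

```python
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands

def fingertip_in_scene(frame_bgr, hands):
    """Return the index-fingertip position (pixels) in a scene-camera
    frame, or None if no hand is detected."""
    rgb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB)
    result = hands.process(rgb)
    if not result.multi_hand_landmarks:
        return None
    tip = result.multi_hand_landmarks[0].landmark[
        mp_hands.HandLandmark.INDEX_FINGER_TIP]
    h, w = frame_bgr.shape[:2]
    return (tip.x * w, tip.y * h)  # landmarks are normalised to [0, 1]

def collect_calibration_samples(scene_frames, pupil_positions):
    """Pair each detected fingertip (used as the calibration target)
    with the synchronised pupil position from the eye camera."""
    samples = []
    with mp_hands.Hands(static_image_mode=False, max_num_hands=1) as hands:
        for frame, pupil in zip(scene_frames, pupil_positions):
            tip = fingertip_in_scene(frame, hands)
            if tip is not None and pupil is not None:
                samples.append((pupil, tip))
    return samples
```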
“…The aim of this section of the indoor session was to collect data from the three cameras while participants tracked screen markers on a computer screen (see Figure 11 and Figure 12). This method is widely used to evaluate eye-tracking techniques [10, 11, 12, 13]. The participants were asked to sit normally in front of the screen and track the screen marker.…”
Section: Experimentation
confidence: 99%
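As a rough illustration of how such screen-marker sessions are typically scored, the sketch below converts gaze estimates and marker positions (both in screen pixels) into viewing rays and reports the mean angular error in degrees. The screen dimensions, viewing distance, and centred head pose are placeholder assumptions, not the setup of the study quoted above.

```python
import numpy as np

def mean_angular_error_deg(gaze_px, target_px,
                           screen_px=(1920, 1080), screen_mm=(530, 300),
                           viewing_dist_mm=600.0):
    """Mean angular error (degrees) between estimated gaze points and
    screen-marker targets, both given in screen pixel coordinates."""
    gaze_px = np.asarray(gaze_px, dtype=float)
    target_px = np.asarray(target_px, dtype=float)
    px2mm = np.array(screen_mm, dtype=float) / np.array(screen_px, dtype=float)
    centre = np.array(screen_px, dtype=float) / 2.0

    def rays(p_px):
        xy_mm = (p_px - centre) * px2mm        # screen point relative to the eye
        z = np.full((len(xy_mm), 1), viewing_dist_mm)
        v = np.hstack([xy_mm, z])
        return v / np.linalg.norm(v, axis=1, keepdims=True)

    g, t = rays(gaze_px), rays(target_px)
    cos = np.clip(np.sum(g * t, axis=1), -1.0, 1.0)
    return float(np.degrees(np.arccos(cos)).mean())

# Example: gaze estimates vs. three marker positions on the screen.
gaze = [[310, 172], [965, 185], [1612, 168]]
markers = [[320, 180], [960, 180], [1600, 180]]
print(mean_angular_error_deg(gaze, markers))
```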
“…The authors attained an average accuracy of 2.9°. In a separate study [12], users’ hands and fingertips served as calibration samples, with users simply pointing at various locations in the scene. Although the proposed approach demonstrated accuracy comparable to marker-based calibration techniques, it still necessitated users to direct their gaze to specific points in the scene, achieving an average accuracy of 2.68°.…”
Section: Introduction
confidence: 99%
“…Sugano treats mouse clicks on a computer screen as gaze points to train the mapping function between the eye features and PoR [14]. Similar to [14], the algorithm in [15] detects the user’s hand and fingertip, which indicate the user’s point of interest. This method can quickly collect calibration samples in different environments, and the proposed method achieves accuracy comparable to standard marker-based calibration.…”
Section: Introduction
confidence: 99%
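A minimal sketch of such a mapping function, assuming a second-order polynomial from pupil-centre coordinates (eye camera) to the point of regard (scene camera), fitted by least squares on the collected calibration samples. The cited methods [14, 15] may use different eye features or models, and all numbers below are illustrative.

```python
import numpy as np

def poly_features(pupil_xy):
    """Second-order polynomial expansion of pupil-centre coordinates
    (one common choice; the cited methods may use other feature sets)."""
    x, y = pupil_xy[:, 0], pupil_xy[:, 1]
    return np.stack([np.ones_like(x), x, y, x * y, x**2, y**2], axis=1)

def fit_mapping(pupil_xy, por_xy):
    """Least-squares fit from pupil features to points of regard, e.g.
    fingertip positions detected in the scene camera during calibration."""
    A = poly_features(np.asarray(pupil_xy, dtype=float))
    W, *_ = np.linalg.lstsq(A, np.asarray(por_xy, dtype=float), rcond=None)
    return W                      # shape (6, 2)

def predict_por(W, pupil_xy):
    """Map new pupil positions to estimated points of regard."""
    return poly_features(np.asarray(pupil_xy, dtype=float)) @ W

# Illustrative calibration samples: pupil centres (eye camera) paired
# with fingertip targets (scene camera).
pupil  = [[312, 240], [355, 238], [398, 241], [318, 280], [390, 279], [352, 262]]
target = [[120,  90], [640,  95], [1160, 92], [130, 620], [1150, 615], [640, 360]]
W = fit_mapping(pupil, target)
print(predict_por(W, [[352, 262]]))   # gaze estimate for a new pupil sample
```

Polynomial regression of this kind is a common baseline for head-mounted trackers with a single eye camera; it only determines where the calibration targets come from (fingertips, markers, or mouse clicks), not the mapping itself.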