2020
DOI: 10.3390/s20174956

Development and Calibration of an Eye-Tracking Fixation Identification Algorithm for Immersive Virtual Reality

Abstract: Fixation identification is an essential task in the extraction of relevant information from gaze patterns; various algorithms are used in the identification process. However, the thresholds used in the algorithms greatly affect their sensitivity. Moreover, the application of these algorithms to eye-tracking technologies integrated into head-mounted displays, where the subject's head position is unrestricted, is still an open issue. Therefore, the adaptation of eye-tracking algorithms and their thresholds to imm…


Cited by 40 publications (32 citation statements)
References 34 publications (82 reference statements)
“…Three methods from the literature for offline fixation detection are implemented. This includes I-VT using a velocity threshold similar to Reference [ 31 ], I-DT for VR as described by Llanes-Jurado et al [ 32 ] using a dispersion threshold, and I-AOI proposed by Salvucci and Goldberg [ 33 ] based on detected areas of interest. Our implementation of I-VT follows the description by Olsen [ 31 ].…”
Section: Augmented Reality Eye Tracking Toolkit
confidence: 99%
“…We calculate a velocity for each gaze point over a specified duration and categorize the points by comparing the velocities to a specified threshold. I-DT follows the implementation by Llanes-Jurado et al [ 32 ]. It computes the angular dispersion distance over a window of a specific size in terms of its duration.…”
Section: Augmented Reality Eye Tracking Toolkit
confidence: 99%
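The velocity-threshold (I-VT) step quoted above — a velocity computed for each gaze point and compared against a fixed threshold — can be sketched as below. This is a minimal illustration, not the cited toolkit's implementation: the function name, the unit-gaze-vector input format, and the 100 deg/s default threshold are assumptions.

```python
import numpy as np

def ivt_classify(gaze_dirs, timestamps, velocity_threshold=100.0):
    """Label gaze samples as fixation (True) or saccade (False) via a
    velocity threshold (I-VT): the angular velocity between successive
    unit gaze direction vectors is compared to a fixed deg/s threshold.
    The 100 deg/s default is illustrative, not taken from the paper."""
    gaze_dirs = np.asarray(gaze_dirs, dtype=float)
    timestamps = np.asarray(timestamps, dtype=float)
    # Angle between consecutive unit gaze vectors, in degrees.
    dots = np.clip(np.sum(gaze_dirs[:-1] * gaze_dirs[1:], axis=1), -1.0, 1.0)
    angles = np.degrees(np.arccos(dots))
    velocities = angles / np.diff(timestamps)          # deg/s
    is_fixation = np.empty(len(gaze_dirs), dtype=bool)
    is_fixation[1:] = velocities < velocity_threshold
    is_fixation[0] = is_fixation[1]                    # copy label to first sample
    return is_fixation
```

Working on direction vectors rather than screen coordinates matters in head-mounted displays, where gaze is naturally expressed as a 3D ray from the headset.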
“…ET data was processed in order to obtain a classification between fixations and saccades, using the dispersion threshold (DT) algorithm with 1° as a dispersion threshold and 0.25 s as a time window threshold [83]. A complete set of features was obtained from the classification between fixation and saccade.…”
Section: Data Processing
confidence: 99%
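The dispersion-threshold (I-DT) procedure with the quoted settings (1° dispersion, 0.25 s minimum window) can be sketched as follows. This is a generic I-DT sketch in the spirit of Salvucci and Goldberg, not the paper's exact code: the dispersion metric used here (yaw spread plus pitch spread over the window) is a common simplification and is an assumption.

```python
import numpy as np

def idt_classify(yaw_pitch_deg, timestamps, dispersion_deg=1.0, window_s=0.25):
    """Dispersion-threshold (I-DT) fixation detection sketch.
    yaw_pitch_deg: (N, 2) gaze angles in degrees; timestamps in seconds.
    The 1 deg / 0.25 s defaults follow the cited settings; the dispersion
    metric (yaw spread + pitch spread) is a common simplification."""
    pts = np.asarray(yaw_pitch_deg, dtype=float)
    ts = np.asarray(timestamps, dtype=float)
    n = len(pts)
    labels = np.zeros(n, dtype=bool)   # True = fixation
    start = 0
    while start < n:
        # Grow the window to cover at least the minimum duration.
        end = start
        while end < n and ts[end] - ts[start] < window_s:
            end += 1
        if end >= n:
            break
        window = pts[start:end + 1]
        disp = np.ptp(window[:, 0]) + np.ptp(window[:, 1])
        if disp <= dispersion_deg:
            # Extend the window while dispersion stays under threshold.
            while end + 1 < n:
                window = pts[start:end + 2]
                if np.ptp(window[:, 0]) + np.ptp(window[:, 1]) > dispersion_deg:
                    break
                end += 1
            labels[start:end + 1] = True   # whole window is one fixation
            start = end + 1
        else:
            start += 1                     # slide past a non-fixation sample
        # Samples never absorbed into a window remain labeled saccade.
    return labels
```

Because the window is defined by duration rather than sample count, the same thresholds transfer across eye trackers with different sampling rates.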
“…Eye-tracking (ET) features extracted using simple kinematic definitions and data points were classified into fixation and saccades using the algorithm presented in [71]. Moreover, the following signals were defined.…”
Section: Feature Extraction
confidence: 99%
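Once gaze samples carry a fixation/saccade labeling, simple kinematic features like those mentioned above can be read off the label runs. The feature set below (fixation count, mean fixation duration, fixation-time ratio) is a generic illustration; the cited work's exact feature definitions are not reproduced here.

```python
import numpy as np

def fixation_features(is_fixation, timestamps):
    """Illustrative eye-tracking features from a binary fixation labeling:
    number of fixations, mean fixation duration, and the fraction of
    recording time spent in fixation. Generic examples only."""
    is_fixation = np.asarray(is_fixation, dtype=bool)
    ts = np.asarray(timestamps, dtype=float)
    # Find contiguous runs of fixation samples via edges in the label signal.
    padded = np.concatenate(([False], is_fixation, [False]))
    edges = np.flatnonzero(np.diff(padded.astype(int)))
    starts, ends = edges[::2], edges[1::2] - 1   # inclusive sample indices
    durations = ts[ends] - ts[starts]
    total = ts[-1] - ts[0]
    return {
        "n_fixations": len(starts),
        "mean_fixation_duration_s": float(durations.mean()) if len(starts) else 0.0,
        "fixation_time_ratio": float(durations.sum() / total) if total > 0 else 0.0,
    }
```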