2019
DOI: 10.48550/arxiv.1903.06474
Preprint

A Ground-Truth Data Set and a Classification Algorithm for Eye Movements in 360-degree Videos

Ioannis Agtzidis, Mikhail Startsev, Michael Dorr

Abstract: The segmentation of a gaze trace into its constituent eye movements has been actively researched since the early days of eye tracking. As we move towards more naturalistic viewing conditions, the segmentation becomes even more challenging and convoluted as more complex patterns emerge. The definitions and the well-established methods that were developed for monitor-based eye tracking experiments are often not directly applicable to unrestrained set-ups such as eye tracking in wearable contexts or with head-mou…

Cited by 1 publication (2 citation statements)
References 31 publications
“…Together, fixations and saccades create the overall pattern of the gaze scan path (Kübler et al. 2014; Geisler et al. 2020). Detecting those events can be achieved through a probabilistic, velocity, or duration threshold-setting approach (Salvucci and Goldberg 2000; Tafaj et al. 2012; Santini et al. 2016; Agtzidis et al. 2019).…”
Section: Eye-tracking Events
Confidence: 99%
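The velocity-threshold idea referenced in this citation statement (I-VT, after Salvucci and Goldberg 2000) can be sketched in a few lines. This is a generic illustration, not the 360-degree-video algorithm of the cited paper; the function name `ivt_classify` and the 30 deg/s default threshold are illustrative assumptions, not values taken from any of the cited works.

```python
import numpy as np

def ivt_classify(x, y, t, velocity_threshold=30.0):
    """Label each gaze sample 'fixation' or 'saccade' with a simple
    velocity threshold (I-VT).

    x, y -- gaze coordinates in degrees of visual angle
    t    -- timestamps in seconds
    velocity_threshold -- deg/s; 30 is a common textbook default,
                          not a value from the cited papers
    """
    x, y, t = map(np.asarray, (x, y, t))
    # point-to-point angular velocity between consecutive samples
    v = np.hypot(np.diff(x), np.diff(y)) / np.diff(t)
    # assign each sample the velocity of the preceding interval;
    # the first sample reuses the first computed velocity
    v = np.concatenate([[v[0]], v])
    return np.where(v > velocity_threshold, "saccade", "fixation")
```

A duration threshold, as mentioned in the same statement, would typically be applied afterwards to discard fixation runs shorter than some minimum (e.g. tens of milliseconds).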
“…Salvucci and Goldberg 2000; Agtzidis et al. 2019). Table 8 depicts the duration (∆) and velocity thresholds used to determine eye-movement events, similar to the approach applied by Agtzidis et al. (2019) and Gao et al. (2021).…”
Confidence: 99%