2023
DOI: 10.1038/s41598-023-31536-5

Predicting choice behaviour in economic games using gaze data encoded as scanpath images

Abstract: Eye movement data has been extensively utilized by researchers interested in studying decision-making within the strategic setting of economic games. In this paper, we demonstrate that both deep learning and support vector machine classification methods are able to accurately identify participants’ decision strategies before they commit to action while playing games. Our approach focuses on creating scanpath images that best capture the dynamics of a participant’s gaze behaviour in a way that is meaningful for…
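The abstract (and the citing discussion quoted below) describes encoding gaze trajectories as scanpath images and classifying them with an ImageNet-pretrained VGG network. The following is a minimal sketch of that general idea, not the authors' code: the directory layout (scanpaths/train), the number of strategy classes, and all hyperparameters are illustrative assumptions; it simply fine-tunes a pretrained VGG-16 with PyTorch/torchvision.

    # Sketch only (not the paper's pipeline): classify scanpath images with an
    # ImageNet-pretrained VGG-16. Paths, class count, and settings are assumptions.
    import torch
    import torch.nn as nn
    from torchvision import models, transforms, datasets
    from torch.utils.data import DataLoader

    NUM_STRATEGIES = 2  # assumed number of decision-strategy classes

    # Standard ImageNet preprocessing so the pretrained weights remain applicable.
    preprocess = transforms.Compose([
        transforms.Resize((224, 224)),
        transforms.ToTensor(),
        transforms.Normalize(mean=[0.485, 0.456, 0.406],
                             std=[0.229, 0.224, 0.225]),
    ])

    # Scanpath images exported as image files, one folder per strategy label (assumed layout).
    train_ds = datasets.ImageFolder("scanpaths/train", transform=preprocess)
    train_loader = DataLoader(train_ds, batch_size=16, shuffle=True)

    # Load VGG-16 pretrained on ImageNet and replace the final classifier layer.
    model = models.vgg16(weights=models.VGG16_Weights.IMAGENET1K_V1)
    model.classifier[6] = nn.Linear(model.classifier[6].in_features, NUM_STRATEGIES)

    criterion = nn.CrossEntropyLoss()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

    model.train()
    for images, labels in train_loader:  # one epoch shown for brevity
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()

A frozen VGG backbone could equally be used as a fixed feature extractor feeding a support vector machine, the second classification route the abstract mentions; that variant is omitted here.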

Cited by 8 publications (1 citation statement)
References: 58 publications
“…Using pre-trained convolutional networks did not change much during the last 2 years. We found that VGGs pre-trained on ImageNet are still popular; Byrne et al ( 2023a , b ) used VGG-16 and VGG-19, respectively, to predict a user's decision. Fuhl ( 2024 ) proposed a feature extraction network based on ResNet-12 and evaluated it on the Doves (Bovik et al, 2009 ), WherePeopleLook (Judd et al, 2009 ), and Gaze (Dorr et al, 2010 ) datasets.…”
Section: Discussion (citation type: mentioning)
Confidence: 99%