2013 Humaine Association Conference on Affective Computing and Intelligent Interaction
DOI: 10.1109/acii.2013.72

Mouse Trajectories and State Anxiety: Feature Selection with Random Forest

Abstract: Do users' mouse activities reveal their affective states, just as other bodily expressions, such as postures and gestures, signal emotions? When people are frustrated while trying to solve a puzzle or math problem, their frustration can be manifested in the way they use a computer mouse, such as pressing a button hard. But when a user is engaged in an innocuous and mundane task, what mouse activities provide a clue to detect affective states? To address these questions, we extracted 134 mouse trajectory variables in …
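The approach the title describes — extracting many trajectory variables and ranking them with a random forest — can be sketched roughly as below. This is a minimal illustration, not the paper's pipeline: the feature names and data are synthetic stand-ins for the 134 variables, and the anxiety label is simulated.

```python
# Hypothetical sketch of random-forest feature ranking for mouse features.
# All names and data here are synthetic assumptions, not from the paper.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n = 200

# Three made-up trajectory features; only "max_deviation" is informative.
max_deviation = rng.normal(0, 1, n)
speed_var = rng.normal(0, 1, n)
click_count = rng.normal(0, 1, n)
X = np.column_stack([max_deviation, speed_var, click_count])

# Simulated binary label (e.g., high vs. low state anxiety),
# driven almost entirely by max_deviation.
y = (max_deviation + 0.1 * rng.normal(0, 1, n) > 0).astype(int)

forest = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
importances = dict(zip(["max_deviation", "speed_var", "click_count"],
                       forest.feature_importances_))
ranked = sorted(importances, key=importances.get, reverse=True)
print(ranked)
```

With this construction the informative feature dominates the importance ranking; with real mouse data one would of course cross-validate before trusting the ordering.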


Cited by 35 publications (22 citation statements)
References 31 publications (28 reference statements)
“…We found no works that investigate implicit interaction with mediated sensing, in any pose. Works that use mediated sensing to analyse human behaviour typically aim at understanding usage patterns or emotions, for example, by looking at the trajectory of the mouse [Yamauchi 2013] or how users press the buttons on a gamepad [Sykes and Brown 2003]. Therefore, we attribute the lack of studies of implicit user behaviour when interacting with foot-operated devices to the limited number of use cases for such input devices and the small populations to which they are targeted.…”
Section: Discussion and Future Directions
confidence: 99%
“…Mouse movement features, such as the area under the curve or the deviation from the straight line between the start and end point, were found to be indicative of perceptual and numerical judgment (Song and Nakayama, 2008; Chapman et al., 2010; Xiao and Yamauchi, 2014; Yamauchi and Xiao, 2017), semantic categorization (Dale et al., 2007), linguistic judgment (Spivey et al., 2005; Farmer et al., 2007), and racial and gender judgment of morphed face pictures (Freeman and Ambady, 2009; Freeman et al., 2010). Additionally, mouse movement has been found to be related to attitudinal ambivalence toward certain topics (e.g., abortion) (Wojnowicz et al., 2009; Schneider et al., 2015), uncertainty in economic choices (Calluso et al., 2015), as well as general emotional states, such as anxiety (Yamauchi, 2013; Yamauchi and Xiao, 2017).…”
Section: Introduction
confidence: 99%
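The two trajectory features named in the excerpt above — maximum deviation from the straight start-to-end line, and the area swept between the cursor path and that line — can be computed as follows. This is a minimal sketch under my own sign convention; function names are illustrative, and the start and end points are assumed distinct.

```python
# Hypothetical sketch of two mouse-trajectory features cited above:
# maximum deviation (MD) and area under the curve (AUC) relative to
# the straight line from the first to the last sampled point.
import math

def signed_deviations(points):
    """Perpendicular distance of each sample from the start-end line.

    Assumes at least two points with distinct start and end.
    """
    (x0, y0), (x1, y1) = points[0], points[-1]
    dx, dy = x1 - x0, y1 - y0
    norm = math.hypot(dx, dy)
    return [((y - y0) * dx - (x - x0) * dy) / norm for x, y in points]

def max_deviation(points):
    """Largest-magnitude deviation from the straight line (signed)."""
    return max(signed_deviations(points), key=abs)

def area_under_curve(points):
    """Trapezoidal area between the trajectory and the straight line,
    integrated along the point's projection onto that line."""
    (x0, y0), (x1, y1) = points[0], points[-1]
    dx, dy = x1 - x0, y1 - y0
    norm = math.hypot(dx, dy)
    devs = signed_deviations(points)
    proj = [((x - x0) * dx + (y - y0) * dy) / norm for x, y in points]
    return sum((proj[i + 1] - proj[i]) * (devs[i] + devs[i + 1]) / 2
               for i in range(len(points) - 1))

# Usage: a path that bows 5 units away from a 10-unit straight line.
path = [(0, 0), (5, 5), (10, 0)]
md = max_deviation(path)    # 5.0
auc = area_under_curve(path)  # 25.0
```

Real mouse-tracking pipelines typically resample trajectories to a fixed number of time steps before computing such features, so that values are comparable across trials.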
“…The mouse is another common input device, and mouse features are also effective in emotion recognition, whether used separately or in combination with keystroke features [4,84]. Depending on the type of mouse event, mouse features can be divided into three categories: click-related features (M1~M6), movement-related features (M7~M19), and other features (M20, M21) [45,85]. The most commonly used mouse features are summarized in Table IV.…”
Section: B: Mouse Features
confidence: 99%