2020
DOI: 10.1007/978-3-030-39512-4_152

Analysis of Facial Expressions Explain Affective State and Trust-Based Decisions During Interaction with Autonomy

Cited by 6 publications (8 citation statements)
References 14 publications
“…Additional information about participants, task descriptions, and measures can be found in Drnec and Metcalfe (2016) and Gremillion et al (2016). Furthermore, additional analysis of the whole dataset can be found in Neubauer et al (2020) and the previously mentioned articles.…”
Section: Testbed (mentioning)
Confidence: 99%
“…Trust is only one of multiple factors that lead to a decision to use automation, but it is also difficult to operationalize (Drnec et al, 2016; Neubauer et al, 2020). Some researchers believe that reliance is a more appropriate basis for understanding human-automation dynamics and more easily operationalized (Hoff & Bashir, 2015; Schaefer et al, 2016).…”
Section: Introduction (mentioning)
Confidence: 99%
“…For example, eye tracking data was used to measure attention, GSR was used to measure arousal, and heart rate and heart rate variability were used to measure cognitive workload in the interaction between the driver and automated vehicles (Du et al, 2020). Facial expressions were used to understand participants' dimensional emotional states (Zhou, Kong, et al, 2020) and trust in human-automation interaction (Neubauer et al, 2020). Due to the fact that one measure is not able to reliably measure emotion or cognition, many researchers often use multiple measures together.…”
Section: Affect and Cognition Measurement (mentioning)
Confidence: 99%
“…Lee et al found 4 cues, namely touching the face and hands, crossing arms, and leaning back, that together were predictive of lower trust [16]. Neubauer et al showed that face expressivity of basic emotions in tandem with individual factors can be used to model and predict response patterns of users toward automation [17]. Thus, there is much potential in using facial expressions for trust measurement which remains untapped.…”
Section: Introduction (mentioning)
Confidence: 99%