2020
DOI: 10.1109/thms.2019.2931755

Influence of Culture, Transparency, Trust, and Degree of Automation on Automation Use

Abstract: The reported study compares groups of 120 participants each from the United States, Taiwan, and Turkey, interacting with versions of an automated path planner that vary in transparency and degree of automation. The nationalities were selected in accordance with the theory of Cultural Syndromes as representatives of Dignity (US), Face (Taiwan), and Honor (Turkey) cultures and were predicted to differ in readiness to trust automation, degree of transparency required to use automation, and willingness to use syst…

Citations: Cited by 37 publications (14 citation statements)
References: 34 publications (52 reference statements)
“…This breadth of domains and environments is matched with a similar depth into various individual and group level differences through examining the effects of age (McBride et al, 2010), gender (Pak et al, 2014), personality (Rupp et al, 2018), expertise (Madhavan & Wiegmann, 2007a), culture (Chien et al, 2019), working memory capacity (Rovira et al, 2017) and genetics (Parasuraman et al, 2012) on user trust. As a result, theories and models have been developed which not only predict a user’s level of trust but also how trust interacts with other factors (e.g., workload, SA) to influence human behavior (Chiou & Lee, 2021; de Visser et al, 2020; de Visser, Pak, et al, 2018b; Ghazizadeh et al, 2012; Hoff & Bashir, 2015).…”
Section: The Case for Trust Assessment in Human-Machine Interaction (mentioning)
confidence: 99%
“…Where the conditions determining a partner's actions are not evident, however, attributions involving capabilities, objective functions (goals), or other characteristics of the machine may be needed to establish a basis for cooperation. For example, in Chien et al (2020), compliance doubled when participants were shown why a path planner wanted to re-route a UAV while at the same time an earlier correlation between trust and compliance was eliminated.…”
Section: Automation Versus Autonomy (mentioning)
confidence: 99%
“…Most recent HRI research suggests that individual differences in trust might stem from whether the robot is perceived as an advanced tool, a human-like teammate [11], and/or by cultural differences [12]. Robot errors are found to have a profound and lasting effect on trust [13].…”
Section: Related Work, A. Trust in Automation and Human-Robot Interaction (mentioning)
confidence: 99%