Abstract. Trust is conceived as an attitude leading to intentions that result in user actions involving automation. It is generally believed that trust is dynamic and that a user's prior experience with automation affects future behavior indirectly, by causing changes in trust. Additionally, individual differences and cultural factors have frequently been cited as contributors to trust beliefs about using and monitoring automation. The presented research focuses on modeling a human's trust when…
“…A recent effort [21,23] has led to a general measure of trust in automation validated across large populations in three diverse cultures, the US, Taiwan, and Turkey, as representative of Dignity, Face, and Honor cultures [63]. The cross-cultural measure of trust is consistent with the three dimensions (performance, purpose, process) of [58,81] and contains two 9-item scales, one measuring the propensity to trust as in [46] and the other measuring trust in a specific system.…”
Section: Instruments For Measuring Trust
“…Prior to beginning, participants were asked to complete the trust instrument we developed in earlier research [15], [16], the Big Five inventory [17], and the CVSCALE [18], measuring initial trust in automation, personality traits, and cultural characteristics, respectively. Participants then took an interactive training tutorial (approximately 20 minutes) to practice control operations and, based on the randomly assigned condition, learn all aspects of the simulation with the assisted applications (i.e., target finder and/or conflict detector).…”
Abstract—The use of autonomous systems has been increasing rapidly in recent decades. To improve human-automation interaction, trust has been closely studied. Research shows trust is critical in the development of appropriate reliance on automation. To examine how trust mediates human-automation relationships across cultures, the present study investigated the influences of cultural factors on trust in automation. Theoretically guided empirical studies were conducted in the U.S., Taiwan, and Turkey to examine how cultural dynamics affect various aspects of trust in automation. The results revealed significant cultural differences in human trust attitudes toward automation.
“…For a subjective measure, perceived trust was evaluated using an online questionnaire with a 21-point Likert scale (very low, −10; neutral, 0; very high, +10). Perceived trust was defined in this study as the expectation that the auto-correction system would work as expected (for more on measuring trust in automation, see Chien et al. 2014; Jeong et al. 2018a; Jian et al. 2000). Since trust is a context-dependent variable between a human and automation, the trust measured in this study is valid only for auto-correction tasks.…”
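A measure like the one above can be aggregated in a straightforward way. The following is a minimal sketch, using hypothetical ratings (not data from the study), of how 21-point Likert trust responses on the −10 to +10 scale might be validated and averaged per condition:

```python
# Minimal sketch with hypothetical data: aggregating 21-point Likert
# trust ratings (very low = -10, neutral = 0, very high = +10).
from statistics import mean

# Hypothetical per-trial ratings for two conditions (illustrative only).
ratings = {
    "reliable":   [2, 3, 4, 5, 6],
    "unreliable": [1, -2, -4, -6, -7],
}

def mean_trust(scores):
    """Check each rating lies on the -10..+10 scale, then average."""
    assert all(-10 <= s <= 10 for s in scores), "rating off the 21-point scale"
    return mean(scores)

for condition, scores in ratings.items():
    print(f"{condition}: mean perceived trust = {mean_trust(scores):+.1f}")
```

The condition names and rating values here are assumptions for illustration; the study itself reports trust trajectories over repeated trials rather than a single mean.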
Although successful automation can enrich people's lives, prolonged use of unreliable automation has negative impacts on users. This study examines how prolonged use of an unreliable auto-proofreading system affects users' trust levels and physiological responses. Nineteen native English speakers completed tasks correcting grammatical errors in each of 20 sentences under reliable and unreliable proofreading conditions. During the tasks, the participants' electrodermal activity (EDA) was recorded and their perceived trust in the proofreading system was evaluated. As the unreliable auto-proofreading system worked improperly, perceived trust decreased gradually and a noticeably increasing pattern of EDA signals was observed. In contrast, with the reliable auto-proofreading system, perceived trust increased gradually and a stable or decreasing pattern of EDA signals was observed. Prolonged use of an unreliable system aggravates anxiety, causing an increase in distrust and in EDA signals. The findings provide empirical data that can inform the design of fail-safe features of automation that minimize users' anxiety levels.