2018
DOI: 10.1177/0018720818811190
Social Conformity Effects on Trust in Simulation-Based Human-Robot Interaction

Abstract: Objective: We investigated the co-acting influences of communication and social conformity on trust in human-robot interaction. Background: Previous work has investigated aspects of the robot, the human, and the environment as influential factors in the human-robot relationship. Little work has examined the conjoint effects of social conformity and communication on this relationship. As social conformity and communication have been shown to affect human-human trust, there are a priori reasons to believe that t…

Cited by 13 publications (2 citation statements) · References 45 publications
“…In terms of human-related factors, individual differences in personal attitudes and cognition may impact trust, and therefore team performance. Pre-existing negative attitudes toward robots have been shown to coincide with decreased ratings of trust in robots (Volante et al, 2019). Based on this, it is important to understand how attitudes toward robots impact performance and trust with different levels of autonomy.…”
Section: Trust in Human-Autonomous Teams (confidence: 99%)
“…Systems with sufficient intelligence to operate autonomously are difficult for the human operator to understand and anticipate, implying that tolerating ambiguity and complexity may be important for trust ( Matthews et al, 2016 ). In addition, some systems exert social agency, i.e., they can evaluate the status of the human operator and respond with teaming behaviors such as taking on additional tasks, signaling their goals and intended actions, and communicating their willingness to support the human ( Wang and Lewis, 2007 ; Chen and Barnes, 2014 ; Volante et al, 2019 ). Wynne and Lyons (2018) highlight the need for further research on humans’ perceptions of intelligent robots.…”
Section: Introduction (confidence: 99%)