2014 IEEE International Inter-Disciplinary Conference on Cognitive Methods in Situation Awareness and Decision Support (CogSIMA 2014)
DOI: 10.1109/cogsima.2014.6816556
The influence of modality and transparency on trust in human-robot interaction

Cited by 52 publications (31 citation statements). References 12 publications.
“…Explanations improve usability and let the users understand what is happening, building the users' trust and generating calibrated expectations about the system's capabilities (Westlund and Breazeal, 2016). If systems can explain their reasoning, they should be easily understood by their users, and humans are more likely to trust systems that they understand (Sanders et al, 2014; Sheh, 2017; Fischer et al, 2018; Lewis et al, 2018).…”
Section: Transparency as Explainability (mentioning)
confidence: 99%
“…In Experiment 1, we are examining the effects of varying levels of information (LOI) provided to a participant by a robotic teammate on the participant's trust and workload, as well as the influence of the modality of communication (auditory, graphic, or video). Partial results of this work may be found in Sanders et al [68]. In Experiment 2, which is also ongoing, we are again varying LOI and modality and measuring trust and workload, but this experiment also includes a malfunction condition wherein the communication between the participant and the robot takes place during incidents of malfunction.…”
Section: Trust and Transparency in HRI (mentioning)
confidence: 91%
“…The robot's action, communication, and transparency can increase not only task performance, according to Lakhmani et al [17], but also operators' well-being in terms of mental workload and situation awareness, as noted by Hayes and Shah [18]. Furthermore, Sanders et al [15] propose that a consistent and constant flow of information can have a positive impact on trust. Trust is essential for efficient task completion; according to Wright et al [19], too little trust can result in technology rejection, while too much trust can lead to complacency.…”
Section: Related Work (mentioning)
confidence: 98%
“…From the user's point of view, any new technology needs to be accepted by the workforce to be effective. A lack of trust can be caused by a lack of transparency in robot behaviour, as shown in the work of Sanders et al and Wortham and Theodorou [15,16]. People are likely to feel more comfortable and confident working with a robot if they know how it behaves and can anticipate what it will do next.…”
Section: Related Work (mentioning)
confidence: 99%