2019
DOI: 10.1177/1071181319631414
Reading the Mind in Robots: How Theory of Mind Ability Alters Mental State Attributions During Human-Robot Interactions

Abstract: This study examined how human-robot interaction is influenced by individual differences in theory of mind ability. Participants engaged in a hallway navigation task with a robot over a number of trials. The robot's display and its proxemics behavior were manipulated, and participants made mental state attributions across trials. Participants' theory of mind ability was also assessed. Results show that proxemics behavior and robotic display characteristics differentially influence the degree to which indiv…

Cited by 7 publications (4 citation statements)
References 27 publications
“…Although the intention perceived in Nio sculptures primarily belongs to a specific sociocultural context (an intention directed toward malevolent spirits attempting to enter the temple door), we suggest that intention is “conveyed” solely through the facial expression, even when the perceiver is unaware of the spiritual narrative behind the scenes. Researchers in human-robot interaction have already exploited this idea that humans can detect emotion and intention in a computational agent (Mutlu et al., 2009; Kim and Suzuki, 2012; Schreck et al., 2019). However, in human-robot interaction situations, intentions and emotions are usually expressed through different modalities, either verbal or non-verbal, such as body or facial feature movement.…”
Section: From the Icon to the Index
confidence: 99%
“…Other interactive protocols showed that humans tend to take into account the behavior and, arguably, the internal states of SRs when trying to coordinate or synchronize their actions with them during joint task execution [Xu et al. 2016, Ciardo et al. 2020]. Schreck et al. [2019], for instance, evaluated whether an SR's social behavior, the type of social signals it displayed, and its proxemics (how close the SR would get to people) affected the likelihood of ToM-related interpretations. They found that increased experience with a robot through continued interaction decreased the likelihood of mental state attributions, unless the robot showed more socially active behaviors (getting close to people when interacting) as well as more human-like expressions, which triggered stable levels of mental state attributions across the experiment.…”
Section: Many Studies Have Examined to What Extent Humans Interpret Social Signals Displayed by…
confidence: 99%
“…The conventions required for robot navigation should ensure safe and understandable behavior. We have further divided these into three areas (Rossi et al., 2020; Schreck et al., 2019; Vega et al., 2019a), i.e., social behavior, proxemics, and social robot abilities, as discussed below.…”
Section: Social Conventions
confidence: 99%