2021
DOI: 10.31234/osf.io/p3b9t
Preprint
Human but not robotic gaze facilitates action prediction

Abstract: Interpreting the behaviour of autonomous machines will be a daily activity for future generations. Yet, surprisingly little is currently known about how people ascribe intentions to human-like and non-human-like agents or objects. In a series of six experiments, we compared people's ability to extract non-mentalistic (i.e., where an agent is looking) and mentalistic (i.e., what an agent is looking at; what an agent is going to do) information from identical gaze and head movements performed by humans, human-like …

Cited by 0 publications
References 45 publications