How do people respond to computer-generated versus human faces? A systematic review and meta-analyses
2023 · DOI: 10.1016/j.chbr.2023.100283

Cited by 11 publications (9 citation statements: 0 supporting, 9 mentioning, 0 contrasting)
References 134 publications
“…We would expect a similar pattern if our experiments were repeated with different levels of auditory noise. While our study only examined speech perception, a similar approach could be taken to compare real and synthetic faces in other domains, such as emotions and looking behavior (Miller et al., 2023).…”
Section: Discussion (mentioning)
confidence: 99%
“…Representing realistic and, at the same time, dynamic and responsive human avatars in virtual environments remains a demanding task (Di Natale et al., 2023; Hepperle et al., 2022). Virtual faces created with today's advanced digital graphics technologies, although appearing remarkably realistic, are still perceived as not fully human, which is reflected at both the behavioral and neural levels (uncanny valley effect; Di Natale et al., 2023; Hepperle et al., 2022; Miller et al., 2023; Schindler et al., 2017; Sollfrank et al., 2021).…”
Section: Discussion (mentioning)
confidence: 99%