2020
DOI: 10.1007/s12369-020-00688-z
Is the Social Desirability Effect in Human–Robot Interaction overestimated? A Conceptual Replication Study Indicates Less Robust Effects

Abstract: The “Computers are social actors” (CASA) assumption (Nass and Moon in J Soc Issues 56:81–103, 2000. https://doi.org/10.1111/0022-4537.00153) states that humans apply social norms and expectations to technical devices. One such norm is to distort one’s own responses in a socially desirable direction during interviews. However, findings for such an effect are mixed in the literature. Therefore, a new study on the effect of social desirability bias in human–robot evaluation was conducted, aiming for a conceptual replication…

Cited by 14 publications (16 citation statements) · References 61 publications
“…These situational aspects include but are not limited to an assured anonymity of response, decreased or no fear of reprisals for answering accurately, and decreased pressure from social-desirability bias ( Durmaz et al., 2020 ; Short et al., 2009 ), i.e., the tendency of respondents “to stretch the truth in an effort to make a good impression” ( Martin and Nagao, 1989 , p. 72). Moreover, although it is intuitive that computer-mediated solicitation of user-response data could decrease social desirability bias, the results are actually mixed ( Booth-Kewley et al., 1992 ; Lautenschlager and Flaherty, 1990 ; Leichtmann and Nitsch, 2020 ). Lautenschlager and Flaherty (1990) found that social desirability bias increased when participants used computers to self-report compared to pen and paper, but Booth-Kewley et al.…”
Section: Discussion and Future Research
confidence: 99%
“…The studies presented in this paper represent an iterative study process that promotes conceptual replication. Lack of replication is a critical issue to address in HRI studies ( Irfan et al, 2018 ; Hoffman and Zhao, 2020 ; Leichtmann and Nitsch, 2020b ; Ullman et al, 2021 ). Through conceptual replication, this paper demonstrates how consistent the results are for VUIs with higher social embodiment.…”
Section: Discussion
confidence: 99%
“…In recent years, conversations around replicability in HRI have grown as researchers increasingly seek to understand whether there is a replication crisis in HRI ( Irfan et al, 2018 ; Belhassein et al, 2019 ; Hoffman and Zhao, 2020 ; Leichtmann and Nitsch, 2020b ; Ullman et al, 2021 ). It is critical for the HRI community to address this issue, as a replication crisis can create fundamental problems of trust that can “cast serious doubt on previous research and undermine the public’s trust in research studies in general” ( Hoffman and Zhao, 2020 ).…”
Section: Related Work
confidence: 99%