2021
DOI: 10.1016/j.chb.2021.106727

Dual humanness and trust in conversational AI: A person-centered approach

Cited by 50 publications (14 citation statements)
References 75 publications
“…D2 suggests that for consumers to perceive an opportunity to interact, robots should have proximity to humans as well as a clear ability to interact. The findings of this study provide insights into the growing body of research on social interactions and voice‐based technologies (Hu et al, 2021; Pitardi et al, 2021; Whang & Im, 2020), revealing a preference for medium or high levels of opportunity for social contact as represented by robots who were nearby humans and potentially able to hold a conversation (D2), but no significant preference for active social interaction with robots in the form of conversation. Placing this finding within previous literature, it is proposed that within this study, some participants saw the robots as human‐like enough to merit ingroup status and the potential for interactions this can bring (Vaes et al, 2003, 2012), while some did not.…”
Section: Discussion (mentioning)
confidence: 71%
“…Conversational agents, one of the most promising AI applications (Carter, 2018), are disruptive innovations attracting extensive attention from practitioners and researchers. Considerable research focuses on chatbots, a representative form of conversational agent, and examines user responses to their human-like attributes (Hu et al., 2021; Luo et al., 2019; Schanke et al., 2021; Sheehan et al., 2020). For instance, Schanke et al. (2021) investigated how consumers who want to sell used clothes respond to customer service chatbots.…”
Section: Literature Review (mentioning)
confidence: 99%
“…Trust is frequently mentioned in mutual relationships (Cheng et al., 2017; Jiang & Lau, 2021; Zhao et al., 2020), such as the relationship between buyers and sellers in e-commerce (Lu et al., 2016), between community members (Cheng et al., 2019b), between demanders and suppliers in ridesharing (Cheng et al., 2019a, 2020), and between humans and AI agents in human-AI interaction (Hu et al., 2021). It refers to a positive and confident expectation of others’ behaviors (McKnight et al., 2002).…”
Section: Literature Review (mentioning)
confidence: 99%
“…Voice commerce, in which consumers make purchases using a voice assistant (VA), is on the rise. A VA is a digital agent that uses natural language to communicate with consumers via a human-like voice (Hu et al., 2021). Amazon's Alexa, Google Assistant, and Apple's Siri are popular VAs.…”
mentioning
confidence: 99%