2021
DOI: 10.1002/mar.21498
When do you trust AI? The effect of number presentation detail on consumer trust and acceptance of AI recommendations

Abstract: When do consumers trust artificial intelligence (AI)? With the rapid adoption of AI technology in the field of marketing, it is crucial to understand how consumer adoption of the information generated by AI can be improved. This study explores a novel relationship between number presentation details associated with AI and consumers' behavioral and evaluative responses toward AI. We theorized that consumer trust would mediate the preciseness effect on consumer judgment and evaluation of the information provided…

Cited by 86 publications (97 citation statements)
References 91 publications (130 reference statements)
“…The present research makes several key contributions. First, we confirm the pivotal role of trust in AI adoption [16, 23, 24, 38], particularly in the context of medical decisions [13–15], here in the context of a global health pandemic. Our results suggest that a mere one-unit increase in the measure of perceived trust in the medical AI results in a seven-fold increase in the likelihood of people choosing medical AI over a human physician.…”
Section: Discussion (supporting)
confidence: 66%
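The seven-fold figure in the statement above reads like an odds ratio from a logistic regression: a one-unit increase in the trust measure multiplies the odds of choosing the medical AI by exp(β) ≈ 7, i.e. β ≈ ln 7 ≈ 1.95. The sketch below illustrates that interpretation with hypothetical numbers (the intercept and trust scores are made up for illustration; they are not from the cited study):

```python
import math

# Illustrative assumption: a logistic-regression coefficient beta on a
# predictor means each one-unit increase in that predictor multiplies
# the odds of the outcome by exp(beta) (the odds ratio).
beta = math.log(7)  # coefficient implying a seven-fold odds ratio (~1.946)

def odds_choose_ai(trust, intercept=-4.0, coef=beta):
    """Odds of choosing the medical AI at a given trust score.

    The intercept is a hypothetical baseline, not an estimate from the paper.
    """
    return math.exp(intercept + coef * trust)

# The ratio of odds one trust-unit apart is exp(beta), independent of baseline:
ratio = odds_choose_ai(3.0) / odds_choose_ai(2.0)
print(round(ratio, 2))  # → 7.0
```

The key point is that the multiplier is constant across the trust scale: going from 2 to 3 changes the odds by the same factor as going from 5 to 6.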
“…This is generalized by research showing that AI algorithms are preferred over humans in decisions in which utilitarian goals are activated [20], suggesting the compensatory nature of medical AI adoption, and that “people may prefer algorithmic to human judgment […] when interacting with the provider might endanger his or her life (i.e., triage services for COVID-19)” [21, p. 447]. In addition, several studies suggest that the adoption of AI agents depends on psychological constructs, such as consumer trust [22–24], and personality traits, such as open-mindedness [25]; yet, mistrust in medical AI has never been tested in the context of a global health pandemic, nor the potential role of other drivers, such as mistrust in human physicians, and perceived uniqueness neglect from human physicians. Indeed, people who experience various forms of marginalization may develop mistrust towards other humans’ decisions [26], and may even increase their interactions with artificial products that mimic human intelligence [27].…”
Section: Introduction (mentioning)
confidence: 99%
“…We also find that there is no difference in perceptions of personalisation, meaning that an AI influencer is perceived as being as able to personalise – content or recommendations – as a human influencer. We propose that this is a spill-over effect from consumers becoming increasingly comfortable with AI recommendation systems (Kim et al., 2021), as in the case of Netflix, which are seen as being able to learn from behaviour and make intelligent recommendations. It may be that developers of AI influencers need to play up this element as a key competitive advantage of AI influencers.…”
Section: Discussion (mentioning)
confidence: 99%
“…The fact that there is no difference in personalisation means an AI influencer is seen as being able to personalise content or recommendations in a manner like a human influencer. This is interesting and likely driven by consumers becoming increasingly comfortable with AI recommendation systems (Kim et al., 2021); it may be that the perceived efficacy of such systems spills over to other domains of AI, such as AI influencers. We also find that WOM intentions are higher for an AI influencer regardless of agency.…”
Section: Experimental Studies (mentioning)
confidence: 99%
“…Second, recent advances in digital transformation, such as artificial intelligence, machine learning and big data, have fueled the proliferation of digital marketing practices. These advances have not only affected how consumers live but also changed how firms do business and interact with consumers, including retailing (Kim et al., 2021a; Reinartz et al., 2019), business strategy (Huang and Rust, 2021; Verhoef et al., 2021), service (Zaki, 2019) and personal engagement (Kumar et al., 2019). The leading global and local firms must prioritize “digital transformation” for their businesses.…”
(mentioning)
confidence: 99%