2020
DOI: 10.1016/j.obhdp.2020.01.008
Slow response times undermine trust in algorithmic (but not human) predictions

Abstract: Algorithms consistently perform well on various prediction tasks, but people often mistrust their advice. Here, we demonstrate one component that affects people's trust in algorithmic predictions: response time. In seven studies (total N = 1928 with 14,184 observations), we find that people judge slowly generated predictions from algorithms as less accurate and are less willing to rely on them. This effect reverses for human predictions, where slowly generated predictions are judged to be more accurate. I…

Cited by 53 publications (34 citation statements)
References 33 publications
“…This is not to say that people conflate algorithmic utterances with human ones. On the contrary, research shows that humans use different heuristics to evaluate an algorithm's utterance, compared with a human utterance (Efendic et al, 2020). Parallel points apply even if we think of AI applications as a kind of instrumentation rather than as an author of assertions (Freiman & Miller, 2020).…”
Section: Normative Trust in AI
confidence: 99%
“…We should not automatically assume that people will make any of these inferences about machines that they perceive as Thinking Slow. For example, when it takes a human a long time to generate a prediction, people trust that prediction more, but they trust the prediction 'less' if it took a 'machine' a long time to generate it [33]. One possibility is that people judge humans and machines using different benchmarks for decision time.…”
Section: Trust in Machines
confidence: 99%
“…More specifically, transforming maintenance systems and operations into ones that rely on data-driven technologies brings many challenges. One important challenge concerns the design of (data-driven) maintenance systems that organisational members are willing to trust and use [74,75]. That is, in order to successfully integrate these promising technologies into organisations, it is of critical importance to understand when and why users are hesitant to adopt these new technologies in their daily working routines, and how we can stimulate their effective usage.…”
Section: Human Factors
confidence: 99%