2014
DOI: 10.1007/978-3-319-08108-3_12
Having the Final Say: Machine Support of Ethical Decisions of Doctors

Cited by 6 publications (5 citation statements). References 48 publications.
“…Computers, robots, algorithms, and other forms of automation are quickly becoming a fundamental part of many decision-making processes in both personal and professional contexts. From forecasting product sales (Fildes, Goodwin, Lawrence, & Nikolopoulos, 2009) to informing medical and management decisions (Esmaeilzadeh, Sambasivan, Kumar, & Nezakati, 2015; Inthorn, Tabacchi, & Seising, 2015; Prahl, Dexter, Braun, & Van Swol, 2013), people frequently seek and receive advice from nonhuman (automation) sources when facing important decisions. Yet, despite seeking advice from automation, decision makers frequently discount advice obtained from it, especially when compared to advice from a human advisor (Önkal, Goodwin, Thomson, Gönül, & Pollock, 2009).…”
Section: Introduction (mentioning)
confidence: 99%
“…This is an individual's tendency to overweight one's own opinion relative to that of an advisor when deciding whether to accept advice (Harvey and Fischer 1997; Yaniv and Kleinberger 2000; for a review see Bonaccio and Dalal 2006). In addition, even though individuals seek automated advice to aid important high-stakes decisions in several contexts, including medical ones (e.g., Esmaeilzadeh et al 2015; Inthorn, Tabacchi, and Seising 2015;), they suffer from "algorithm aversion." This is a tendency to irrationally, and systematically, discount advice that is generated automatically and communicated by computer algorithms (e.g., Dietvorst, Simmons, and Massey 2015; Goodwin, Gönül, and Önkal 2013).…”
(mentioning)
confidence: 99%
“…Computers and algorithms are becoming increasingly important components of decision-making processes (Esmaeilzadeh et al 2015; Inthorn et al 2015). Although individuals consistently rely on technological support to make decisions, they tend to rely less on algorithm-generated information than on human-generated information (Önkal et al 2009).…”
Section: Hypothesis (mentioning)
confidence: 99%