2017
DOI: 10.7249/rr1744
An Intelligence in Our Image: The Risks of Bias and Errors in Artificial Intelligence

Cited by 128 publications (62 citation statements)
References 20 publications (29 reference statements)
“…102 Research on social media networks and communities suggests that the exposure these networks provide to different ideas can be beneficial. Research into how social media usage affects polarization reveals that, among all social media users, polarization increases the least among groups that spend the most time on social media, possibly because these groups are exposed to a wider range of ideas and perspectives, even if not all of these ideas and per-…”

99 Osoba and Welser, 2017.
100 Flaxman, Goel, and Rao, 2016; Dimitar Nikolov, Diego F. M. Oliveira, Alessandro Flammini, and Filippo Menczer, "Measuring Online Social Bubbles," PeerJ Computer Science, Vol. 1, 2015.
101 Eytan Bakshy, Solomon Messing, and Lada A. Adamic, "Exposure to Ideologically Diverse News and Opinion on Facebook," Science, Vol. …

Section: Filters and Algorithms (mentioning)
Confidence: 99%
“…At the same time, algorithms are programmed by humans and humans are fallible. 71 Errors might occur if an AI programmer fails to adjust for regional or state-wide differences in the type of emergency room cases that are frequent in a locale (e.g., frostbite or heatstroke). 72 The effects of bias, particularly unconscious bias (sexism, racism, heterosexism, and so forth), may influence the programming people develop.…”

Section: Sustainable Development of AI in Health Care (mentioning)
Confidence: 99%
“…There is a data diet vulnerability found in many current autonomous learning systems (Osoba and Welser, 2017). AI systems are typically only as good as the data on which they are trained.…”

Section: National Security (mentioning)
Confidence: 99%
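The "data diet" point above (a model is only as good as its training data) can be made concrete with a minimal, hypothetical sketch: a trivial "learner" that memorizes the majority label per group will confidently misjudge any group that is underrepresented in its data diet. The data and function names here are illustrative assumptions, not from the cited report.

```python
# Hypothetical illustration of "data diet" vulnerability: a toy learner
# that memorizes the majority label seen for each group.
from collections import Counter, defaultdict

def train_majority(samples):
    """samples: list of (group, label) pairs.
    Returns a dict mapping each group to its most common label."""
    by_group = defaultdict(list)
    for group, label in samples:
        by_group[group].append(label)
    return {g: Counter(labels).most_common(1)[0][0]
            for g, labels in by_group.items()}

# Skewed data diet: group "b" appears only twice, and only with label 0,
# so the learner's verdict on "b" rests on almost no evidence.
biased_data = [("a", 1)] * 90 + [("a", 0)] * 10 + [("b", 0)] * 2
model = train_majority(biased_data)
print(model)  # {'a': 1, 'b': 0}
```

However accurate the procedure, the output for group "b" is determined entirely by two samples; richer learners trained on skewed data inherit the same failure mode.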
“…Artificial agents' attention frames can be more flexible, and the available scope often improves with new innovations in information technology. 8 "Big data" (now including data streams from the IoT) (Osoba and Welser, 2017). Our previous discussion of automation bias (Osoba and Welser, 2017) highlighted the documented human tendency to ascribe more credibility to outcomes and decisions produced by artificial agents without accounting for the error and bias risks inherent in these agents.…”

Section: General Themes Identified and Suggestions (mentioning)
Confidence: 99%