Ethics of Data and Analytics 2022
DOI: 10.1201/9781003278290-38

Bias in Criminal Risk Scores Is Mathematically Inevitable, Researchers Say
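
As context for the title's claim: the impossibility argument the article reports (formalized in work such as Chouldechova, 2017) reduces to an accounting identity over the confusion matrix. For a binary risk tool applied to a group with base rate $p$ of reoffending, positive predictive value $\mathrm{PPV}$, false negative rate $\mathrm{FNR}$, and false positive rate $\mathrm{FPR}$, the standard derivation (assuming the usual confusion-matrix definitions) gives

$$\mathrm{FPR} \;=\; \frac{p}{1-p}\cdot\frac{1-\mathrm{PPV}}{\mathrm{PPV}}\cdot\bigl(1-\mathrm{FNR}\bigr).$$

When two groups differ in base rate $p$, a score calibrated to the same $\mathrm{PPV}$ in both groups cannot also equalize both error rates across them; this is the sense in which some form of bias is mathematically inevitable.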

Cited by 37 publications (24 citation statements). References: 0 publications.
“…But their reliance on questionable data calls into question their potential to address these problems, particularly those related to data harms. As already noted, several studies point to the intractability of such harms (Angwin and Larson, 2016; Hao and Stray, 2019; Lum and Isaac, 2016; Starr, 2014). Therefore, the frames may evoke positive sociotechnical imaginaries that nevertheless obscure the well-documented social costs of data harms, particularly the reproduction of historical inequalities such as systemic biases and the negative labelling of affected groups (Eaglin, 2017; Ugwudike, 2020).…”
Section: What Explains the Different Frames Identified?
confidence: 93%
“…This can be triggered by predictive policing algorithms that are prompted by racially biased arrest data to designate their geographical region as ‘high crime’ and in need of enhanced policing activity (Ferguson, 2017; Harcourt, 2015; Lum and Isaac, 2016). Minorities are also more vulnerable than others to overprediction of recidivism risks by algorithms relying on similar data (Angwin and Larson, 2016; Eaglin, 2017; Hannah-Moffat, 2018; Hao and Stray, 2019; Ugwudike, 2020).…”
Section: The Critical Scholarship On Data-driven Technologies
confidence: 99%
“…The Conditional Distance Correlation (CDC) test (Wang et al., 2015) … We apply our methods to a loan application dataset from a fintech company, the adult income dataset from the UCI Machine Learning Repository, and the COMPAS recidivism data from ProPublica (Angwin and Larson, 2016).…”
Section: Fairness Test
confidence: 99%
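
To make the quoted setup concrete, below is a minimal sketch of a distance-correlation permutation test for dependence between a sensitive attribute and a risk score. This is an unconditional simplification, not the CDC test of Wang et al. (2015), which additionally conditions on legitimate covariates; the data here are synthetic stand-ins for COMPAS-style columns.

```python
import numpy as np

def _center(d):
    # Double-center a pairwise distance matrix (subtract row/column means, add grand mean).
    return d - d.mean(axis=0, keepdims=True) - d.mean(axis=1, keepdims=True) + d.mean()

def dcor(x, y):
    """Sample distance correlation (Szekely et al., 2007) between two 1-D arrays."""
    x = np.asarray(x, dtype=float).reshape(-1, 1)
    y = np.asarray(y, dtype=float).reshape(-1, 1)
    a = _center(np.abs(x - x.T))          # centered distance matrix for x
    b = _center(np.abs(y - y.T))          # centered distance matrix for y
    dcov2 = (a * b).mean()                # squared distance covariance
    denom = np.sqrt((a * a).mean() * (b * b).mean())
    return np.sqrt(max(dcov2, 0.0) / denom) if denom > 0 else 0.0

def permutation_test(x, y, n_perm=1000, seed=0):
    """P-value for H0: x independent of y, by permuting y and re-computing dcor."""
    rng = np.random.default_rng(seed)
    observed = dcor(x, y)
    null = np.array([dcor(x, rng.permutation(y)) for _ in range(n_perm)])
    return observed, (1 + (null >= observed).sum()) / (1 + n_perm)

# Hypothetical usage on COMPAS-style columns: a binary race indicator and a
# risk score. A real fairness analysis would condition on legitimate covariates
# (the "conditional" part of the CDC test), which this sketch omits.
race = np.random.default_rng(1).integers(0, 2, 200)
score = race + np.random.default_rng(2).normal(0.0, 1.0, 200)  # synthetic association
stat, p = permutation_test(race, score)
print(f"dCor = {stat:.3f}, permutation p-value = {p:.3f}")
```

A small observed dCor with a large p-value is consistent with independence between the attribute and the score; a large dCor with a small p-value flags a potential fairness violation worth conditioning and probing further.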
“…However, the community and the authorities have also raised concern that these automatically learned decisions may inherit the historical bias and discrimination from the training data and would cause serious ethical problems when used in practice (Nature Editorial, 2016; Angwin and Larson, 2016; Dwoskin, 2015; Executive Office of the President et al., 2016).…”
Section: Introduction
confidence: 99%