2016
DOI: 10.1111/j.1740-9713.2016.00960.x

To Predict and Serve?

Abstract: Predictive policing systems are used increasingly by law enforcement to try to prevent crime before it occurs. But what happens when these systems are trained using biased data? Kristian Lum and William Isaac consider the evidence – and the social consequences.

Cited by 400 publications (323 citation statements) · References 13 publications
“…Second, the chosen target variable might be measured less accurately for certain groups. For example, arrests are often used as a proxy for crime in applications of machine learning to policing and criminal justice, even though arrests are a racially biased representation of the true incidence of crime [28]. In treating arrests as a reliable proxy for crime, the model learns to replicate the biased labels in its predictions.…”
Section: Introduction
confidence: 99%
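
A minimal Python sketch of the mechanism this statement describes, using entirely made-up numbers: two groups with identical true offense rates but different arrest rates, and a model trained on the arrest labels. The group setup, the rates, and the use of scikit-learn's LogisticRegression are illustrative assumptions, not the cited paper's method.

# Minimal sketch (not the authors' code): training on arrests as a proxy for crime
# can reproduce the bias in the arrest labels. All numbers below are made up.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 20_000

# Hypothetical setup: two groups (0 and 1) with the SAME true offense rate,
# but group 1 is policed more heavily, so its offenses lead to arrest more often.
group = rng.integers(0, 2, size=n)
true_offense = rng.random(n) < 0.10                 # identical base rate for both groups
arrest_prob = np.where(group == 1, 0.60, 0.20)      # assumed enforcement disparity
arrested = true_offense & (rng.random(n) < arrest_prob)

# Train on the biased proxy label (arrest), with group membership standing in
# for correlated attributes such as neighbourhood.
X = group.reshape(-1, 1).astype(float)
model = LogisticRegression().fit(X, arrested)

for g in (0, 1):
    pred = model.predict_proba([[float(g)]])[0, 1]
    print(f"group {g}: true offense rate = 0.10, predicted 'crime' risk = {pred:.3f}")

The fitted model assigns roughly three times the risk to the more heavily policed group even though the underlying offense rates were set equal, replicating the biased labels rather than the true incidence of crime.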
“…Training data often reflect societal biases and are not representative of the population (sample bias). Moreover, systemic bias can lead to the generation of biased data, which in turn produces biased models that further reinforce discriminatory policies, as in predictive policing [4].…”
Section: Introduction
confidence: 99%
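
The reinforcement dynamic mentioned here can be illustrated with a short, self-contained simulation; the two-area setup, equal true crime rates, and winner-take-all patrol rule below are illustrative assumptions, not any deployed system's logic.

# Minimal sketch of a predictive-policing feedback loop (assumptions labelled inline).
import numpy as np

rng = np.random.default_rng(1)
true_rate = np.array([0.05, 0.05])   # identical true crime rates in both areas (assumed)
counts = np.array([11.0, 10.0])      # small, arbitrary initial disparity in recorded incidents
population = 10_000                  # potential incidents per area per day (made up)

for day in range(50):
    # Caricature of a predictive allocation rule: send all patrols to the area
    # with the most recorded incidents so far.
    target = int(np.argmax(counts))
    # Incidents are only recorded where officers are present to observe them.
    counts[target] += rng.binomial(population, true_rate[target])

print("share of recorded incidents per area:", np.round(counts / counts.sum(), 3))

The area with the tiny initial lead absorbs all patrols and nearly all recorded crime, while offending in the other area goes unobserved: recorded data and allocation decisions reinforce each other.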
“…An experiment conducted by the Human Rights Data Analysis Group shows what this might look like. Applying a simulation of PredPol's algorithm to drug offences in Oakland, California, the study demonstrated that the algorithm would send officers mostly to African American and Latino neighbourhoods, despite evidence from the US 2011 National Survey on Drug Use and Health that drug use is dispersed evenly across all of Oakland's neighbourhoods [9].…”
Section: The Future of Prediction
confidence: 98%
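
The contrast the Oakland simulation draws can be sketched with synthetic numbers (these are not the study's data): neighbourhood-level drug use that is roughly flat, set against historical drug arrests concentrated in a few neighbourhoods, and a naive rule that allocates patrols in proportion to past arrests.

# Illustrative sketch only, with invented figures: where an arrest-trained rule would
# send officers versus where survey-style estimates say drug use actually occurs.
import numpy as np

rng = np.random.default_rng(2)
n_hoods = 10

# Assumed inputs: near-uniform estimated drug-use rates, but arrests concentrated
# in two neighbourhoods.
survey_use_rate = np.full(n_hoods, 0.08) + rng.normal(0, 0.005, n_hoods)
past_arrests = np.array([120, 110, 15, 10, 12, 9, 14, 11, 8, 13], dtype=float)

# Naive "predictive" rule: allocate patrols in proportion to historical arrests.
patrol_share = past_arrests / past_arrests.sum()

for i in range(n_hoods):
    print(f"hood {i}: patrol share {patrol_share[i]:5.2f}  "
          f"survey drug-use rate {survey_use_rate[i]:.3f}")

Patrols pile up in the two neighbourhoods with high historical arrest counts even though estimated use is essentially flat across all of them, which is the qualitative pattern the Oakland simulation reported.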