2019
DOI: 10.3390/socsci8100281

Algorithmic Justice in Child Protection: Statistical Fairness, Social Justice and the Implications for Practice

Abstract: Algorithmic tools are increasingly used in child protection decision-making. Fairness considerations of algorithmic tools usually focus on statistical fairness, but there are broader justice implications relating to the data used to construct source databases, and how algorithms are incorporated into complex sociotechnical decision-making contexts. This article explores how data that inform child protection algorithms are produced and relates this production to both traditional notions of statistical fairness …

Cited by 45 publications (38 citation statements)
References 65 publications (92 reference statements)
“…The first (hypothetical) client wishes to develop a child abuse screening tool similar to that of the real cases extensively studied and reported on [11,14,15,21,25,36]. This complex case intersects heavily with applications in high-risk scenarios with dire consequences.…”
Section: SMACTR: An Internal Audit Framework
confidence: 99%
“…For instance, Binns (2019: 19) notes that human decision-makers exercise discretion in the use of algorithms according to their own convictions and commitments. Keddell (2019) finds that child protection professionals in New Zealand have reservations regarding working with predictive algorithms they are unable to explain to families, while at the same time they are being held accountable for decisions based on them. This can lead to what Elish (2019) calls 'moral crumple zones', where human actors bear the brunt of malfunctions in automated decision-making procedures they have little or no control over.…”
Section: Counter-Indications: Frontline Work
confidence: 99%
“…For complex decision contexts, such as child protection, where many interrelated factors need to be considered, predictive risk algorithms – including those that can condense vast amounts of information into a single risk score for decision support purposes – promise several advantages. The ability of these algorithms to process large amounts of data in short periods of time, their consistency in variable selection procedures and their adaptability to changing relationships in the data are appealing characteristics that are emphasised by proponents of PRMs (Cuccaro‐Alamin et al, 2017; Chouldechova et al, 2018; Keddell, 2019; Munro, 2019). Proponents of PRMs also assert that they can be used to mitigate bias in human decisions and to improve the accuracy of child protection workers’ decision‐making processes (Chouldechova et al, 2018).…”
Section: Introduction
confidence: 99%
“…A general limitation of their application in the child welfare context is that PRMs require high‐quality administrative data to provide accurate predictions and often rely on known instances of child abuse or neglect, which do not accurately measure the incidence of child abuse or neglect in the population at large (Eubanks, 2018). PRMs in child welfare systems have been viewed as particularly challenging due to the limitations of available data and the historical, political, legislative and cultural contexts in which child protection systems are embedded (Keddell, 2019; Munro, 2019; Saxena et al, 2020). These entrenched complexities within child protection systems are likely to present dangers for the application of PRMs as data collected by child protection systems are often measured with error, and subject to bias resulting from historic and systemic discrimination of marginalised groups and ethnic minorities (Munro, 2019), including Indigenous families.…”
Section: Introduction
confidence: 99%