2017
DOI: 10.1016/j.childyouth.2017.06.027
Risk assessment and decision making in child protective services: Predictive risk modeling in context

Cited by 98 publications (51 citation statements)
References 77 publications
“…In highly value-charged areas where accuracy costs lives, such as child welfare and abuse, it is common to hear calls after a scandal that a tragedy 'could have been prevented', or that the 'information needed to stop this was there'. Increased accuracy and the avoidance of human bias, rather than just the scalability and cost-efficiency of automation, is cited as a major driver for the development of machine learning models in high-stakes spaces such as these (Cuccaro-Alamin et al 2017).…”
Section: Augmentation Systems
confidence: 99%
“…The first (hypothetical) client wishes to develop a child abuse screening tool similar to that of the real cases extensively studied and reported on [11,14,15,21,25,36]. This complex case intersects heavily with applications in high-risk scenarios with dire consequences.…”
Section: SMACTR: An Internal Audit Framework
confidence: 99%
“…There is no single type of use, single type of algorithm, uniform type of data, nor single end user impacted by the use of algorithmic risk prediction tools in child protection. In terms of type of use, algorithmic tools can be used to distribute preventive family support services, in child protection screening decision making, or in risk terrain profiling to predict spatially where child abuse reports might occur (Cuccaro-Alamin et al 2017; Daley et al 2016; van der Put et al 2017). The type of algorithm selected categorises data in algorithm-specific ways to generate graded recommendations or binary flags, and can include decision trees or regression methods amongst others, with varying levels of transparency or opacity.…”
Section: Setting the Scene: Algorithms in Context
confidence: 99%
“…On the one hand, some argue that predictive tools can contribute to the prevention of child abuse and neglect by efficiently predicting future service contact, substantiation, or placement through the triage of large linked datasets, drawing on more data than a human could rapidly and accurately appraise, and selecting predictor variables based on predictive power in real time (Cuccaro-Alamin et al 2017). Particularly at system intake, when human decision-makers have limited information and time (particularly poor conditions for optimum decision-making), algorithms can quickly compute risks of future system contact (Cuccaro-Alamin et al 2017). On the other hand, issues relating to class and ethnic biases in the data used, other sources of variability in the decisions used as data, data privacy implications, the issue of false positives, limited service user consultation, and the lack of transparency of algorithmic processes are cited as serious challenges to the use of algorithmic tools in child protection, particularly where the recipients of services experience high levels of social inequality, marginalisation, and lack of power in the state-family relationship (Keddell 2014, 2015a, 2016; Munro 2019; Eubanks 2017; Dencik et al 2018).…”
Section: Setting the Scene: Algorithms in Context
confidence: 99%
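The false-positive concern raised in the statement above can be illustrated with a back-of-the-envelope calculation: when the outcome being predicted is rare, even a reasonably accurate tool flags more families incorrectly than correctly. The prevalence, sensitivity, and specificity values here are illustrative assumptions only:

```python
# Illustrative assumptions, not measured values from any real tool.
prevalence = 0.05    # 5% of screened cases have the outcome
sensitivity = 0.80   # true positive rate of the tool
specificity = 0.90   # true negative rate of the tool
n = 10_000           # cases screened

true_pos = n * prevalence * sensitivity              # correctly flagged
false_pos = n * (1 - prevalence) * (1 - specificity) # wrongly flagged
ppv = true_pos / (true_pos + false_pos)              # share of flags that are correct
```

Under these assumptions roughly 950 of the 1,350 flagged cases are false positives, so fewer than a third of flags are correct, which is why false positives carry such weight in high-stakes screening contexts.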