CHI Conference on Human Factors in Computing Systems Extended Abstracts 2022
DOI: 10.1145/3491101.3519771
How to Train a (Bad) Algorithmic Caseworker: A Quantitative Deconstruction of Risk Assessments in Child Welfare

Cited by 15 publications (16 citation statements)
References 30 publications
“…There are practice models... that help [workers] explore some very concrete, specific questions that help force them not to just make decisions based on their own hunch." However, other participants said these low-tech tools had built-in biases (see [116]) and workers frequently manipulate them (against their training) to produce any desired output (P1, P5, P6, P7, P9, P15, P16). Some said diagnostic checklists could be used better if workers were better trained and held accountable to follow the training (P5, P10, P14).…”
Section: No-Tech and Low-Tech Alternatives to PRMs
confidence: 99%
“…Studies have found that value conflicts arise when the logics embedded within the government's digital platforms do not align with street-level bureaucrats' discretion when they try enacting the same shared values in practice [37,61,66,126]. Research within public administration, science and technology studies, and human-computer interaction has recently drawn attention to how the digitization of public services is leading to distinct changes in street-level bureaucrats' discretion [21], power asymmetries between public officials and citizens [106,107,114], the need for re-skilling of public officials [8,61,66], and actor transparency in government decision-making [48]. Lindgren et al. [73] go a step further and argue that "public officials can no longer be understood as merely human".…”
Section: Algorithmic Decision-Making in the Public Sector
confidence: 99%
“…They predict "decreased bargaining and discretionary power of governmental workers" as one of the early warning signs of this phenomenon. These changes in professional work practices through the adoption of digital tools have similar, yet more serious, implications for the adoption of algorithmic tools, which further shift discretion away from public service workers; instances of this are already being captured by recent studies in the public sector [66,105].…”
Section: Algorithmic Decision-Making in the Public Sector
confidence: 99%