2021
DOI: 10.48550/arxiv.2108.04134
Preprint

Fairness in Algorithmic Profiling: A German Case Study

Christoph Kern,
Ruben L. Bach,
Hannah Mautner
et al.

Abstract: Algorithmic profiling is increasingly used in the public sector as a means to allocate limited public resources effectively and objectively. One example is the prediction-based statistical profiling of job seekers to guide the allocation of support measures by public employment services. However, empirical evaluations of potential side-effects such as unintended discrimination and fairness concerns are rare. In this study, we compare and evaluate statistical models for predicting job seekers' risk of becoming …

Cited by 7 publications (12 citation statements)
References 28 publications
“…Currently, the dominant data collection methods are administrative data and questionnaires (Desiere et al. , 2019, p. 12), although advances in machine learning increase the availability of big data (Kern et al. , 2021, p. 2; de Troya et al.…”
Section: Results (citation type: mentioning; confidence: 99%)
“…7–14) used the Aequitas toolkit to audit bias and discrimination, finding that the particularly developed algorithm “fails the bias audit”, which suggests researchers “should aim to mitigate bias before implementing” real-world profiling models (p. 14). Similarly, Kern et al. (2021) called for “rigorous auditing processes” before implementation, highlighting how “different classification policies have very different fairness implications” (p. 1).…”
Section: Results (citation type: mentioning; confidence: 99%)