2021
DOI: 10.1016/j.patter.2021.100205
Algorithmic injustice: a relational ethics approach

Abstract: It has become trivial to point out that algorithmic systems increasingly pervade the social sphere. Improved efficiency—the hallmark of these systems—drives their mass integration into day-to-day life. However, as a robust body of research in the area of algorithmic injustice shows, algorithmic systems, especially when used to sort and predict social outcomes, are not only inadequate but also perpetuate harm. In particular, a persistent and recurrent trend within the literature indicates tha…

Cited by 232 publications (157 citation statements)
References 37 publications
“…This requires asking questions such as "Why are we finding these clusters and similarities?" and investigating that further, instead of using the patterns that we find to build predictive models (Birhane, 2021). As a more radical way forward, McQuillan (2020) proposes a "non-fascist AI."…”
Section: On Creativity
confidence: 95%
“…Through the application of predictive systems in the social sphere, historically and socially unjust norms, stereotypes, and practices are reinforced. A robust body of research on algorithmic injustice (Benjamin, 2019; Birhane, 2021; Eubanks, 2018) shows that predictive systems perpetuate societal and historical injustice. In a landmark study, Buolamwini and Gebru (2018) evaluated gender classification systems used by commercial industries.…”
Section: Imposed Determinability In Unequal Measures
confidence: 99%
“…Safiya Noble, in her book "Algorithms of Oppression: How Search Engines Reinforce Racism" (2018), shows how contemporary racism and sexism manifest in Google. Noble demonstrated that the programs that determine search results, often perceived as objective and devoid of moral values (Birhane & Cummins, 2019), actually reproduce and strengthen a racist and…”
Section: Racism and Sexism In Contemporary Algorithms
confidence: 99%
“…Rather, we interrogate what happens when machine learning ''gets it right'' by acting as expected but doing so while reinforcing and exacerbating social and economic inequity. 12 Despite recent debates within the machine learning community showing how resistant some quarters are to acknowledging the social, cultural, and economic dimensions of the field (against the ample evidence from others), 13 the question of how to properly bound which aspects of sociotechnical systems machine learning practitioners have agency upon and responsibility for has become vitally important.…”
confidence: 99%