2021
DOI: 10.14763/2021.4.1600
Naming something collective does not make it so: algorithmic discrimination and access to justice

Abstract: The article problematises the ability of procedural law to address and correct algorithmic discrimination. It argues that algorithmic discrimination is a collective phenomenon, and therefore legal protection thereof needs to be collective. Legal procedures are technologies and design objects that embed values that can affect their usability to perform the task they are built for. Drawing from science and technology studies (STS) and feminist critique on law, the article argues that procedural law fails to addr…

Cited by 7 publications (4 citation statements)
References 39 publications
“…235, 236, 237, 247). This approach represents the goal of ex-ante oversight (Hakkarainen, 2021). It was codified in law by the GDPR, through obligations to implement DPbD (Arts.…”
Section: R&D Phase Matters For Human Rights Compliance
confidence: 99%
“…It must be acknowledged that the rights-based approach, which the HRBA represents, is not the only means to mitigate harms associated with smart cities, nor the only foundation for developing imaginaries of urban futures. HRBAs can be criticised, for example, for neglecting collective dimensions and structural inequalities (Karppinen & Puukko, 2020; Yeung, 2019; Smuha, 2021a), atmospheric impacts of compliant smart city technologies (Galič & Gellert, 2021), ethical and societal implications of technology (Mantelero, 2018), being too western, narrow or abstract (Smuha, 2021b), shortcomings of available remedies (Hacker, 2018; Hakkarainen, 2021; Kosta, 2022), and misaligned typologies of harm with respect to risks posed by AI (Teo, 2022). Smart city technologies can also be governed by relying on alternative normative standards, such as welfare and democracy (Karppinen & Puukko, 2020), justice-based approaches (Karppinen & Puukko, 2020; Taylor, 2017) and consumer privacy governance in US law (Jones, 2017; Guay & Birch, 2022; Solove Khan, 2019), along with governance of data as a right of speech in the US (Balkin, 2015) and the capabilities-based approach (Sen, 1993; Nussbaum, 1997; Alexander, 2004).…”
Section: Human Rights-based Approach In Cities
confidence: 99%
“…For example, traditional redress mechanisms are not effortlessly suited to provide legal protection in novel types of conflicts, such as algorithmic discrimination. Furthermore, it is difficult if not impossible to translate fairness and justice, as they are defined by law, into algorithmic systems (see e.g., Koivisto, 2020; Koulu, 2020b; Hakkarainen, 2021; Wachter et al., 2021). Moreover, the growing reliance on technology can also amplify the digital divide: for people with no access to, or knowledge of, the digital environment, it becomes harder than before to partake in processes leading to important decisions concerning them (see e.g., Rabinovich-Einy and Katsh, 2017; Wing, 2018; Toohey et al., 2019).…”
Section: Algorithmisation In the Context Of Technological And Legal S...
confidence: 99%