2021
DOI: 10.53637/wcgg2401
Addressing Disconnection: Automated Decision-Making, Administrative Law and Regulatory Reform

Abstract: Automation is transforming how government agencies make decisions. This article analyses three distinctive features of automated decision-making that are difficult to reconcile with key doctrines of administrative law developed for a human-centric decision-making context. First, the complex, multi-faceted decision-making requirements arising from statutory interpretation and administrative law principles raise questions about the feasibility of designing automated systems to cohere with these expectations. Sec…


Cited by 4 publications (3 citation statements)
References 13 publications
“…In Spain there is a doctrinal majority (for the moment) that does not admit the use of fully automated decisions (without human intervention) when discretionary powers are exercised. Other legal systems do not admit it at all (Moreno Rebato, 2021; Perry, 2017; Huggins, 2021).…”
Section: Automate (Or Not) the Discretion To Act
mentioning confidence: 99%
“…The CJEU seems to implicitly recognize the risk that historically discriminatory policing practices, such as ethnic profiling, can be reflected in databases and consequently influence the assessment of PNR data against such databases (Rosenfeld and Richardson, 2019; EDRI, 2022). This can be especially problematic if these databases are also mined in order to contribute to formulation of predictive criteria that may reflect discriminatory policing practices (Huggins, 2021, p. 1065; Gerards and Brouwer, 2022; Thönnes, 2022). Additionally, where databases are made interoperable, as is increasingly the case in the EU, the chance of having access to and relying on biased and discriminatory data within one or more of these databases increases (European Union Agency for Fundamental Rights, 2017, p. 44; Sooriyakumaran and Jegan, 2020, p. 3-5; Statewatch, 2022).…”
Section: Analysis Of the Cjeu's Risk Assessment And Prescribed Safegu...
mentioning confidence: 99%
“…Research has demonstrated that humans often overly rely on the output of algorithmic systems due to cognitive laziness, insufficient skills to challenge the output, and perceptions of superiority or infallibility of algorithms (Huggins, 2021, p. 1067; Alon-Barkat and Busuioc, 2023, p. 155). Additionally, it has been demonstrated in the literature that detrimental outcomes may also emerge in situations where enforcement agents selectively deviate from or follow the output of algorithmic systems according to their own biases and discriminatory views (Green, 2022, p. 7-8; Alon-Barkat and Busuioc, 2023, p. 155-156; Thönnes, 2023).…”
Section: Analysis Of the Cjeu's Risk Assessment And Prescribed Safegu...
mentioning confidence: 99%