2018
DOI: 10.31228/osf.io/gvczj
Preprint

The ghost in the legal machine: algorithmic governmentality, economy, and the practice of law

Abstract: Purpose: This paper investigates algorithmic governmentality, as proposed by Antoinette Rouvroy, specifically in relation to law. It seeks to show how algorithmic profiling can be particularly attractive for those in legal practice, given restraints on time and resources. It deviates from Rouvroy in two ways. Firstly, it argues that algorithmic governmentality does not contrast with neoliberal modes of government, in that it allows indirect rule through economic calculations. Secondly, that critique of such sy…

Cited by 6 publications (7 citation statements)
References 1 publication (1 reference statement)
“…A process of "datafication" is occurring whereby the data collected on people becomes a legitimate way to access, understand and control people (Mayer-Schoeberger and Cukier 2013; Van Dijck 2014). Without delving too deeply, as one of us has argued in a separate study, this environment has mainly been generated by a combination of the nature of digital technologies themselves -where data is often a by-product of their main function, such as HTTP cookies -quasi-legal instruments such as privacy policies, and economic incentives to put data to use such as for targeted marketing (Harkens, 2017). In other words, data production has come to be expected and is 'normal'.…”
Section: The Possibility Of An 'Intelligent Court'mentioning
confidence: 99%
“…A process of "datafication" is occurring whereby the data collected on people becomes a legitimate way to access, understand and control people (Mayer-Schoeberger and Cukier 2013; Van Dijck 2014). Without delving too deeply, as one of us has argued in a separate study, this environment has mainly been generated by a combination of the nature of digital technologies themselves -where data is often a by-product of their main function, such as HTTP cookies -quasi-legal instruments such as privacy policies, and economic incentives to put data to use such as for targeted marketing (Harkens, 2017). In other words, data production has come to be expected and is 'normal'.…”
Section: The Possibility Of An 'Intelligent Court'mentioning
confidence: 99%
“…What is the autonomy of algorithms, if any? It is the accountability and the responsibility of algorithms as socio-technical artifacts that is examined, that of their creators and users, and ultimately, of the balance of power facilitated or caused by algorithms. At the center of this approach is the insight that there are a number of human influences embedded into algorithms, such as criteria choices, training data, semantics, and interpretation (Diakopoulos, p. 9; Harkens, p. 25; Iliadis, p. 3). Any investigation must therefore consider algorithms as objects of human creation and take into account intent, including that of any group or institutional processes that may have influenced their design.…”
Section: Algorithmic Accountability (mentioning, confidence: 99%)
“…The second is that the sample used in the training set is a good representation of the individuals constituting that population. However, “these assumptions are susceptible to error and bias, although that is precisely what they are intended to negate” (Harkens, p. 22). Some subset of these attributes will be appropriate to the decision context while others may not; an example is the inclusion of attributes that are proxies for race in the algorithms used in the COMPAS system, which “wrongly labeled defendants as ‘future criminals’ when they did not commit a crime at twice the rate for black defendants as white defendants” (Martin, p. 4).…”
Section: Algorithmic Discrimination (mentioning, confidence: 99%)
“…The life of law is not logic, but experience, the felt necessities of the time and the prevalent moral and political theories, the intuitions of public policy, or even the “prejudices that judges share with their fellow-men” [109]. Law, as a social process, and legal practice, as a productive power of creativity that is used to satisfy needs, produce social norms [110], imposing purpose over texts, data, or tradition [111].…”
Section: And How Could Libraries Equalize This Access and Bridge the … (mentioning, confidence: 99%)