2021
DOI: 10.1177/20563051211062919
Configuring Fakes: Digitized Bodies, the Politics of Evidence, and Agency

Abstract: This comparative case study analysis used more than 200 examples of audiovisual manipulation collected from 2016 to 2021 to understand manipulated audiovisual and visual content produced by artificial intelligence, machine learning, and unsophisticated methods. This article includes a chart that categorizes the methods used to produce and disseminate audiovisual content featuring false personation as well as the harms that result. The article and the findings therein answer questions surrounding the broad issu…

Cited by 3 publications (5 citation statements); References: 40 publications.
“…In the case of Deep Nostalgia, there are clearly different motivations for using the technology – from MyHeritage’s marketing and extractivism to users’ mnemonic purposes and connection. It is critically important then that we ‘consider the attendant ethical and policy questions’ that our uses of such techniques raise (Paris, 2021:1, see also Hepp et al, 2022), and in the case of Deep Nostalgia, that we do so in relation to social networks also.…”
Section: Remembering the Dead in an Algorithmic Present
confidence: 99%
“…A famous example is a decelerated clip of US politician Nancy Pelosi. It was shared on Donald Trump’s Twitter account with the aim to make her appear drunkenly slurring her speech (Paris and Donovan, 2019).…”
Section: Defining and Categorising Visual Disinformation
confidence: 99%
“…Deepfakes operate on a high level of technological sophistication (see Table 1), as they make use of artificial intelligence to fake someone’s entire audio-visual representation. If both video footage and someone’s voice are artificially generated, Paris and Donovan (2019) speak of a virtual performance , which can be classified as the richest form of visual disinformation (Sundar, 2008). However, there are also examples of deepfake photographs generated with face-swap apps (e.g.…”
Section: Defining and Categorising Visual Disinformation
confidence: 99%
“…False and misleading information and those spreading it are posed as adversaries to be punished or defeated, implicit with the unstated assumption it is a desirable and necessary return to “civil society” where informational expertise and gatekeeping are unquestioned (Bratich, 2020). This drive towards the “war of restoration” obfuscates that mis- and disinformation has been present in propaganda and modes of manufacturing consent for centuries, and that everyone's interests have never been appropriately reflected in “civil society” (Bratich, 2020; Paris, 2021).…”
Section: Introduction
confidence: 99%