2020
DOI: 10.5210/spir.v2020i0.11132

‘Coordinated Inauthentic Behaviour’ and Other Online Influence Operations in Social Media Spaces

Abstract: Recently, major social media platforms such as Facebook and Twitter have announced efforts to counter "coordinated inauthentic behaviour." However, scholarly research continues to provide evidence that coordinated human and automated accounts covertly seek to undermine and manipulate public debates on these platforms. Given the difficulties in obtaining data from these platforms to study these influence operations, and the significant challenge of identifying covert malinformation operations, further conceptua…

Cited by 9 publications (4 citation statements) | References 9 publications
“…Although tweets from unknown accounts dominated the Twitter dataset, these tweets tended to generate fewer likes and retweets, on average, than those shared by individuals and groups. We note that this amplification pattern is characteristic of so-called spam accounts – which ‘may garner many shares because they produce an abundant amount of low-quality posts that each happen to get a little amplification’ (Gallagher et al, 2021) – or what has been described as ‘coordinated inauthentic behaviour’ (Keller et al, 2020). Indeed, many of these accounts had abnormal and non-human usernames suggesting inauthentic accounts (Inuwa-Dutse et al, 2018).…”
Section: Results
confidence: 97%
“…Nevertheless, many suspicious accounts posted under the #australiafire and #bushfireaustralia hashtags. A study found that bot and troll accounts were involved in an "information disorder" campaign exaggerating the role of arson in Australia's wildfires (Keller et al, 2020). These accounts carried out activity similar to past "information disorder" campaigns, such as the coordinated behaviour of Russian trolls during the 2016 US presidential election (Chappel, 2020, January 10; Daume et al, 2023).…”
Section: Examples Of Extreme Events and Disasters A) Social Media Pla...
confidence: 99%
“…Increasingly, brands are turning digital participation into a core site for their corporate action, engaging far more directly with consumers online than they ever would offline and imbuing their digital avatars with aspects of personality. At the same time, the digital is also full of inorganic users: bots, automated processes and coordinated inauthentic behaviour (see Keller et al, 2020). In so far as the idea of citizenship has never been premised on equal and universal participation of all individuals, the proliferation of inorganic users in the digital public sphere challenges the notion of digital citizenship as a flat, cohesive structure.…”
Section: From Analogue To Digital Citizenship
confidence: 99%