2021
DOI: 10.14763/2021.2.1565

Recommender systems and the amplification of extremist content

Abstract: Policymakers have recently expressed concerns over recommendation algorithms and their role in forming "filter bubbles". This concern is particularly pressing in the context of extremist content online: these algorithms may promote extremist content at the expense of more moderate voices. In this article, we make two contributions to this debate. Firstly, we provide a novel empirical analysis of three platforms' recommendation systems when interacting with far-right content. We find that one pla…


Cited by 37 publications (20 citation statements) · References 40 publications
“…In this context, extreme or novel views, opinions and sentiments are favoured and amplified (Whittaker et al, 2021). For example, one study found that fake news was more novel than real news and spread ‘farther, faster, deeper and more broadly than the truth’ (Vosoughi et al, 2018).…”
Section: Methods
confidence: 99%
“…Content is the expressed opinions of other agents ranked by the recommender system for each agent, based on the model's predicted engagement for a given piece of content. Agents decide whether to engage with the content based on two well-documented biases, namely the bias to engage with similar content (similarity bias or homophily bias) (Mäs and Flache 2013; Dandekar et al 2013) and the bias to engage with extreme content and confident opinions (Penrod and Cutler 1995; Price and Stone 2004; Hegselmann and Krause 2015; Edelson et al 2021; Whittaker et al 2021). An engagement function is defined as:…”
Section: Agents
confidence: 99%
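The engagement function itself is truncated in the excerpt above and is not reproduced by the cited paper here. A minimal illustrative sketch of how the two described biases (homophily and extremity) might be combined is given below; the function name, the opinion scale, the weights, and the squashing step are all hypothetical, not the model actually defined in that paper:

```python
import math

def engagement_probability(agent_opinion: float,
                           content_opinion: float,
                           similarity_weight: float = 0.5,
                           extremity_weight: float = 0.5) -> float:
    """Hypothetical engagement function combining the two quoted biases.

    Opinions are scalars in [-1, 1]. This is an illustrative sketch only,
    not the function defined in the cited agent-based model.
    """
    # Similarity (homophily) bias: closer opinions -> higher engagement.
    similarity = 1.0 - abs(agent_opinion - content_opinion) / 2.0
    # Extremity bias: content nearer the poles -> higher engagement.
    extremity = abs(content_opinion)
    score = similarity_weight * similarity + extremity_weight * extremity
    # Squash the combined score to (0, 1) as an engagement probability.
    return 1.0 / (1.0 + math.exp(-4.0 * (score - 0.5)))
```

Under this sketch, an agent is more likely to engage both with content close to its own opinion and with content that is more extreme, which is the qualitative behaviour the excerpt attributes to the model.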
“…Empirical evidence shows that misinformation and extreme online views tend to be more engaging than accurate or moderate content (Edelson et al 2021). Recommender systems seem critical in promoting extreme content over moderate ones (Whittaker et al 2021). Partisan content tends to remain confined in insulated clusters of users, thus reducing the opportunity to encounter cross-cutting content (Bakshy et al 2015).…”
confidence: 99%
“…In quarantine, people turned to their screens for information and entertainment. Google, YouTube, and social media algorithms funneled users toward more extremist content (Whittaker et al, 2021). People fell down conspiracy-theory rabbit holes looking for answers.…”
Section: Pandemic Separation and Political Distortion
confidence: 99%