2020
DOI: 10.1080/1369118x.2020.1803946

What they do in the shadows: examining the far-right networks on Telegram

Abstract: The present paper contributes to research on the activities of far-right actors on social media by examining the interconnections between far-right actors and groups on the Telegram platform using network analysis. The far-right network observed on Telegram is highly decentralized, similarly to the far-right networks found on other social media platforms. The network is divided mostly along ideological and national lines, with the communities related to the 4chan imageboard and Donald Trump's supporters being t…
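As a rough illustration of the kind of network analysis the abstract describes (not the authors' actual pipeline; the channel names, edge list, and metrics shown are invented for this sketch), a minimal Python example using networkx might look like this:

```python
# Hypothetical sketch: community detection on a directed graph of
# Telegram channels, where an edge A -> B means channel A forwarded
# or linked content from channel B. All channel names are invented.
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

edges = [
    ("channel_a", "channel_b"),
    ("channel_b", "channel_c"),
    ("channel_d", "channel_a"),
    ("channel_e", "channel_f"),
    ("channel_f", "channel_e"),
]

G = nx.DiGraph(edges)

# One simple way to quantify how (de)centralized the network is:
# Freeman-style degree centralization, where values near 0 indicate
# the absence of a single dominant hub.
n = G.number_of_nodes()
degrees = [d for _, d in G.degree()]
max_deg = max(degrees)
centralization = sum(max_deg - d for d in degrees) / ((n - 1) * (n - 2)) if n > 2 else 0.0
print(f"Degree centralization: {centralization:.3f}")

# Community detection on the undirected projection, to surface clusters
# that might map onto ideological or national groupings.
communities = greedy_modularity_communities(G.to_undirected())
for i, community in enumerate(communities):
    print(f"Community {i}: {sorted(community)}")
```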







Cited by 132 publications (125 citation statements)
References 35 publications
“…To gain a better understanding of how malicious content spreads, we begin by creating a map of the network of online hate communities across six social media platforms. We include actively moderated mainstream platforms—Facebook, VKontakte, and Instagram—that have and enforce (to varying degrees) policies against hate speech, as well as the less-moderated platforms Gab [26], Telegram [27], and 4Chan [28]. We focus on the distinction between actively moderated and less-moderated platforms while acknowledging that they also vary in other important ways that are outside the scope of this paper: for example, platforms also vary in terms of whether or not posted content is removed after a certain length of time and whether or not posts are identified as linked to specific user accounts.…”
Section: Design and Results (citation type: mentioning)
confidence: 99%
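A hypothetical sketch of the cross-platform mapping idea described in this excerpt (not the cited study's code; the node names, edges, and moderation flags are invented) could be expressed with networkx as follows:

```python
# Hypothetical sketch: nodes are online communities tagged with their
# platform, edges are hyperlinks between them, and each platform carries
# a moderation flag. All data below is invented for illustration.
import networkx as nx

MODERATED = {"facebook": True, "vkontakte": True, "instagram": True,
             "gab": False, "telegram": False, "4chan": False}

G = nx.DiGraph()
G.add_node("fb_group_1", platform="facebook")
G.add_node("tg_channel_1", platform="telegram")
G.add_node("gab_group_1", platform="gab")
G.add_edge("fb_group_1", "tg_channel_1")   # moderated -> less-moderated link
G.add_edge("tg_channel_1", "gab_group_1")  # link between less-moderated platforms

# Count links that cross the moderated / less-moderated boundary,
# one simple way to inspect how content could flow between the two tiers.
crossing = sum(
    1 for u, v in G.edges()
    if MODERATED[G.nodes[u]["platform"]] != MODERATED[G.nodes[v]["platform"]]
)
print(f"Edges crossing the moderation boundary: {crossing}")
```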
“…Furthermore, there is a huge research gap in the usage of social media tools other than Twitter and Facebook by populist actors, such as Telegram, WhatsApp, and Instagram. In particular, Telegram and WhatsApp seem to be interesting cases, as it has already been shown that those tools were extensively (mis)used by right-wing actors (Davis & Straubhaar, 2020; Urman & Katz, 2020).…”
Section: Discussion (citation type: mentioning)
confidence: 99%
“…Hate speech is not restricted to a single social media platform or online group and the hate ecosystem is multi-dimensional, and these wider effects should be taken into account (Johnson et al., 2019). This is especially important as mainstream social media platforms enforce more stringent content moderation policies and users migrate to smaller platforms where exposure patterns to hate speech may be more concentrated (Urman & Katz, 2020). Additionally, in this work we also do not distinguish group membership of those posting hate speech, and therefore there is a chance that some of the hate is directed from out-group members at the platform or in-group itself.…”
Section: Limitations and Future Research (citation type: mentioning)
confidence: 99%