2017
DOI: 10.1177/2056305117718468
“Information Warfare” and Online News Commenting: Analyzing Forces of Social Influence Through Location-Based Commenting User Typology

Abstract: While most of the online participation research assumes the Western notion of fulfilling deliberative practices, online contexts have been also found to be an active battleground for so-called information warfare. To test the potential for online comments being used by cross-national political opponents, we analyze the case of online comments on the most active Lithuanian online news portal’s Russian-language edition. This news portal presents itself as a unique case, since Russia was found to engage in a supp…

Cited by 15 publications (14 citation statements) | References 51 publications (105 reference statements)
“…The aim of these "participators" was to influence the Western public and (potentially) the journalists, according to the Russian state goals, in other words, basically a form of covert political propaganda. However, the Internet Research Agency is certainly not the only notable case; the list of similar examples is long (Erjavec & Kovačič, 2012; Zelenkauskaite & Balduccini, 2017) as is the list of presumed actors and groups (Weedon, Nuland, & Stamos, 2017), ranging from state propagandists and political extremists to religious groups and conspiracy theorists all over the globe (see also …). Misinformation and propaganda can also take the form of hate campaigns that attack specific groups or individuals that symbolize these groups….”
Section: Dark Participation: Concept and Systematization (mentioning, confidence: 99%)
“…And what's worse, there are even presumed cases of strategic manipulation attempts of community sections by foreign states and related actors (Elliott, 2014). Some observers even regard this as a new information war happening in the guise of user participation (Erjavec & Kovačič, 2012; Zelenkauskaite & Balduccini, 2017). As a result, many news media restricted user participation or even gave up their comment sections altogether.…”
Mentioning (confidence: 99%)
“…A central strategic instrument of online astroturfing is the manufacturing of user comments designed to appear as authentic citizen voices on highly visible news or social networking sites (SNS). We focus here on this specific form of online astroturfing because it has been one of the most widely debated in the context of national elections across the Western world (Ferrara, 2017; Kovic et al, 2018; Zelenkauskaite and Balduccini, 2017). Examples of targeted campaigns include the 2016 presidential election in the United States (Bessi and Ferrara, 2016; Woolley and Guilbeault, 2017), the 2017 presidential election in France (Ferrara, 2017), and the 2012 presidential elections in South Korea (Keller et al, 2019).…”
Mentioning (confidence: 99%)
“…Examples of targeted campaigns include the 2016 presidential election in the United States (Bessi and Ferrara, 2016; Woolley and Guilbeault, 2017), the 2017 presidential election in France (Ferrara, 2017), and the 2012 presidential elections in South Korea (Keller et al, 2019). As a key sponsor of these astroturfing activities, various authors have pointed to Russia’s ruling elites (see, for instance, Bugorkova, 2015; Zelenkauskaite and Balduccini, 2017), who are closely tied to an organization known as the Internet Research Agency (IRA) or Russia’s “troll factory” (Lysenko and Brooks, 2018; Ruck et al, 2019). In 2013, this entity employed approximately 600 people with an estimated annual budget of US$ 10 million (Bugorkova, 2015).…”
Mentioning (confidence: 99%)
“…Another common use of geospatial data is for identifying specific actors and tracking connections between them. Such tasks are particularly common for studies in political communication and/or disinformation online: for instance, Zelenkauskaite and Balduccini (2017) used geospatial data to specify the origins of users commenting on Russian language news portals in Lithuania, whereas Helmus et al (2018) employed geoweb to track the identities of users involved in Russian propaganda and counter-propaganda efforts on Twitter. Disinformation, however, is not the only subject which can be investigated in this context as shown by Smirnov et al (2016), who used geospatial data for identifying friendship networks between youngsters on VK.…”
Section: Location Use (mentioning, confidence: 99%)