2018
DOI: 10.17645/mac.v6i4.1493
The Moral Gatekeeper? Moderation and Deletion of User-Generated Content in a Leading News Forum

Abstract: Participatory formats in online journalism offer increased options for user comments to reach a mass audience, also enabling the spreading of incivility. As a result, journalists feel the need to moderate offensive user comments in order to prevent the derailment of discussion threads. However, little is known about the principles on which forum moderation is based. The current study aims to fill this void by examining 673,361 user comments (including all incoming and rejected comments) of the largest newspape…


Cited by 83 publications (80 citation statements); references 1 publication.
“…While Tag24 Dresden and B.Z. are regional news outlets (with a Facebook fan base of 100,000 to 135,000), Spiegel Online (with close to 1.9 million Facebook fans) is one of the most widely read German-language news websites; therefore, it has a large number of comments and considerable content moderation practices in place (Boberg et al, 2018). To limit the impact that comment deletion by content moderators might have on the sample while simultaneously allowing for sufficient user engagement, all comments were retrieved within 24 hours after publication.…”
Section: Method, Empirical Material and Selection Criteria
confidence: 99%
“…Additionally, Completely Automated Public Turing tests to tell Computers and Humans Apart (CAPTCHA) can also be adopted for preventing automated bots from conducting nefarious activities (Sivakorn et al 2016). Apart from the requirements for adequate human, financial and time resources, the journalists' moderation decision is affected by newsroom routines, by the type of media organisation they work for, by societal institutions and the social system in which they operate, their personal experiences or gut feelings (Boberg et al 2018).…”
Section: Participatory Journalism
confidence: 99%
“…For instance, journalists often are influenced by the availability heuristic, which causes one to tend to favor information that is common or frequent, leading toward a preference for prominent forms of content (Nisbett & Ross, 1980; Shoemaker & Vos, 2009). Because of a consistent lack of clear policies and procedures for handling comments at the organizational level, individual choices of moderators have become more critical for establishing the expectations for forums (Boberg et al, 2018).…”
Section: Gatekeeping
confidence: 99%
“…News organizations lacked established protocols to guide moderation, extending the literature that online newsrooms provide more flexible production hierarchies (Deuze, 2008) with more journalist autonomy (Boberg et al, 2018). This ranged from an absence of written expectations (respondents Scott and Brady) to loose guidelines for journalist-moderators to follow (Karen, Heidi and Griffon).…”
Section: Organizational Infrastructure for Moderation
confidence: 99%