2014
DOI: 10.1007/978-3-319-07632-4_13
Moderation Techniques for Social Media Content

Abstract: Social media are perhaps the most popular services in cyberspace today. Their main characteristic is that they allow every internet user to add content and thus contribute to participatory journalism. This content, however, must be checked for quality and in order to avoid legal issues, which can be accomplished with the help of moderation. The problem is that moderation is a complex process that in many cases requires substantial human resources. This p…

Cited by 30 publications
(17 citation statements)
References 14 publications
“…However, it is omitted that producer and consumer practices in the context of AI technologies may in themselves contradict sustainability goals. Issues such as lithium mining, e-waste, the one-way use of rare earth minerals, energy consumption, low-wage "clickworkers" creating labels for data sets or doing content moderation are of relevance here (Crawford and Joler 2018; Irani 2016; Veglis 2014; Fang 2019; Casilli 2017). Although "clickwork" is a necessary prerequisite for the application of methods of supervised machine learning, it is associated with numerous social problems (Silberman et al. 2018), such as low wages, work conditions and psychological work consequences, which tend to be ignored by the AI community.…”
Section: Omissions
confidence: 99%
“…The system first checks the format of the input text; if an error is detected, the system does not process the text further but prompts that the input format is incorrect and that the user needs to re-enter the text. Once the format check passes, the text content is pushed to the AI-based TCM system, which calls the API to moderate it. If the content check passes, an approval message is sent to the system administrator, who can conduct random content checks and publish the moderated text comments on the web end and app end [14]. If the text contains illegal content such as prohibited words, spam, or advertisements and fails moderation, it is classified as unapproved; the AI-based TCM system then identifies the violation type, alerts the user that the entered text contains illegal content, points out the violation type, and records the relevant information in the system log. For violations whose type cannot be determined by the system, the text is pushed to the system administrator for manual moderation.…”
Section: System Operation Process
confidence: 99%
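The workflow quoted above (format check, automated scan, rejection with a violation type, escalation to manual review when the type is unclear) can be sketched as follows. This is a minimal illustration, not the cited TCM system: the word lists, `Verdict` names, and length limit are all hypothetical stand-ins for the AI moderation API the paper describes.

```python
from enum import Enum


class Verdict(Enum):
    APPROVED = "approved"
    REJECTED = "rejected"
    MANUAL_REVIEW = "manual_review"  # pushed to the administrator


# Hypothetical word lists standing in for the AI moderation API.
KNOWN_VIOLATIONS = {"spamword": "spam", "buynow": "advertisement"}
SUSPECT_WORDS = {"ambiguousterm"}  # violation type cannot be determined


def check_format(text: str) -> bool:
    """Format check: non-empty after trimming, within an assumed length limit."""
    return 0 < len(text.strip()) <= 500


def moderate(text: str) -> tuple:
    """Return (verdict, violation_type); violation_type is None unless rejected."""
    if not check_format(text):
        # Do not process further; prompt the user to re-enter the text.
        raise ValueError("input format is incorrect; please re-enter the text")
    for word in text.lower().split():
        if word in KNOWN_VIOLATIONS:
            # Violation type known: alert the user and record it in the log.
            return Verdict.REJECTED, KNOWN_VIOLATIONS[word]
        if word in SUSPECT_WORDS:
            # Violation type undetermined: escalate to manual moderation.
            return Verdict.MANUAL_REVIEW, None
    # Passed: administrator may still spot-check before the comment is shown.
    return Verdict.APPROVED, None
```

A real deployment would replace the word lists with a call to a hosted moderation service and persist the rejection log, but the control flow (reject with a named violation type, escalate only the undecidable cases) follows the process the citing paper describes.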
“…In many situations, this time spent on the work of reviewing might be wasteful and inefficient if it means that moderators have less time to spend contributing to the community in other ways. Furthermore, waiting for an extended amount of time for contributions to be explicitly approved creates "a lack of instant gratification on the part of the participant" which may reduce individuals' desire to contribute again [77].…”
Section: Moderation Design Decisions
confidence: 99%