DOI: 10.3990/1.9789036537391

Experts and machines united against cyberbullying

Cited by 10 publications (10 citation statements) · References 105 publications
“…As shown by scholars such as Cowie (2013) and Price and Dalgleish (2010), the negative effects of cyberbullying include lower self-esteem, worse academic achievement, and feelings of sadness, anger, fear, and depression; in extreme cases, cyberbullying can lead to self-harm and suicidal thoughts. As a response to these threats, automated cyberbullying detection has received increased interest, resulting in several detection systems (Dinakar et al. 2012; Dadvar 2014; Van Hee et al. 2015b; Chen, Mckeever, and Delany 2017) and in sociological studies investigating the desirability of online monitoring tools (Tucker 2010; Van Royen, Poels, and Vandebosch 2016). In fact, social media users, and teenagers in particular, highly value their privacy and autonomy on social media platforms and underline that priorities must be set for the detection of harmful content (Van Royen et al. 2016).…”
Section: Related Research
confidence: 99%
“…Although some studies have investigated rule-based approaches (Reynolds, Kontostathis, and Edwards 2011), the dominant approach to cyberbullying detection involves machine learning, mostly based on supervised (Dinakar, Reichart, and Lieberman 2011; Dadvar 2014) or semi-supervised learning (Nahar et al. 2014). The former constructs a classifier from labeled training data, whereas semi-supervised approaches build classifiers from a small set of labeled and a large set of unlabeled instances.…”
Section: Automated Detection and Analysis of Cyberbullying
confidence: 99%
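The supervised/semi-supervised distinction drawn in this citation can be made concrete with a small sketch. The snippet below is illustrative only and not the cited authors' systems: it assumes scikit-learn, uses a TF-IDF plus logistic-regression pipeline, and follows the library's convention of marking unlabeled instances with -1 for self-training.

```python
# Illustrative sketch (assumed setup, not the cited systems): supervised vs.
# semi-supervised text classification with scikit-learn.
# Label convention: 1 = bullying, 0 = not bullying, -1 = unlabeled.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.semi_supervised import SelfTrainingClassifier

texts = [
    "you are worthless, nobody likes you",  # labeled: bullying
    "great game last night, well played",   # labeled: not bullying
    "shut up or you will regret it",        # unlabeled
    "see you at practice tomorrow",         # unlabeled
]
labels = [1, 0, -1, -1]

# Supervised: the classifier only ever sees the labeled instances.
labeled_texts = [t for t, y in zip(texts, labels) if y != -1]
labeled_y = [y for y in labels if y != -1]
supervised = make_pipeline(TfidfVectorizer(), LogisticRegression())
supervised.fit(labeled_texts, labeled_y)

# Semi-supervised: self-training starts from the labeled instances and
# iteratively pseudo-labels the unlabeled ones it is confident about.
semi_supervised = make_pipeline(
    TfidfVectorizer(),
    SelfTrainingClassifier(LogisticRegression(), threshold=0.8),
)
semi_supervised.fit(texts, labels)

print(supervised.predict(["nobody wants you here"]))
print(semi_supervised.predict(["nobody wants you here"]))
```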
“…[Garbled table fragment from the citing paper: binary-classification datasets drawn from YouTube (449 / 4,177 instances), Twitter (220 / 5,162 and 194 / 2,599), and Ask.fm (3,787 / 86,419), with reported scores of .640, .726, .719, and .465.]…”
Section: Binary Classification
confidence: 99%
“…We expand our work by re-implementing the models on a new dataset. For this purpose, we used a YouTube dataset that has been used extensively in cyberbullying studies [6], [15], [16]. The ultimate aim was to investigate the interoperability and performance of the reproduced models on new datasets: to see how adaptable they are to different social media platforms, and to what extent models trained on one dataset (i.e., one social network) can be transferred to another.…”
Section: Introduction
confidence: 99%
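The cross-platform question raised here (how well a model trained on one social network transfers to another) can be sketched as training on one annotated corpus and evaluating, unchanged, on another. The file paths, column names, and the TF-IDF plus linear-SVM pipeline below are assumptions made for illustration, not the setup of the cited study.

```python
# Illustrative sketch of cross-platform transfer evaluation.
# The CSV paths and 'text'/'label' columns are hypothetical placeholders.
import pandas as pd
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics import classification_report
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

source = pd.read_csv("twitter_annotated.csv")   # hypothetical training platform
target = pd.read_csv("youtube_annotated.csv")   # hypothetical evaluation platform

# Train on the source platform only.
model = make_pipeline(TfidfVectorizer(min_df=2), LinearSVC())
model.fit(source["text"], source["label"])

# Evaluate the unchanged model on the target platform; the drop relative to
# in-domain performance indicates how well the model transfers.
print(classification_report(target["label"], model.predict(target["text"])))
```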