2018
DOI: 10.30853/filnauki.2018-12-1.16

Reader’s Comment in on-Line Magazine as a Genre of Internet Discourse (By the Material of the German and Russian Languages)

Abstract: The article describes the reader's comment in the online version of a magazine as a genre of Internet discourse (based on material from the German and Russian languages). Definitions are given for such concepts as reader's comment and Internet discourse, the position of the reader's comment within the virtual genre system is established, and its characteristic features are identified. The authors carry out a comparative analysis of reader's comments in German and Russian, revealing similarities and differences in the use of linguistic means …

Cited by 3 publications (3 citation statements)
References 1 publication
“…While a significant number of studies have examined toxic and aggressive behaviour in Russian-language social media sources [7], [33], [41], there is a limited number of research papers directly exploring the automatic classification of the toxicity of texts. Gordeev utilized Convolutional Neural Networks (CNNs) and a Random Forest Classifier (RFC) for detecting the state of aggression in English-language and Russian-language texts [17].…”
Section: Related Work
confidence: 99%
“…We used pre-trained Multilingual BERT-Base Cased, which supports 104 languages, including Russian, and ruBERT, each with 12 stacked Transformer blocks, a hidden size of 768, 12 self-attention heads, and 110M parameters in total. The fine-tuning stage was performed with the parameters recommended in the paper [43] and the official repository: 3 training epochs, warmup over 10% of the steps, a maximum sequence length of 128, a batch size of 32, and a learning rate of 5e-5.…”
Section: Bidirectional Encoder Representations From Transformers
confidence: 99%
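The hyperparameters quoted in that citation statement can be collected into a small configuration sketch. A minimal Python illustration follows; the model name, the dataset size, and the warmup-step calculation are assumptions for illustration, not details taken from the cited paper:

```python
from dataclasses import dataclass

@dataclass
class FinetuneConfig:
    """Hypothetical fine-tuning setup matching the quoted parameters."""
    model_name: str = "bert-base-multilingual-cased"  # assumed checkpoint name
    num_train_epochs: int = 3
    warmup_proportion: float = 0.10  # warmup over 10% of optimization steps
    max_seq_length: int = 128
    batch_size: int = 32
    learning_rate: float = 5e-5

    def total_steps(self, num_examples: int) -> int:
        # Steps per epoch = ceil(num_examples / batch_size)
        steps_per_epoch = -(-num_examples // self.batch_size)
        return steps_per_epoch * self.num_train_epochs

    def warmup_steps(self, num_examples: int) -> int:
        # 10% of total optimization steps, rounded down
        return int(self.total_steps(num_examples) * self.warmup_proportion)

cfg = FinetuneConfig()
# With a hypothetical training set of 10,000 examples:
print(cfg.total_steps(10_000))   # 313 steps/epoch * 3 epochs = 939
print(cfg.warmup_steps(10_000))  # 10% of 939 -> 93
```

In practice such a config would be passed to a trainer; computing warmup as a proportion of total steps (rather than a fixed count) keeps the schedule consistent across dataset sizes.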