2021
DOI: 10.3390/app11073210
Evaluation of the Coherence of Polish Texts Using Neural Network Models

Abstract: Coherence evaluation of texts falls into the category of natural language processing tasks. Evaluating a text's coherence means estimating its semantic and logical integrity; this property can be exploited in multidisciplinary tasks (SEO analysis, medicine, detection of fake texts, etc.). In this paper, different state-of-the-art coherence evaluation methods based on machine learning models have been analyzed. The investigation of the effectiveness of different m…

Cited by 3 publications (6 citation statements)
References 15 publications
“…The unique long-term memory properties of LSTM and GRU neural networks made them widely popular in a large variety of machine learning tasks. Example applications of the LSTM architecture are: data classification [ 22 ], speech recognition [ 23 , 24 ], handwriting recognition [ 25 ], speech synthesis [ 26 ], text coherence tests [ 27 ], biometric authentication and anomaly detection [ 28 ], detecting deception from gaze and speech [ 29 ] and anomaly detection [ 30 ]. Similarly, example applications of the GRU structure are: facial expression recognition [ 31 ], human activity recognition [ 32 ], cyberbullying detection [ 33 ], defect detection [ 34 ], human activity surveillance [ 35 ], automated classification of cognitive workload tasks [ 36 ] and speaker identification [ 37 ].…”
Section: Introduction
confidence: 99%
“…The first one is based on a Semantic Similarity Graph (SSG); the second one is based on Long Short-Term Memory (LSTM); and the third one is based on BERT. The results show that the neural-network-based methods offer better accuracy than the SSG-based method; and within the neural networks, although the LSTM-based method shows better accuracy than the BERT-based method, it is emphasized that the latter can improve this metric further with additional fine-tuning [4].…”
Section: Related Work
confidence: 95%
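The SSG-based approach is not specified in detail in this report; as a rough illustration of the idea behind similarity-based coherence scoring, a drastically simplified score can be computed as the mean cosine similarity between adjacent sentence embeddings. The function name and inputs below are hypothetical, not taken from the cited paper:

```python
import numpy as np

def coherence_score(sentence_embeddings):
    """Mean cosine similarity between adjacent sentence embeddings.

    A simplified stand-in for a semantic-similarity-graph score:
    semantically consistent documents yield values near 1.0,
    disjointed ones near 0.0.
    """
    E = np.asarray(sentence_embeddings, dtype=float)
    # Normalise each embedding to unit length (guard against zero vectors).
    norms = np.linalg.norm(E, axis=1, keepdims=True)
    E = E / np.clip(norms, 1e-12, None)
    # Cosine similarity of each sentence with its successor.
    sims = np.sum(E[:-1] * E[1:], axis=1)
    return float(sims.mean())
```

In a real system the embeddings would come from a sentence encoder (e.g. a BERT-family model), and the full SSG method builds a graph over all sentence pairs rather than only adjacent ones.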
“…Coherence also implies the type of informational and semantic connectivity that a text possesses [3]. A text is considered coherent if it is semantically consistent and cognitively integral [4]; a coherent document is therefore easier to understand than an incoherent one. Coherence is especially important in scientific papers, which must communicate information effectively to reviewers and researchers.…”
Section: Introduction
confidence: 99%
“…Currently, artificial intelligence is an attractive research topic. Classes of networks such as multilayer perceptron networks (MLPs), convolutional neural networks (CNNs), and recurrent neural networks (RNNs) provide much flexibility and have proven over decades to be useful and reliable in various areas [34][35][36][37][38][39][40][41][42]. RNNs were designed for sequence prediction problems; they are traditionally difficult to train and are not appropriate for tabular datasets.…”
Section: Introduction
confidence: 99%
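As a sketch of why RNNs suit sequence prediction: the hidden state is threaded through the sequence, so each step can depend on everything seen before it. A minimal vanilla (Elman) RNN forward pass, with hypothetical names and assuming NumPy (LSTM and GRU cells add gating on top of this basic recurrence):

```python
import numpy as np

def rnn_step(x_t, h_prev, W_xh, W_hh, b_h):
    """One step of a vanilla RNN: mix the current input with the
    previous hidden state and squash through tanh."""
    return np.tanh(x_t @ W_xh + h_prev @ W_hh + b_h)

def rnn_forward(xs, W_xh, W_hh, b_h):
    """Run the recurrence over a whole sequence; the final hidden
    state summarises the sequence seen so far."""
    h = np.zeros(W_hh.shape[0])
    for x_t in xs:
        h = rnn_step(x_t, h, W_xh, W_hh, b_h)
    return h
```

The gating mechanisms of LSTM and GRU exist precisely because this plain recurrence suffers from vanishing gradients over long sequences, which is the training difficulty the paragraph above alludes to.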