2020
DOI: 10.3390/info11110511

Survey of Neural Text Representation Models

Abstract: In natural language processing, text needs to be transformed into a machine-readable representation before it can be processed. The quality of downstream natural language processing tasks depends greatly on the quality of those representations. In this survey, we systematize and analyze 50 neural models from the last decade. The models described are grouped by neural network architecture as shallow, recurrent, recursive, convolutional, and attention models. Furthermore, we categorize these models by representa…

Cited by 30 publications (23 citation statements)
References 56 publications
“…The deep language models successfully overcome this issue by replacing static embeddings with contextualized representations. Hence, they enable learning of contextual and task-independent representations, which has yielded improved performance on various NLP tasks [47,48]. In recent years, there have been attempts to use BERT-like models for the task of sentiment analysis of social network messages.…”
Section: BERT-based Language Models
confidence: 99%
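
As a rough illustration of the contrast drawn above between static and contextualized embeddings, the sketch below extracts token vectors from a BERT-like model using the Hugging Face transformers library. The checkpoint name and the probe word are illustrative assumptions, not details taken from the cited papers.

# A minimal sketch, assuming the Hugging Face transformers library and an
# illustrative checkpoint; not the exact setup used by any cited paper.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
model.eval()

# The same surface form "bank" appears in two different contexts.
sentences = ["She sat on the river bank.", "He opened an account at the bank."]

bank_id = tokenizer.convert_tokens_to_ids("bank")
with torch.no_grad():
    for text in sentences:
        inputs = tokenizer(text, return_tensors="pt")
        hidden = model(**inputs).last_hidden_state  # shape: (1, seq_len, 768)
        position = inputs.input_ids[0].tolist().index(bank_id)
        vector = hidden[0, position]                # contextualized vector for "bank"
        print(f"{text!r} -> first dims: {vector[:4].tolist()}")

Running this prints two different vectors for the same word "bank", one per context, which is exactly what a static embedding table, with its single vector per word type, cannot provide.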
“…The categorization, shown in Fig 1, is organized into two classes as follows: (a) the methods proposed for the general domain; and (b) the methods proposed for the biomedical domain. For a more detailed presentation of the methods categorized herein, we refer the reader to several surveys on ontology-based semantic similarity measures [43, 45], word embeddings [35, 45], sentence embeddings [34, 53], and neural language models [34, 54].…”
Section: Methods on Sentence Semantic Similarity
confidence: 99%
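
To make the sentence-similarity setting concrete, here is a minimal sketch using the sentence-transformers library; the checkpoint name and the example sentences are assumptions for illustration and are not drawn from the cited survey.

# A minimal sketch of sentence-embedding similarity, assuming the
# sentence-transformers library and an illustrative checkpoint.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")

sentences = [
    "The patient received an antibiotic.",
    "The doctor prescribed medication for the infection.",
    "The stock market closed higher today.",
]

# Encode all sentences once, then compare every pair by cosine similarity.
embeddings = model.encode(sentences, convert_to_tensor=True)
scores = util.cos_sim(embeddings, embeddings)

for i in range(len(sentences)):
    for j in range(i + 1, len(sentences)):
        print(f"{scores[i][j].item():.2f}  {sentences[i]!r} vs {sentences[j]!r}")

Ontology-based measures, the other class mentioned above, would instead score a pair through concept relations in a knowledge source; the embedding route sketched here needs no ontology but inherits the characteristics of its training corpus.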
“…Similarly, ref. [4] provided a systematization of many language representation models by representation level, input level, model type, model architecture, and model supervision.…”
Section: Related Work
confidence: 99%