2018
DOI: 10.1007/s10115-018-1314-7
Incorporating word embeddings into topic modeling of short text

Cited by 58 publications (30 citation statements)
References 26 publications
“…Therefore, researchers have improved this model. Gao et al. [15] introduced word embedding information and conditional random fields into the topic model to model short-text topics. Lu et al. [16] introduced recurrent neural networks to identify the relationships between words, building on the BTM model.…”
Section: A. The Public Opinion Topic Identification Methods Based On
mentioning confidence: 99%
“…They considered only contextual word co-occurrence relationships based on the skip-gram model, resulting in latent topics represented by bi-gram distributions and per-topic word embeddings, which significantly increased the size of topic representations and the computation cost. Gao et al. [36] designed a Conditional Random Field regularized Topic Model (CRFTM), which assigns semantically related words to the same topic with higher probability. Word embeddings serve as the similarity measure to quantify the semantic correlations among words.…”
Section: Latent Topic Models Utilizing Word Embeddings
mentioning confidence: 99%
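The statement above notes that word embeddings serve as the similarity measure for semantic correlation between words. A common choice for this is cosine similarity between embedding vectors. The following is a minimal sketch of that idea; the three-dimensional vectors are hypothetical toy values chosen for illustration, not output of any real embedding model (which typically uses 100+ dimensions).

```python
import math

def cosine_similarity(u, v):
    # Cosine of the angle between two embedding vectors:
    # dot(u, v) / (||u|| * ||v||)
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Toy 3-dimensional "embeddings" (hypothetical values for illustration).
embeddings = {
    "apple":  [0.9, 0.1, 0.2],
    "orange": [0.8, 0.2, 0.3],
    "car":    [0.1, 0.9, 0.7],
}

# Semantically related words get a score near 1; unrelated words score lower.
print(cosine_similarity(embeddings["apple"], embeddings["orange"]))  # close to 1
print(cosine_similarity(embeddings["apple"], embeddings["car"]))     # much lower
```

A regularized topic model can use such pairwise scores to encourage semantically related words toward the same topic.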
“…Topic modeling approaches working on short texts from social media platforms suffer from data sparsity, noisy words, and word sense disambiguation problems. Gao et al. [26] addressed word sense disambiguation by utilizing the local and global semantic correlations provided by a word embedding model. A conditional random field is used in the inference phase for short-text modeling.…”
Section: Related Work
mentioning confidence: 99%