2020
DOI: 10.1007/978-981-15-1420-3_89
Improving Topic Coherence Using Parsimonious Language Model and Latent Semantic Indexing

Cited by 5 publications (3 citation statements)
References 11 publications
“…When the number of topics is high, the generated model tends to overfit, so one must rely on more than just perplexity to judge a model. We combine this with changes in topic coherence to determine the number of topics (Dewangan et al 2020).…”
Section: Methods (mentioning, confidence: 99%)
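The statement above pairs perplexity with topic coherence when choosing the number of topics. A minimal sketch of one common coherence measure (UMass), computed over an invented toy corpus, illustrates the idea; the word lists and documents here are hypothetical, not from the cited paper:

```python
import math

# Toy corpus: each document is a set of tokens (invented for illustration).
docs = [
    {"topic", "model", "word"},
    {"topic", "model", "text"},
    {"topic", "word", "text"},
    {"music", "guitar", "song"},
]

def doc_freq(w):
    # Number of documents containing word w.
    return sum(1 for d in docs if w in d)

def co_freq(w1, w2):
    # Number of documents containing both w1 and w2.
    return sum(1 for d in docs if w1 in d and w2 in d)

def umass_coherence(top_words):
    """UMass coherence: sum of log((D(w_i, w_j) + 1) / D(w_j)) over ordered pairs."""
    score = 0.0
    for i in range(1, len(top_words)):
        for j in range(i):
            score += math.log(
                (co_freq(top_words[i], top_words[j]) + 1)
                / doc_freq(top_words[j])
            )
    return score

# Top words of a coherent topic co-occur often; a mixed word list scores lower.
coherent = umass_coherence(["topic", "model", "word"])
mixed = umass_coherence(["topic", "guitar", "word"])
```

In model selection, such a coherence score would be tracked alongside perplexity across candidate topic counts, and the topic number chosen where coherence stays high rather than where perplexity alone is minimized.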
“…The analysis used machine learning and text mining techniques such as word clouds [34,35], mapping latent variables with network connections [36,37], bag of words [38,39], topic modeling with Latent Semantic Indexing (LSI) [39,40], emotion recognition (profiled by Plutchik emotions) [41,42] and geomapping [43].…”
Section: Methods (mentioning, confidence: 99%)
“…LSI uses a singular value decomposition (SVD) of a large term-document matrix to identify a linear subspace in which the relationships between terms and documents are captured. LSI was then adopted by various studies, such as Dewangan et al [23]. However, LSI is limited in that the technique does not assign probabilities to topics.…”
Section: Related Work (mentioning, confidence: 99%)
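The SVD construction described in the statement above can be sketched in a few lines of NumPy. The term-document matrix and vocabulary below are invented for illustration and are not from the cited paper; the sketch only shows how a rank-k truncation places related documents close together in the latent subspace:

```python
import numpy as np

# Toy term-document count matrix (rows = terms, columns = documents d1..d4).
# Terms: car, auto, engine, flower, petal (invented vocabulary).
A = np.array([
    [2, 1, 0, 0],  # car
    [1, 2, 0, 0],  # auto
    [1, 1, 0, 0],  # engine
    [0, 0, 2, 1],  # flower
    [0, 0, 1, 2],  # petal
], dtype=float)

# SVD: A = U @ diag(s) @ Vt, with singular values sorted in descending order.
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Keep the top-k singular values to obtain a rank-k latent subspace (LSI).
k = 2
doc_vecs = (np.diag(s[:k]) @ Vt[:k]).T  # document vectors in latent space, shape (4, k)

def cos(a, b):
    # Cosine similarity between two latent document vectors.
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# d1 and d2 (both car-related) end up near-parallel in the latent space,
# while d1 and d3 (flower-related) stay near-orthogonal.
sim_related = cos(doc_vecs[0], doc_vecs[1])
sim_unrelated = cos(doc_vecs[0], doc_vecs[2])
```

The limitation noted in the quote is visible here: the latent coordinates are arbitrary real values, not probabilities, which is what probabilistic topic models such as LDA add on top of this decomposition.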