2020
DOI: 10.1111/exsy.12527

Enriched Latent Dirichlet Allocation for Sentiment Analysis

Abstract: One of the main benefits of unsupervised learning is that it requires no labelled data. As a method in this category, latent Dirichlet allocation (LDA) effectively estimates the semantic relations between the words of a text and, in combination with other parameters, can play an important role in solving various problems, including sentiment analysis. In this study, three novel topic models called date sentiment LDA (DSLDA), author–date sentiment LDA (ADSLDA), and pack–author–date sentiment LDA (PADSLDA) a…

Cited by 33 publications (19 citation statements)
References 62 publications (73 reference statements)
“…A smoothing count of 1 is included to avoid taking the logarithm of zero. In the present study, topic_coherency is computed through (18), equal to the average of topic_coherency values in Z. Furthermore, a higher value of topic_coherency reflects the better quality of the detected topics.…”
Section: Results
confidence: 99%
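The smoothing count described in the excerpt can be sketched as a UMass-style co-occurrence coherence. The toy corpus, top-word list, and function names below are illustrative assumptions, not the paper's implementation; only the "+1 inside the logarithm" smoothing and the averaging over the topic set Z are taken from the text.

```python
import math

# Hypothetical toy corpus: each document is its set of distinct words.
docs = [
    {"price", "market", "stock"},
    {"market", "trade", "stock"},
    {"price", "trade"},
]

def doc_freq(word):
    # number of documents containing `word`
    return sum(1 for d in docs if word in d)

def co_doc_freq(w1, w2):
    # number of documents containing both words
    return sum(1 for d in docs if w1 in d and w2 in d)

def topic_coherency(top_words):
    # UMass-style coherence over the topic's top words;
    # the smoothing count of 1 avoids log(0) when two words never co-occur
    score = 0.0
    for i in range(1, len(top_words)):
        for j in range(i):
            score += math.log(
                (co_doc_freq(top_words[i], top_words[j]) + 1)
                / doc_freq(top_words[j])
            )
    return score

def model_coherency(topics):
    # average of topic_coherency over the topic set Z, as in the excerpt
    return sum(topic_coherency(t) for t in topics) / len(topics)
```

With the toy corpus above, `topic_coherency(["market", "stock", "price"])` evaluates to log(3/2), since "stock" and "market" co-occur in two documents while the other pairs contribute log(2/2) = 0.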
“…This observation shows that modeling the parameters weight and window improves sentiment classification at the document level. According to (18), a lower topic_coherency value suggests that the retrieved subjects are of worse quality than those with a higher topic_coherency. The words in a subject accurately describe the subject and have a stronger association with one another.…”
Section: Evaluation Results According to the Different Number Of…
confidence: 99%
“…Neural network-based methods such as LSTM [6], BiLSTM [7] and BERT [8] use the attention mechanism to assign different semantic weights to words, with good experimental results in many downstream tasks. Semantic analysis based on the attention mechanism has been used in many works [9][10][11] and can reflect the different weights of words in different texts. The attention mechanism is introduced to obtain different weights for words in order to extract enough key information.…”
Section: Attention-based Semantic Analysis
confidence: 99%
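The per-word weighting the excerpt refers to can be sketched as plain dot-product attention; the hidden states, query vector, and dimensions below are toy assumptions, not taken from any of the cited models.

```python
import numpy as np

# Hypothetical hidden states, one 8-dim vector per word (e.g. from an LSTM).
rng = np.random.default_rng(0)
words = ["the", "movie", "was", "awful"]
H = rng.standard_normal((4, 8))   # hidden state per word
q = rng.standard_normal(8)        # hypothetical query / context vector

scores = H @ q                    # relevance of each word to the query
weights = np.exp(scores - scores.max())
weights /= weights.sum()          # softmax -> attention weights, sum to 1

context = weights @ H             # weighted sum of hidden states
```

Words with larger `weights` entries contribute more to `context`, which is the sense in which attention "assigns different semantic weights to words".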
“…Each comment was assigned manually to up to three topics most representative of its content. We preferred manually curated resources over automatically generated ones (e.g., using topic-modelling techniques such as latent Dirichlet allocation, LDA; Osmani, et al, 2020), because user-generated content, being informal, is prone to generating a lot of noise, and the resulting categories are difficult to control and require additional curation.…”
Section: Sentiment Analysis
confidence: 99%