2021
DOI: 10.1109/access.2021.3100299
Multi-Grained Attention Representation With ALBERT for Aspect-Level Sentiment Classification

Abstract: Aspect-level sentiment classification aims to judge the sentiment tendency of each aspect in a sentence that contains multiple aspects. Previous works mainly employed Long Short-Term Memory (LSTM) networks and attention mechanisms to fuse information between aspects and sentences, or adapted large pre-trained language models such as BERT to aspect-level sentiment classification tasks. The former methods either did not integrate the interactive information of related aspects and sentences, or ignored …
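
The abstract describes an ALBERT-based approach to aspect-level sentiment classification. As a rough orientation only, the Python sketch below shows a generic ALBERT sentence-aspect pair classifier built with the Hugging Face transformers library; the class name, label count, and checkpoint are assumptions, and it does not reproduce the paper's multi-grained attention representation.

```python
# Minimal sketch (assumed, not the paper's architecture): encode a
# (sentence, aspect) pair with ALBERT and classify its sentiment polarity.
import torch
import torch.nn as nn
from transformers import AlbertModel, AlbertTokenizer


class AlbertAspectClassifier(nn.Module):
    """ALBERT encoder followed by a linear head over the pooled [CLS] token.

    Illustrative baseline only; the paper's multi-grained attention
    representation is not reproduced here.
    """

    def __init__(self, num_labels: int = 3, model_name: str = "albert-base-v2"):
        super().__init__()
        self.encoder = AlbertModel.from_pretrained(model_name)
        self.classifier = nn.Linear(self.encoder.config.hidden_size, num_labels)

    def forward(self, input_ids, attention_mask, token_type_ids):
        outputs = self.encoder(
            input_ids=input_ids,
            attention_mask=attention_mask,
            token_type_ids=token_type_ids,
        )
        # Classify from the pooled representation of the sentence-aspect pair.
        return self.classifier(outputs.pooler_output)


tokenizer = AlbertTokenizer.from_pretrained("albert-base-v2")
model = AlbertAspectClassifier()

# The aspect term is passed as the second segment so that ALBERT's
# self-attention can relate it to the sentence context.
batch = tokenizer(
    "The food was great but the service was slow.",
    "service",
    return_tensors="pt",
    padding=True,
)
with torch.no_grad():
    logits = model(**batch)  # shape (1, 3): negative / neutral / positive
```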

Cited by 7 publications (1 citation statement)
References 24 publications
“…The proposed model is evaluated on a Windows 10 system with 8 GB of RAM and a 2 GB CUDA-enabled NVIDIA graphics card. To evaluate the model, accuracy and macro-F1 score are considered as evaluation metrics, and a comparative analysis is carried out with the existing BERT model [20] to prove the model's efficiency.…”
Section: DC-BERT Training (mentioning)
Confidence: 99%
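
The cited evaluation reports accuracy and macro-F1 as its metrics. As a hedged illustration (the labels and predictions below are hypothetical placeholders), both can be computed with scikit-learn as follows:

```python
# Sketch of the evaluation metrics named in the citation statement:
# accuracy and macro-F1 over 3-way sentiment labels.
from sklearn.metrics import accuracy_score, f1_score

y_true = [0, 1, 2, 2, 1, 0, 2]  # hypothetical gold labels (neg/neu/pos)
y_pred = [0, 1, 2, 1, 1, 0, 2]  # hypothetical model predictions

accuracy = accuracy_score(y_true, y_pred)
# Macro-F1 averages the per-class F1 scores, so minority sentiment
# classes weigh as much as the majority class.
macro_f1 = f1_score(y_true, y_pred, average="macro")
print(f"accuracy={accuracy:.4f}  macro-F1={macro_f1:.4f}")
```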