2022 14th International Conference on Intelligent Human-Machine Systems and Cybernetics (IHMSC)
DOI: 10.1109/ihmsc55436.2022.00016
Aspect-level sentiment analysis based on BERT fusion multi-attention

Cited by 4 publications (2 citation statements)
References 5 publications
“…In this way, CNN can be used to analyze the important features in the sentence and the aspect emotion in the input sentence. 11) ASBABM [16]: The model implements the interaction between aspect words and context through aspect attention and self-attention mechanisms, and a BiLSTM network then enhances the dependency between the sentence and the aspect words.…”
Section: Capsar [12]: This Model Injects Aspect Information Into Th… (mentioning)
confidence: 99%
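The quoted statement describes the ASBABM architecture only at a high level. Below is a minimal, hedged sketch (in PyTorch, with all class names, head counts, pooling, and dimensions assumed rather than taken from the paper) of how aspect attention, self-attention, and a BiLSTM could be stacked over BERT embeddings in the way described.

```python
# Hedged sketch of an ASBABM-style block: self-attention and aspect attention
# model the interaction between aspect words and context, and a BiLSTM then
# strengthens the dependency between the sentence and the aspect words.
# All names and sizes here are assumptions, not the authors' implementation.
import torch
import torch.nn as nn


class AspectAttentionBiLSTM(nn.Module):
    def __init__(self, hidden: int = 768):
        super().__init__()
        # self-attention over the context tokens
        self.self_attn = nn.MultiheadAttention(hidden, num_heads=8, batch_first=True)
        # aspect attention: the context attends to the aspect representation
        self.aspect_attn = nn.MultiheadAttention(hidden, num_heads=8, batch_first=True)
        # BiLSTM to reinforce sentence-aspect dependencies
        self.bilstm = nn.LSTM(hidden, hidden // 2, batch_first=True, bidirectional=True)
        self.classifier = nn.Linear(hidden, 3)  # negative / neutral / positive

    def forward(self, context: torch.Tensor, aspect: torch.Tensor) -> torch.Tensor:
        # context: (B, Lc, H) BERT embeddings of the sentence
        # aspect:  (B, La, H) BERT embeddings of the aspect term
        ctx, _ = self.self_attn(context, context, context)  # self-attention
        ctx, _ = self.aspect_attn(ctx, aspect, aspect)      # aspect attention
        out, _ = self.bilstm(ctx)                           # BiLSTM encoding
        pooled = out.mean(dim=1)                            # average pooling (placeholder)
        return self.classifier(pooled)


# toy usage with random tensors standing in for BERT outputs
if __name__ == "__main__":
    model = AspectAttentionBiLSTM()
    logits = model(torch.randn(2, 20, 768), torch.randn(2, 3, 768))
    print(logits.shape)  # torch.Size([2, 3])
```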
“…CNN over BERT-GCN [15] proposed combining BERT with BiLSTM to take into account the useful information hidden in the context, and exploiting a GCN model with multiple convolutional layers to capture context features. ASBABM [16] proposed an aspect-level sentiment analysis model based on BERT and multi-level attention mechanisms. The model captures the interaction and correlation between the aspect word and the entire text through both the aspect attention mechanism and the self-attention mechanism.…”
Section: Introduction (mentioning)
confidence: 99%
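For the BERT-GCN line cited above, the following is a rough sketch (again with assumed names, a placeholder adjacency matrix, and an arbitrary layer count, not the authors' implementation) of BiLSTM-refined BERT features passed through stacked graph convolutions to capture context features.

```python
# Hedged sketch of the BERT-GCN idea quoted above: BiLSTM-refined BERT
# features are propagated through several graph convolutional layers over a
# token adjacency matrix. All names and dimensions are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class GraphConv(nn.Module):
    """One graph convolution: ReLU(A_hat @ X @ W)."""
    def __init__(self, dim: int):
        super().__init__()
        self.linear = nn.Linear(dim, dim)

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # adj: (B, L, L) row-normalized adjacency (e.g. from a dependency parse)
        return F.relu(torch.bmm(adj, self.linear(x)))


class BiLSTMGCN(nn.Module):
    def __init__(self, hidden: int = 768, num_gcn_layers: int = 2):
        super().__init__()
        self.bilstm = nn.LSTM(hidden, hidden // 2, batch_first=True, bidirectional=True)
        self.gcn_layers = nn.ModuleList([GraphConv(hidden) for _ in range(num_gcn_layers)])
        self.classifier = nn.Linear(hidden, 3)

    def forward(self, bert_out: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        x, _ = self.bilstm(bert_out)           # contextual refinement of BERT features
        for gcn in self.gcn_layers:
            x = gcn(x, adj)                    # stacked graph convolutions
        return self.classifier(x.mean(dim=1))  # pooled sentence representation


# toy usage: 2 sentences of 20 tokens, identity adjacency as a placeholder
if __name__ == "__main__":
    model = BiLSTMGCN()
    adj = torch.eye(20).unsqueeze(0).repeat(2, 1, 1)
    print(model(torch.randn(2, 20, 768), adj).shape)  # torch.Size([2, 3])
```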