2020
DOI: 10.1109/access.2020.2997675
Enhancing Aspect-Based Sentiment Analysis With Capsule Network

Abstract: Existing feature-based neural approaches for aspect-based sentiment analysis (ABSA) try to improve their performance with pre-trained word embeddings and by modeling the relations between the text sequence and the aspect (or category), thus depending heavily on the quality of word embeddings and task-specific architectures. Although recent pre-trained language models, i.e., BERT and XLNet, have achieved state-of-the-art performance on a variety of natural language processing (NLP) tasks, they still subje…


Cited by 29 publications (7 citation statements)
References 24 publications
“…Techniques such as layer normalization [261], residual connections [262], and attention mechanisms inherent in the transformer architecture further enhance BERT's performance. XLNet [263] is a generalized autoregressive [264] pretraining method that surpasses the limitations of traditional left-to-right or right-to-left language modeling. XLNet is trained using a permutation-based approach that differs from traditional autoregressive models [265].…”
Section: B. Training of LLMs
confidence: 99%
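The permutation-based training that this citation statement refers to can be illustrated with a toy sketch. This is not XLNet's actual implementation (the function and variable names here are hypothetical); it only shows the core idea: sample a random factorization order and let each position condition on the positions that precede it in that order rather than in left-to-right text order.

```python
import numpy as np

def permutation_lm_targets(seq_len, rng):
    # Sample one factorization order; each position is then predicted
    # conditioned only on positions that precede it in this permuted order,
    # unlike a fixed left-to-right autoregressive factorization.
    order = rng.permutation(seq_len)
    pairs = []
    for step, target in enumerate(order):
        mask = np.zeros(seq_len, dtype=bool)
        mask[order[:step]] = True   # tokens visible as context for `target`
        pairs.append((int(target), mask))
    return pairs
```

Across many sampled orders, every position eventually sees every other position as context, which is how a permutation objective can capture bidirectional dependencies while remaining autoregressive.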
“…The experimental results on six classification tasks indicate the effectiveness of the model, and the algorithm helps to reduce interference among tasks. Su et al. [8] combined XLNet and a capsule network to address the challenge of aspect-based sentiment analysis (ABSA); a capsule network with a dynamic routing algorithm was utilized to extract the local and spatial hierarchical relations of the text sequence and yield its local feature representations. Lin et al. [9] first employed conventional source-target attention to produce a timestep-specific source-side context vector, and then fed the vector into a novel dynamic context-guided capsule network (DCCN) for multimodal machine translation (MMT), which achieved superior results on the Multi30K dataset.…”
Section: Capsule Neural Network for NLP
confidence: 99%
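The dynamic routing algorithm these citing works build on can be summarized in a short NumPy sketch of routing-by-agreement between capsule layers. This is a generic illustration under standard capsule-network assumptions, not the exact ABSA or DCCN model described above:

```python
import numpy as np

def squash(s, axis=-1, eps=1e-8):
    # Squashing nonlinearity: shrinks a vector's norm into [0, 1)
    # while preserving its direction.
    norm_sq = np.sum(s ** 2, axis=axis, keepdims=True)
    return (norm_sq / (1.0 + norm_sq)) * s / np.sqrt(norm_sq + eps)

def dynamic_routing(u_hat, num_iters=3):
    # u_hat: prediction vectors from lower capsules,
    #        shape (num_lower, num_upper, dim).
    num_lower, num_upper, dim = u_hat.shape
    b = np.zeros((num_lower, num_upper))   # routing logits
    for _ in range(num_iters):
        # Coupling coefficients: softmax over upper capsules, so each
        # lower capsule distributes its output across the upper layer.
        c = np.exp(b) / np.exp(b).sum(axis=1, keepdims=True)
        s = (c[..., None] * u_hat).sum(axis=0)   # weighted sum per upper capsule
        v = squash(s)                            # upper-capsule outputs
        b = b + (u_hat * v[None]).sum(axis=-1)   # agreement update
    return v
```

Each iteration increases the coupling between a lower capsule and the upper capsules that agree with its prediction, which is what lets the network route local spatial features from lower layers toward the upper layers.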
“…Simple attention networks can generate noisy features as well. This problem can be resolved using capsule networks [31,32]. A capsule network can dynamically route spatial features from lower layers to upper layers.…”
Section: Recent Trends in Aspect-Level Sentiment Classification
confidence: 99%