2022
DOI: 10.1016/j.knosys.2022.108586
Aspect-level sentiment classification based on attention-BiLSTM model and transfer learning

Cited by 35 publications (4 citation statements)
References 18 publications
“…Model transfer is usually applied in multi-task learning: task data are distributed across multiple associated sub-tasks, and shared modules learn the associations between tasks to obtain additional useful information. As a specific application, Xu [20] proposed a pre-training + multi-task learning model. A shared BiLSTM module is first trained on the document-level dataset to obtain pre-training weights, which are kept as the initialization parameters of the shared part of the model; the aspect-level data are then fed into the pre-trained model to train the two tasks and eventually fine-tune these weights.…”
Section: B. Transfer Learning (mentioning)
confidence: 99%
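The pre-training-then-fine-tuning procedure described in this citation can be illustrated with a minimal PyTorch sketch. This is not the cited paper's implementation: the vocabulary size, embedding and hidden dimensions, number of classes, and the synthetic `doc_loader` batches are all assumptions, and the aspect-level head is omitted for brevity.

```python
import torch
import torch.nn as nn

class SharedBiLSTM(nn.Module):
    """Shared module: embedding + bidirectional LSTM (sizes are assumptions)."""
    def __init__(self, vocab_size=10000, emb_dim=300, hidden=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.bilstm = nn.LSTM(emb_dim, hidden, batch_first=True,
                              bidirectional=True)

    def forward(self, tokens):
        h, _ = self.bilstm(self.embed(tokens))   # (batch, seq, 2*hidden)
        return h

class DocHead(nn.Module):
    """Document-level sentiment head: mean-pool the sequence, then classify."""
    def __init__(self, hidden=128, num_classes=2):
        super().__init__()
        self.fc = nn.Linear(2 * hidden, num_classes)

    def forward(self, h):
        return self.fc(h.mean(dim=1))

# Synthetic batches standing in for a real document-level sentiment dataset.
doc_loader = [(torch.randint(0, 10000, (4, 20)), torch.randint(0, 2, (4,)))
              for _ in range(3)]

# 1) Pre-train the shared BiLSTM on document-level data.
encoder, doc_head = SharedBiLSTM(), DocHead()
opt = torch.optim.Adam(list(encoder.parameters()) + list(doc_head.parameters()))
loss_fn = nn.CrossEntropyLoss()
for tokens, labels in doc_loader:
    opt.zero_grad()
    loss = loss_fn(doc_head(encoder(tokens)), labels)
    loss.backward()
    opt.step()

# 2) Keep the pre-trained weights as the initialization of the shared part;
#    aspect-level training would then fine-tune them (aspect head not shown).
pretrained_state = encoder.state_dict()
aspect_encoder = SharedBiLSTM()
aspect_encoder.load_state_dict(pretrained_state)   # transfer the weights
```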
“…To apply document-level knowledge more flexibly, Chen [34] proposed a model based on transfer capsules. Unlike the training method in [20], Chen adopts a multi-task learning method over heterogeneous datasets [22]: the rich document-level datasets and the aspect-level datasets are fed into the model at the same time, and the shared parameters are optimized dynamically. Finally, in the upper layer of the model, semantic capsules and dynamic routing are combined with the acquired transfer knowledge.…”
Section: B. Transfer Learning (mentioning)
confidence: 99%
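The heterogeneous multi-task training described here can be sketched as a loop that feeds document-level and aspect-level batches through the same shared encoder in each step, so both losses update the shared parameters together. The capsule and dynamic-routing layers of [34] are omitted, and all tensor shapes, class counts, and the synthetic batches are assumptions rather than the cited configuration.

```python
import torch
import torch.nn as nn

# Shared encoder plus one head per task (dimensions are assumptions).
shared = nn.LSTM(300, 128, batch_first=True, bidirectional=True)
doc_head = nn.Linear(256, 2)       # document-level sentiment classes
aspect_head = nn.Linear(256, 3)    # aspect-level polarities
opt = torch.optim.Adam([*shared.parameters(), *doc_head.parameters(),
                        *aspect_head.parameters()])
loss_fn = nn.CrossEntropyLoss()

# Synthetic stand-ins for the two heterogeneous datasets.
doc_batches = [(torch.randn(4, 30, 300), torch.randint(0, 2, (4,)))] * 3
aspect_batches = [(torch.randn(4, 15, 300), torch.randint(0, 3, (4,)))] * 3

for (doc_x, doc_y), (asp_x, asp_y) in zip(doc_batches, aspect_batches):
    opt.zero_grad()
    doc_h, _ = shared(doc_x)
    asp_h, _ = shared(asp_x)
    # Both task losses flow into the shared parameters in the same update.
    loss = (loss_fn(doc_head(doc_h.mean(1)), doc_y)
            + loss_fn(aspect_head(asp_h.mean(1)), asp_y))
    loss.backward()
    opt.step()
```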
“…Guixian Xu et al. [13] propose an aspect-level sentiment classification model based on the Attention-Bidirectional Long Short-Term Memory (Attention-BiLSTM) model and transfer learning. The authors propose three models, Pre-training (PRET), Multi-task learning (MTL), and Pre-training + multi-task learning (PRET+MTL), which transfer the knowledge obtained from document-level sentiment classification training to aspect-level sentiment classification.…”
mentioning
confidence: 99%
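The relationship between the three transfer settings named in this citation can be summarized schematically. The stage functions below are hypothetical stand-ins for the training stages described above, not the authors' code.

```python
# Schematic only: each stage function is a hypothetical placeholder.
def pretrain_on_documents(model):
    print("stage: pre-train the shared encoder on document-level sentiment data")

def multitask_train(model):
    print("stage: jointly train document-level and aspect-level tasks")

def finetune_on_aspects(model):
    print("stage: fine-tune on aspect-level sentiment data")

def run_variant(variant, model=None):
    if variant == "PRET":             # pre-training only
        pretrain_on_documents(model)
        finetune_on_aspects(model)
    elif variant == "MTL":            # multi-task learning only
        multitask_train(model)
    elif variant == "PRET+MTL":       # pre-training followed by multi-task learning
        pretrain_on_documents(model)
        multitask_train(model)
    return model

for v in ("PRET", "MTL", "PRET+MTL"):
    run_variant(v)
```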