2019
DOI: 10.1007/s11063-019-10017-9
Multi-layer Attention Based CNN for Target-Dependent Sentiment Classification

Cited by 36 publications (21 citation statements)
References 34 publications
“…Yang et al [18] applied an attention mechanism to bidirectional LSTM in two models for target-dependent sentiment classification of a Twitter dataset and achieved improvement over baseline techniques. Zhang et al [19] also approached target-dependent sentiment classification using a multi-layer CNN with an attention mechanism that modeled context representation and achieved high accuracy on product reviews and Twitter data. Gan et al [20] designed a dilated CNN based on sparse attention to perform targeted sentiment analysis, which identified sentiment orientation and handled complex sequences.…”
Section: Literature Review
confidence: 99%
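As a concrete illustration of the kind of architecture this excerpt describes, the following is a minimal sketch of target-dependent attention over CNN features. It assumes PyTorch; the class name `TargetAttentionCNN` and all layer sizes are hypothetical and not taken from any of the cited papers.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TargetAttentionCNN(nn.Module):
    """Hypothetical sketch: one CNN layer over the context, pooled by
    attention weights computed against an averaged target embedding."""

    def __init__(self, vocab_size=10000, emb_dim=100, n_filters=64,
                 kernel_size=3, n_classes=3):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # 1-D convolution over the embedded context sequence
        self.conv = nn.Conv1d(emb_dim, n_filters, kernel_size,
                              padding=kernel_size // 2)
        # projects the target vector into the convolutional feature space
        self.tgt_proj = nn.Linear(emb_dim, n_filters)
        self.out = nn.Linear(n_filters, n_classes)

    def forward(self, context_ids, target_ids):
        # context_ids: (batch, seq_len); target_ids: (batch, target_len)
        ctx = self.embed(context_ids)                        # (batch, seq_len, emb_dim)
        tgt = self.embed(target_ids).mean(dim=1)             # (batch, emb_dim)
        feats = torch.relu(self.conv(ctx.transpose(1, 2)))   # (batch, n_filters, seq_len)
        feats = feats.transpose(1, 2)                        # (batch, seq_len, n_filters)
        # dot-product attention: score each context position against the target
        q = self.tgt_proj(tgt)                               # (batch, n_filters)
        scores = torch.bmm(feats, q.unsqueeze(-1)).squeeze(-1)       # (batch, seq_len)
        weights = F.softmax(scores, dim=-1)
        pooled = torch.bmm(weights.unsqueeze(1), feats).squeeze(1)   # (batch, n_filters)
        return self.out(pooled)                              # class logits
```

A multi-layer variant, as in the approaches cited above, would stack several such attention-pooled convolution blocks following the same pattern.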
“…Although RNNs are well suited to processing sequential information, they suffer from exploding or vanishing gradients [34], [35]. To address this, LSTM introduces a gating mechanism into the standard RNN structure so that long-range dependencies in the sequence can be retained [36], [37]. Since text is inherently sequential, LSTM is in principle well suited to natural language tasks.…”
Section: Related Work
confidence: 99%
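For reference, the gating mechanism this excerpt alludes to is captured by the standard LSTM equations (notation assumed here: $\sigma$ is the logistic sigmoid, $\odot$ element-wise multiplication):

$$
\begin{aligned}
f_t &= \sigma(W_f x_t + U_f h_{t-1} + b_f) \\
i_t &= \sigma(W_i x_t + U_i h_{t-1} + b_i) \\
o_t &= \sigma(W_o x_t + U_o h_{t-1} + b_o) \\
\tilde{c}_t &= \tanh(W_c x_t + U_c h_{t-1} + b_c) \\
c_t &= f_t \odot c_{t-1} + i_t \odot \tilde{c}_t \\
h_t &= o_t \odot \tanh(c_t)
\end{aligned}
$$

The additive cell-state update $c_t = f_t \odot c_{t-1} + i_t \odot \tilde{c}_t$ provides a gradient path that is not repeatedly squashed by a nonlinearity at every step, which is why long-range dependencies are easier to preserve than in a plain RNN.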
“…Meanwhile, the attention mechanism is an effective technique, and its development has made model outputs more interpretable. Recently, attention mechanisms have often been combined with deep learning methods and successfully applied in many fields [22,33,35,40].…”
Section: Introduction
confidence: 99%
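To illustrate the interpretability point in this excerpt, the toy sketch below (NumPy, with made-up vectors; purely illustrative and not drawn from the cited work) shows how attention turns similarity scores into a distribution over input positions that can be inspected directly.

```python
import numpy as np

def attention_weights(query, keys):
    """Scaled dot-product attention scores normalised into a distribution
    over input positions; inspecting this distribution is what makes
    attention-based models easier to interpret."""
    scores = keys @ query / np.sqrt(query.shape[-1])  # (seq_len,)
    scores -= scores.max()                            # numerical stability
    weights = np.exp(scores)
    return weights / weights.sum()

# Toy example: three context words with made-up 4-dimensional vectors.
np.random.seed(0)
keys = np.random.randn(3, 4)                  # one key vector per context word
query = keys[1] + 0.1 * np.random.randn(4)    # query close to the 2nd word
print(attention_weights(query, keys))          # largest weight on position 1
```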