Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers) 2018
DOI: 10.18653/v1/p18-1087
Transformation Networks for Target-Oriented Sentiment Classification

Abstract: Target-oriented sentiment classification aims at classifying sentiment polarities over individual opinion targets in a sentence. RNN with attention seems a good fit for the characteristics of this task, and indeed it achieves the state-of-the-art performance. After re-examining the drawbacks of attention mechanism and the obstacles that block CNN to perform well in this classification task, we propose a new model to overcome these issues. Instead of attention, our model employs a CNN layer to extract salient f…
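As a rough illustration of the CNN-based feature extraction the abstract alludes to, here is a minimal sketch of a 1-D convolution with max-over-time pooling applied to word embeddings. All names, shapes, and the random inputs are hypothetical, illustrative choices, not the paper's actual architecture:

```python
import numpy as np

def conv1d_max_pool(embeddings, filters):
    """Slide each filter over the word-embedding sequence and
    max-pool over time, keeping the most salient n-gram response.

    embeddings: (seq_len, emb_dim) word vectors for one sentence
    filters:    (n_filters, window, emb_dim) convolution kernels
    """
    seq_len, emb_dim = embeddings.shape
    n_filters, window, _ = filters.shape
    n_positions = seq_len - window + 1
    responses = np.empty((n_filters, n_positions))
    for i in range(n_positions):
        patch = embeddings[i:i + window]  # (window, emb_dim) n-gram slice
        # Full contraction of each filter against the patch -> one scalar per filter.
        responses[:, i] = np.tensordot(filters, patch, axes=([1, 2], [0, 1]))
    # Max over time: one "most salient" feature per filter.
    return responses.max(axis=1)

rng = np.random.default_rng(0)
sentence = rng.normal(size=(8, 16))    # 8 tokens, 16-dim embeddings
kernels = rng.normal(size=(4, 3, 16))  # 4 trigram filters
features = conv1d_max_pool(sentence, kernels)
print(features.shape)  # (4,)
```

The max-over-time pooling is what makes the extracted features position-invariant: each filter reports only its strongest n-gram match anywhere in the sentence.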

Cited by 398 publications (218 citation statements); References 29 publications
“…In one of the previous studies, Transformation network [180], [181] also exploited the idea of attention in a simple way by incorporating it with convolution block, but the main problem was that attention modules in Transformation network are fixed and cannot adapt to changing circumstances. RAN was made efficient towards recognition of cluttered, complex, and noisy images by stacking multiple attention modules.…”
Section: Residual Attention Neural Network
confidence: 99%
“…• HAST-TNet: HAST (Li et al 2018b) and TNet (Li et al 2018a) are the current state-of-the-art models on the tasks of target boundary detection and target sentiment classification respectively. HAST-TNet is the pipeline approach of these two models.…”
Section: Compared Models
confidence: 99%
“…As CNN can capture the informative n-grams features, convolutional memory networks were explored in [18] to incorporate an attention mechanism to sequentially compute the weights of multiple memory units corresponding to multi-words. Instead of attention networks, [5] proposed a component to generate target-specific representations for words, and employed a CNN layer as the feature extractor relying on a mechanism of preserving the original contextual information. Some other works [20] exploited human reading cognitive process for this task.…”
Section: Deep Methods
confidence: 99%
“…A range of attention mechanisms are introduced to address this issue, such as target-to-sentence attention [2], fine-grained word-level attention [3], and multiple attentions [4]. Convolutional neural network (CNN)-based models are also recently used for this task because of the capability to extract the informative n-grams features [5]. All the aforementioned methods focus on exploiting labeled data to build the classification model, whose performance is often largely limited.…”
Section: Introduction
confidence: 99%
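The target-to-sentence attention mentioned in the last citation statement can be sketched in a few lines: score every word vector against a target representation, normalize with a softmax, and take the weighted sum as the target-specific sentence feature. This is a generic illustration under assumed shapes, not any cited paper's exact formulation:

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    z = np.exp(x - x.max())
    return z / z.sum()

def target_attention(words, target):
    """Weight each word vector by its dot-product similarity to the
    target vector, then return the weighted sum (attention context).

    words:  (seq_len, emb_dim) word vectors
    target: (emb_dim,) target representation
    """
    scores = words @ target       # (seq_len,) relevance of each word
    weights = softmax(scores)     # (seq_len,) sums to 1
    return weights @ words        # (emb_dim,) context vector

rng = np.random.default_rng(1)
words = rng.normal(size=(5, 16))  # 5 tokens, 16-dim embeddings
target = rng.normal(size=16)
context = target_attention(words, target)
print(context.shape)  # (16,)
```

Words similar to the target dominate the context vector, which is the property the cited attention-based models rely on to focus sentiment classification on the opinion target.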