Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing
DOI: 10.18653/v1/d17-1047

Recurrent Attention Network on Memory for Aspect Sentiment Analysis

Abstract: We propose a novel framework based on neural networks to identify the sentiment of opinion targets in a comment/review. Our framework adopts a multiple-attention mechanism to capture sentiment features separated by a long distance, so that it is more robust against irrelevant information. The results of multiple attentions are non-linearly combined with a recurrent neural network, which strengthens the expressive power of our model for handling more complications. The weighted-memory mechanism not only helps us a…
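The following is a minimal sketch of the mechanism the abstract describes, not the authors' released code: a bidirectional LSTM builds the memory, several attention "hops" read from it, and a GRU cell non-linearly combines the attended results. The class name and dimensions are illustrative assumptions, and two ingredients of the full model (the aspect embedding in the attention scorer and the position-based memory weighting) are omitted for brevity.

```python
import torch
import torch.nn as nn

class RAMSketch(nn.Module):
    def __init__(self, emb_dim=300, hid=150, hops=3, n_classes=3):
        super().__init__()
        # BiLSTM builds the memory over the input sentence
        self.bilstm = nn.LSTM(emb_dim, hid, bidirectional=True, batch_first=True)
        # scores one memory slot given the current episode vector
        self.attn = nn.Linear(4 * hid, 1)
        # recurrently (non-linearly) combines the results of the attention hops
        self.gru = nn.GRUCell(2 * hid, 2 * hid)
        self.hops = hops
        self.out = nn.Linear(2 * hid, n_classes)

    def forward(self, x):                          # x: (batch, seq, emb_dim)
        memory, _ = self.bilstm(x)                 # (batch, seq, 2*hid)
        e = memory.new_zeros(x.size(0), memory.size(-1))  # episode vector e_0
        for _ in range(self.hops):
            # score each memory slot against the current episode
            paired = torch.cat([memory, e.unsqueeze(1).expand_as(memory)], dim=-1)
            alpha = torch.softmax(self.attn(paired).squeeze(-1), dim=-1)
            read = torch.bmm(alpha.unsqueeze(1), memory).squeeze(1)  # attended summary
            e = self.gru(read, e)                  # each attention result feeds the GRU
        return self.out(e)                         # sentiment logits
```

For example, `RAMSketch()(torch.randn(2, 12, 300))` yields logits for a batch of two 12-token sentences; multiple hops let distant sentiment words contribute across iterations.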

Cited by 816 publications (505 citation statements). References 16 publications (40 reference statements). Citation types: 1 supporting, 466 mentioning, 0 contrasting. Citing publications span 2018–2024.
“…The model makes the aspect embeddings participate in computing the attention weights. RAM is proposed by Chen et al. [10], which adopts a multiple-attention mechanism on a memory built with a bidirectional LSTM. Ma et al. [11] design a model with a bidirectional attention mechanism, which interactively learns the attention weights on the context and the aspect words, respectively.…”
Section: Aspect-level Sentiment Classification
confidence: 99%
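A hedged sketch of the interactive (bidirectional) attention idea attributed to Ma et al. [11] in the excerpt above: each side is mean-pooled and used to attend over the other. The function name and the pooling choice are illustrative assumptions, not the cited paper's exact formulation.

```python
import torch
import torch.nn.functional as F

def interactive_attention(context, aspect):
    # context: (seq_c, d) and aspect: (seq_a, d) hidden states for one example
    c_pool, a_pool = context.mean(dim=0), aspect.mean(dim=0)
    # the pooled aspect attends over context words, and vice versa
    c_weights = F.softmax(context @ a_pool, dim=0)            # (seq_c,)
    a_weights = F.softmax(aspect @ c_pool, dim=0)             # (seq_a,)
    c_repr = (c_weights.unsqueeze(-1) * context).sum(dim=0)   # (d,)
    a_repr = (a_weights.unsqueeze(-1) * aspect).sum(dim=0)    # (d,)
    return torch.cat([c_repr, a_repr])                        # joint representation
```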
“…In recent years, neural network models [5, 6] have attracted growing interest for their capacity to automatically generate useful low-dimensional representations of aspects and their contexts, achieving high accuracy on aspect-level sentiment classification without careful feature engineering. In particular, owing to their ability to effectively identify which words in a sentence are more important for a given aspect, attention mechanisms [7, 8] implemented by neural networks are widely used in aspect-level sentiment classification [9][10][11][12][13][14]. Chen et al. [10] model a multiple attention… The setting is romantic, but the food is horrible, the service is pathetic.…”
Section: Introduction
confidence: 99%
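A small sketch of the aspect-conditioned attention the excerpt describes: each hidden state is scored jointly with the aspect vector, so different aspects highlight different words (e.g., "horrible" for the food aspect vs. "pathetic" for the service aspect in the example sentence above). The class name and dimensions are hypothetical.

```python
import torch
import torch.nn as nn

class AspectAttention(nn.Module):
    def __init__(self, hid, asp_dim):
        super().__init__()
        self.proj = nn.Linear(hid + asp_dim, hid)   # joint word-aspect projection
        self.v = nn.Linear(hid, 1, bias=False)      # scalar attention score

    def forward(self, H, aspect):          # H: (seq, hid), aspect: (asp_dim,)
        a = aspect.unsqueeze(0).expand(H.size(0), -1)
        # score each word together with the aspect it is conditioned on
        scores = self.v(torch.tanh(self.proj(torch.cat([H, a], dim=-1)))).squeeze(-1)
        alpha = torch.softmax(scores, dim=0)                # one weight per word
        return (alpha.unsqueeze(-1) * H).sum(dim=0), alpha  # sentence repr., weights
```

Running the same hidden states through this module with different aspect vectors yields different weight distributions, which is exactly the behavior the excerpt attributes to attention-based models.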
“…the combination of semantic relatedness and syntactic proximity, but we get unexpectedly sub-optimal results, which will be shown in the experiments section. (Footnote: with the spaCy toolkit: https://spacy.io/.)…”
Section: Position Proximity
confidence: 99%
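A hedged illustration of one way to compute the syntactic proximity the excerpt mentions, using spaCy's dependency parse: proximity as the number of dependency arcs between a word and the aspect term. The helper name is hypothetical, this is not the cited paper's formulation (which combines proximity with semantic relatedness), and it assumes the en_core_web_sm model is installed.

```python
import spacy

nlp = spacy.load("en_core_web_sm")   # assumes this model is installed

def tree_distance(tok_a, tok_b):
    """Number of dependency arcs between two tokens (None if in different sentences)."""
    depth = {}
    t, d = tok_a, 0
    while True:                      # record tok_a's path up to its root
        depth[t] = d
        if t.head is t:              # spaCy marks the root as its own head
            break
        t, d = t.head, d + 1
    t, d = tok_b, 0
    while t not in depth:            # climb from tok_b until the two paths meet
        if t.head is t:
            return None              # reached a different root: different sentences
        t, d = t.head, d + 1
    return depth[t] + d

doc = nlp("The setting is romantic, but the food is horrible.")
aspect = [t for t in doc if t.text == "food"][0]
for tok in doc:
    print(tok.text, tree_distance(tok, aspect))
```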
“…Traditional machine learning methods for aspect-based sentiment analysis focus on extracting a set of features to train sentiment classifiers (Ding et al., 2009; Boiy and Moens, 2009; Jiang et al., 2011), which are usually labor-intensive. With the development of deep learning technologies, the neural attention mechanism (Bahdanau et al., 2014) has been widely adopted for this task (Tang et al., 2015; Wang et al., 2016; Tang et al., 2016; Ma et al., 2017; Chen et al., 2017; Cheng et al., 2017; Li et al., 2018a; Wang et al., 2018a; Tay et al., 2018; Hazarika et al., 2018; Majumder et al., 2018; Fan et al., 2018; Wang et al., 2018b). Wang et al. (2016) propose attention-based LSTM networks which attend to different parts of the sentence for different aspects.…”
Section: Related Work
confidence: 99%
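A loose sketch of the Wang et al. (2016) attention-based LSTM idea mentioned above: the aspect embedding is appended to every word embedding, so both the LSTM and the attention layer are aspect-aware. Names and dimensions are illustrative assumptions, and the final-hidden-state shortcut of the original model is omitted.

```python
import torch
import torch.nn as nn

class ATAESketch(nn.Module):
    def __init__(self, emb=300, asp=300, hid=300, n_classes=3):
        super().__init__()
        self.lstm = nn.LSTM(emb + asp, hid, batch_first=True)
        self.w = nn.Linear(hid + asp, 1, bias=False)   # aspect-aware attention score
        self.out = nn.Linear(hid, n_classes)

    def forward(self, words, aspect):      # words: (B, T, emb), aspect: (B, asp)
        a = aspect.unsqueeze(1).expand(-1, words.size(1), -1)
        # aspect embedding appended to every word embedding before the LSTM
        H, _ = self.lstm(torch.cat([words, a], dim=-1))        # (B, T, hid)
        scores = self.w(torch.tanh(torch.cat([H, a], dim=-1))).squeeze(-1)
        alpha = torch.softmax(scores, dim=-1)                  # per-word attention
        r = torch.bmm(alpha.unsqueeze(1), H).squeeze(1)        # attended sentence vector
        return self.out(r)                                     # sentiment logits
```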