Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing
DOI: 10.18653/v1/d16-1021
Aspect Level Sentiment Classification with Deep Memory Network

Abstract: We introduce a deep memory network for aspect level sentiment classification. Unlike feature-based SVM and sequential neural models such as LSTM, this approach explicitly captures the importance of each context word when inferring the sentiment polarity of an aspect. Such importance degree and text representation are calculated with multiple computational layers, each of which is a neural attention model over an external memory. Experiments on laptop and restaurant datasets demonstrate that our approach perfor…
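To make the mechanism concrete, here is a minimal sketch of one such computational layer: a neural attention model over an external memory of context word embeddings. This is an illustrative reconstruction in PyTorch, not the authors' released code; the module name `AttentionHop`, the feed-forward attention scorer, and all dimensions are our assumptions.

```python
# Minimal sketch of a single computational layer (one "hop"): content
# attention over an external memory of context word embeddings, with the
# hop output becoming the aspect representation for the next hop.
# Assumptions (not from the released code): the module name, the use of a
# feed-forward attention scorer, and the dimensions in the toy example.
import torch
import torch.nn as nn


class AttentionHop(nn.Module):
    def __init__(self, embed_dim: int):
        super().__init__()
        # Scores each memory row conditioned on the current aspect vector.
        self.attn = nn.Linear(2 * embed_dim, 1)
        # Linear transformation of the aspect vector, added to the attention output.
        self.linear = nn.Linear(embed_dim, embed_dim)

    def forward(self, memory: torch.Tensor, aspect: torch.Tensor) -> torch.Tensor:
        # memory: (batch, seq_len, embed_dim) context word embeddings
        # aspect: (batch, embed_dim) current aspect representation
        seq_len = memory.size(1)
        expanded = aspect.unsqueeze(1).expand(-1, seq_len, -1)
        scores = torch.tanh(self.attn(torch.cat([memory, expanded], dim=-1))).squeeze(-1)
        weights = torch.softmax(scores, dim=-1)  # importance of each context word
        attended = torch.bmm(weights.unsqueeze(1), memory).squeeze(1)
        return attended + self.linear(aspect)


# Toy usage: batch of 2 sentences, 6 context words, 50-dimensional embeddings.
hop = AttentionHop(embed_dim=50)
memory = torch.randn(2, 6, 50)
aspect = torch.randn(2, 50)
print(hop(memory, aspect).shape)  # torch.Size([2, 50])
```

Stacking several of these hops gives the multiple computational layers the abstract refers to; a sketch of that stacking appears after the first citation statement below.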

Citations: cited by 845 publications (555 citation statements)
References: 26 publications
“…• MemNet (Tang et al., 2016): It applies attention multiple times on the word embeddings, and the last attention's output is fed to softmax for prediction, without combining the results of different attentions. We produce its results on all four datasets with the code released by the authors.…”
Section: Compared Methods (mentioning; confidence: 99%)
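The statement above also pins down how the hops feed prediction: attention is applied repeatedly over the word-embedding memory, and only the final hop's output reaches the softmax classifier. Below is a standalone, assumption-laden sketch of that path; the bilinear attention scorer, toy dimensions, and the function name `predict` are ours, not the released implementation.

```python
# Standalone sketch of the multi-hop prediction path quoted above: attention
# is applied several times over the word-embedding memory, and only the last
# hop's vector is classified; intermediate hop outputs are not combined.
# The bilinear scorer, dimensions, and names are illustrative assumptions.
import torch


def predict(memory, aspect, attn_weights, out_weights):
    # memory: (seq_len, d) context word embeddings; aspect: (d,) aspect vector.
    for W in attn_weights:                        # one attention pass per hop
        scores = torch.tanh(memory @ W @ aspect)  # (seq_len,) per-word scores
        alpha = torch.softmax(scores, dim=0)      # per-word importance
        aspect = alpha @ memory + aspect          # hop output replaces aspect
    # Only the final hop's representation is fed to softmax for prediction.
    return torch.softmax(out_weights @ aspect, dim=0)


# Toy usage: 6 context words, d = 4, three hops, three sentiment classes.
d = 4
probs = predict(torch.randn(6, d), torch.randn(d),
                [torch.randn(d, d) for _ in range(3)], torch.randn(3, d))
print(probs)  # three class probabilities summing to 1
```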
“…Single attention or multiple attentions were applied to aspect sentiment classification in some previous works (Wang et al., 2016; Tang et al., 2016). One difference between our method and (Tang et al., 2016) is that we introduce a memory module between the attention module and the input module, so our method can synthesize features of word sequences such as sentiment phrases (e.g., "not wonderful enough").…”
Section: Related Work (mentioning; confidence: 99%)