Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing
DOI: 10.18653/v1/d17-1048

A Cognition Based Attention Model for Sentiment Analysis

Abstract: Attention models are proposed in sentiment analysis because some words are more important than others. However, most existing methods use either local context based text information or user preference information. In this work, we propose a novel attention model trained on cognition grounded eye-tracking data. A reading prediction model is first built using eye-tracking data as dependent data and other features in the context as independent data. The predicted reading time is then used to build a cognition based attention model for sentiment analysis.
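
The abstract describes two steps: predict per-word reading time from eye-tracking data, then use that prediction as an attention signal over the text. Below is a minimal sketch of the second step only, assuming token encodings and predicted reading times are already available; the function and variable names (cognition_based_attention, hidden_states, reading_times) are illustrative placeholders, not the authors' code.

# Minimal sketch (an assumption, not the authors' implementation) of using
# predicted per-word reading time as an attention signal over token encodings.
import torch
import torch.nn.functional as F

def cognition_based_attention(hidden_states, reading_times):
    # hidden_states: (seq_len, dim) token encodings, e.g. from an LSTM.
    # reading_times: (seq_len,) reading times predicted by a separate model.
    weights = F.softmax(reading_times, dim=0)  # longer reading time -> larger weight
    return (weights.unsqueeze(1) * hidden_states).sum(dim=0)  # (dim,) document vector

# Usage with dummy values: 5 tokens, 64-dim encodings.
h = torch.randn(5, 64)
t = torch.tensor([0.3, 1.2, 0.4, 0.9, 0.2])
doc_vec = cognition_based_attention(h, t)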

Cited by 66 publications (40 citation statements). References 25 publications (28 reference statements).
“…In the literature, many researchers have used the two-class IMDB dataset to test the performance of proposed algorithms. Results of different text classification techniques on the IMDB dataset are available in [3, 5, 23–37]. The IMDB dataset for web opinion mining is investigated in [5].…”
Section: Related Work
confidence: 99%
“…Attention mechanisms are also added to LSTM models to highlight important segments at both the sentence level and the document level. Attention models can be built from text in local context (Yang et al., 2016), from user/product information (Long et al., 2017a), and from other information such as cognition grounded eye-tracking data (Long et al., 2017b). LSTM models with an attention mechanism are currently the state-of-the-art models in document sentiment analysis tasks (Long et al., 2017b).…”
Section: Neural Network Models
confidence: 99%
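
This excerpt summarizes the standard recipe: run an LSTM over the tokens, score each hidden state, and pool with softmax-normalized weights. A short PyTorch sketch in the spirit of Yang et al. (2016) follows; the class name and dimensions are illustrative assumptions, not code from any of the cited papers.

import torch
import torch.nn as nn
import torch.nn.functional as F

class AttentiveLSTMEncoder(nn.Module):
    # Bidirectional LSTM followed by a learned attention pooling layer.
    def __init__(self, emb_dim=100, hid_dim=128):
        super().__init__()
        self.lstm = nn.LSTM(emb_dim, hid_dim, batch_first=True, bidirectional=True)
        self.attn = nn.Linear(2 * hid_dim, 1)

    def forward(self, x):                      # x: (batch, seq_len, emb_dim)
        h, _ = self.lstm(x)                    # h: (batch, seq_len, 2*hid_dim)
        scores = self.attn(h).squeeze(-1)      # one score per token
        alpha = F.softmax(scores, dim=1)       # attention weights sum to 1
        return (alpha.unsqueeze(-1) * h).sum(dim=1)  # pooled document/sentence vector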
“…Question Encoding. A gated recurrent unit (GRU) [7] is used to encode the question embedding, which is widely adopted in NLP and multimodal tasks [17, 22, 41]. To be specific, given a question with T words Q = [q_1, ..., q_t, ..., q_T], where q_t is the one-hot vector of the question word at position t, we first embed the words into a dense representation via a linear transformation x_t = W_e q_t.…”
Section: Context-aware Visual Attention
confidence: 99%
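
Since q_t is one-hot, the linear map x_t = W_e q_t is equivalent to an embedding lookup, and the GRU then consumes the embedded sequence. A minimal PyTorch sketch follows; the vocabulary and dimension sizes are arbitrary values chosen for illustration.

import torch
import torch.nn as nn

vocab_size, emb_dim, hid_dim = 10000, 300, 512   # illustrative sizes
# For one-hot q_t, the linear map x_t = W_e q_t is an embedding lookup.
embed = nn.Embedding(vocab_size, emb_dim)
gru = nn.GRU(emb_dim, hid_dim, batch_first=True)

question = torch.randint(0, vocab_size, (1, 8))  # token ids for T = 8 words
x = embed(question)                              # (1, 8, emb_dim) dense vectors
outputs, h_T = gru(x)                            # h_T: final question encoding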