2019
DOI: 10.3390/app9183717

Lexicon-Enhanced Attention Network Based on Text Representation for Sentiment Classification

Abstract: Text representation learning is an important but challenging issue for various natural language processing tasks. Recently, deep learning-based representation models have achieved great success for sentiment classification. However, these existing models focus more on semantic information than on sentiment linguistic knowledge, which provides rich sentiment information and plays a key role in sentiment analysis. In this paper, we propose a lexicon-enhanced attention network (LAN) based on text representati…

Cited by 10 publications (8 citation statements). References 36 publications (61 reference statements).
“…In [16], they propose a lexicon-enhanced attention network (LAN) based on text representation to improve sentiment classification. By combining the sentiment lexicon with an attention mechanism in the word embedding module, they obtain sentiment-aware word embeddings as the input of the deep neural network, which bridges the gap between sentiment linguistic knowledge and deep learning methods.…”
Section: Related Work 2.1 Sentiment Analysis in Social Media
confidence: 99%
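As a concrete illustration of the idea described in the statement above, here is a minimal sketch, assuming PyTorch, of how a frozen lexicon score table and an attention gate could be combined inside the word embedding module to produce sentiment-aware embeddings. The class name, layer sizes, and lexicon lookup are hypothetical and this is not the authors' exact LAN implementation.

```python
# Minimal sketch (not the authors' exact LAN): fuse a sentiment lexicon with
# an attention gate inside the word embedding module. Sizes are illustrative.
import torch
import torch.nn as nn


class SentimentAwareEmbedding(nn.Module):
    def __init__(self, vocab_size: int, embed_dim: int):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, embed_dim)
        # One scalar sentiment prior per vocabulary word (e.g. taken from a
        # sentiment lexicon), frozen so training does not overwrite it.
        self.lexicon_score = nn.Embedding(vocab_size, 1)
        self.lexicon_score.weight.requires_grad = False
        self.attn = nn.Linear(embed_dim + 1, 1)

    def forward(self, token_ids: torch.Tensor) -> torch.Tensor:
        emb = self.word_emb(token_ids)            # (batch, seq, dim)
        score = self.lexicon_score(token_ids)     # (batch, seq, 1)
        # Token-level attention weight conditioned on both the semantic
        # embedding and the lexicon sentiment score.
        weight = torch.softmax(self.attn(torch.cat([emb, score], dim=-1)), dim=1)
        return emb * weight                       # sentiment-aware embeddings


# Usage: the weighted embeddings feed into any downstream encoder.
emb_layer = SentimentAwareEmbedding(vocab_size=10000, embed_dim=128)
out = emb_layer(torch.randint(0, 10000, (2, 20)))  # (2, 20, 128)
```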
“…Liu et al 19 proposed a co-attention mechanism to capture semantic correlations and generate better feature representations. Li et al 20 proposed a lexicon-enhanced network by combining the sentiment lexicon with an attention mechanism, incorporating sentiment linguistic knowledge into deep learning methods. Meb et al 21 proposed an attention-based model named ABCDM (an attention-based bidirectional CNN-RNN deep model) that considers features of different importance and extracts temporal information by using long short-term memory (LSTM) and gated recurrent unit layers.…”
Section: Attention Mechanism
confidence: 99%
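The attention-based models listed above share a common pooling pattern: score each hidden state, normalize the scores, and take a weighted sum. Below is a minimal sketch of that generic pattern, again assuming PyTorch; it shows attention pooling over a bidirectional LSTM encoder and is not the architecture of any single cited paper.

```python
# Generic attention pooling over recurrent hidden states (illustrative only).
import torch
import torch.nn as nn


class AttentionPooling(nn.Module):
    def __init__(self, hidden_dim: int):
        super().__init__()
        self.score = nn.Linear(hidden_dim, 1)

    def forward(self, hidden: torch.Tensor) -> torch.Tensor:
        # hidden: (batch, seq, hidden_dim) from e.g. a BiLSTM or BiGRU encoder
        alpha = torch.softmax(self.score(hidden), dim=1)  # token importance
        return (alpha * hidden).sum(dim=1)                # weighted sentence vector


# Usage with a bidirectional LSTM encoder (hypothetical sizes):
encoder = nn.LSTM(input_size=128, hidden_size=64, batch_first=True, bidirectional=True)
pool = AttentionPooling(hidden_dim=128)                   # 2 * 64 from the BiLSTM
states, _ = encoder(torch.randn(2, 20, 128))
sentence_vec = pool(states)                               # (2, 128)
```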
“…In the field of NLP, most mainstream text classification algorithms are based on convolutional neural networks (CNN) and recurrent neural networks (RNN) [26][27][28][29]. As a variant of the RNN, long short-term memory (LSTM) addresses the long-term dependency problem of RNNs and the vanishing and exploding gradients caused by overly long sequences.…”
Section: Network Architecture of LSTM
confidence: 99%
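For reference, the standard LSTM cell mentioned in the statement above uses the following gate equations; the additive update of the cell state c_t is what lets gradients flow across long sequences and mitigates vanishing and exploding gradients.

```latex
\begin{aligned}
f_t &= \sigma(W_f x_t + U_f h_{t-1} + b_f), &
i_t &= \sigma(W_i x_t + U_i h_{t-1} + b_i), &
o_t &= \sigma(W_o x_t + U_o h_{t-1} + b_o), \\
\tilde{c}_t &= \tanh(W_c x_t + U_c h_{t-1} + b_c), &
c_t &= f_t \odot c_{t-1} + i_t \odot \tilde{c}_t, &
h_t &= o_t \odot \tanh(c_t).
\end{aligned}
```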