2022
DOI: 10.1109/tnnls.2021.3056664
Attention-Emotion-Enhanced Convolutional LSTM for Sentiment Analysis

Cited by 98 publications (40 citation statements)
References 62 publications
“…In [60], the authors introduced a transformer-based model for emotion detection that encodes representations with a transformer and applies deep embeddings to improve the quality of tweets. In [61], the authors introduced an attention-based deep method using two independent layers. By considering the temporal information flow in both directions, the model retrieves both past and future context.…”
Section: Related Work
confidence: 99%
“…The most important concept in the LSTM is the cell state. The cell state is the mechanism that transfers information throughout the network [62]; it acts as the memory of the network, allowing it to remember or forget information. Information is added to or removed from the cell state by three gates, namely the input, output, and forget gates, defined by Equations (1)–(3), respectively.…”
Section: Long Short-Term Memory (LSTM)
confidence: 99%
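The gating mechanism described in the excerpt above can be sketched as a single LSTM step. This is a minimal NumPy illustration, not the cited paper's implementation: the function name `lstm_step` and the convention of stacking the four gate pre-activations in one weight matrix `W` are assumptions for the sketch.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x_t, h_prev, c_prev, W, b):
    """One LSTM time step (illustrative sketch).

    W maps the concatenation [h_prev; x_t] to the four gate
    pre-activations (forget, input, output, candidate), stacked
    along the first axis; b is the matching bias vector.
    """
    z = W @ np.concatenate([h_prev, x_t]) + b
    H = h_prev.shape[0]
    f_t = sigmoid(z[0:H])            # forget gate: what to drop from the cell state
    i_t = sigmoid(z[H:2*H])          # input gate: what new information to write
    o_t = sigmoid(z[2*H:3*H])        # output gate: what part of the cell to expose
    g_t = np.tanh(z[3*H:4*H])        # candidate cell content
    c_t = f_t * c_prev + i_t * g_t   # cell state: the memory carried across time
    h_t = o_t * np.tanh(c_t)         # hidden state emitted at this step
    return h_t, c_t
```

The cell state `c_t` is updated only by element-wise scaling and addition, which is why gradients can flow across many time steps without vanishing the way they do in a plain RNN.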
“…Because it can retain information over long time spans and mitigates the vanishing-gradient problem of RNNs, the LSTM has proved to be an effective model for sequential data containing long-term dependencies. Examples of LSTM applications include speech recognition [36], machine translation [37], time-series forecasting [38,39], and sentiment analysis [40]. Hochreiter and Schmidhuber [41] first proposed the LSTM model in 1997, in which each LSTM unit contained only input and output gates.…”
Section: Long Short-Term Memory (LSTM)
confidence: 99%