2018 First Asian Conference on Affective Computing and Intelligent Interaction (ACII Asia) 2018
DOI: 10.1109/aciiasia.2018.8470378
LSTM-based Text Emotion Recognition Using Semantic and Emotional Word Vectors

Cited by 39 publications (18 citation statements)
References 6 publications
“…Well‐known pretrained word embeddings such as Word2Vec, 44 GloVe, 45 and FastText 46 are used as input term weights 42,47 . Unlike RNN, one‐ or two‐dimensional convolution operations are performed with different filter sizes in CNN architectures 7,41,42,47‐49 . Furthermore, a recent pretrained language model‐based approach called BERT 50 is used to derive the input representation for the emotion analysis task in the study of Reference 4.…”
Section: Literature Summary
confidence: 99%
“…The researchers combined the predictions produced by weak classifiers in their study 36 . Across all of those DL‐based studies, the highest observed classification performance ranges between 53% 3 and 70% 42 .…”
Section: Literature Summary
confidence: 99%
“…It takes the hidden state of the previous moment $h_{t-1}$ and the input at the current moment $X_t$. The output of the memory gate $i_t$ and the temporary cell state $\tilde{C}_t$ is calculated by the following formula [20][24].…”
Section: Comments
confidence: 99%
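The gate computation referenced in the snippet above follows the standard LSTM formulation: the input (memory) gate and the candidate cell state are both computed from the concatenation of the previous hidden state $h_{t-1}$ and the current input $X_t$. A minimal NumPy sketch of that step, with toy dimensions and randomly initialized weights (all names here are illustrative, not from the cited paper):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_gate_step(h_prev, x_t, W_i, b_i, W_C, b_C):
    """Compute the input (memory) gate i_t and the candidate cell
    state C~_t from h_{t-1} and X_t, as in the standard LSTM cell."""
    concat = np.concatenate([h_prev, x_t])   # [h_{t-1}, X_t]
    i_t = sigmoid(W_i @ concat + b_i)        # gate activations in (0, 1)
    c_tilde = np.tanh(W_C @ concat + b_C)    # candidate values in (-1, 1)
    return i_t, c_tilde

# Toy sizes: hidden state of 4 units, input vector of 3 features.
rng = np.random.default_rng(0)
hidden, inp = 4, 3
h_prev = rng.standard_normal(hidden)
x_t = rng.standard_normal(inp)
W_i = rng.standard_normal((hidden, hidden + inp))
W_C = rng.standard_normal((hidden, hidden + inp))
b_i = np.zeros(hidden)
b_C = np.zeros(hidden)

i_t, c_tilde = lstm_gate_step(h_prev, x_t, W_i, b_i, W_C, b_C)
```

In a full LSTM cell, `i_t * c_tilde` is then added to the forget-gated previous cell state to produce the new cell state.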
“…58.6% for general tweets. Another method, using LSTM for text emotion recognition, was further proposed by Su et al. (2018). The method was tested on the Natural Language Processing and Chinese Computing (NLPCC) database, which contains seven emotion categories (anger, boredom, disgust, anxiety, happiness, sadness, surprise), and the results obtained indicated an accuracy of 70.66%.…”
Section: Yasmina et al. used Pointwise Mutual Information (PMI) to c
confidence: 99%