2017
DOI: 10.1007/978-3-319-64206-2_9

Sentiment Analysis with Tree-Structured Gated Recurrent Units

Cited by 9 publications (7 citation statements)
References 9 publications

“…Content may change prior to final publication. The recall for multi-class classification is computed according to (13), where tp_i is the number of instances correctly predicted as class i, i is the class index, l is the total number of class labels, and tp_i + fn_i is the total number of instances labeled with class i in the given dataset.…”
Section: False Negative Rate
confidence: 99%
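Equation (13) itself is not reproduced in this excerpt, but the quantities described (tp_i and tp_i + fn_i) match the standard per-class recall. The following NumPy sketch is only an illustration under that assumption; the function name and array-based interface are hypothetical, not taken from the cited paper.

```python
import numpy as np

def per_class_recall(y_true, y_pred, num_classes):
    """Recall_i = tp_i / (tp_i + fn_i) for each class i, where tp_i counts
    instances correctly predicted as class i and tp_i + fn_i counts all
    instances whose true label is class i (assumed reading of formula (13))."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    recalls = np.zeros(num_classes)
    for i in range(num_classes):
        tp_i = np.sum((y_pred == i) & (y_true == i))   # correctly predicted as class i
        labeled_i = np.sum(y_true == i)                # tp_i + fn_i
        recalls[i] = tp_i / labeled_i if labeled_i > 0 else 0.0
    return recalls
```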
“…The F1-score for multi-class classification is computed by applying (20), where Precision_i is obtained using formula (18) and Recall_i is calculated using formula (13).…”
Section: False Negative Rate
confidence: 99%
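Formulas (18) and (20) are likewise not shown on this page. Assuming they denote the usual per-class precision and a macro-averaged F1 built from Precision_i and Recall_i, a minimal sketch (all names hypothetical) could look like this:

```python
import numpy as np

def macro_f1(y_true, y_pred, num_classes):
    """Macro-averaged F1 over l = num_classes labels, built from per-class
    Precision_i and Recall_i as described in the quoted statement."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    f1_scores = []
    for i in range(num_classes):
        tp = np.sum((y_pred == i) & (y_true == i))
        predicted_i = np.sum(y_pred == i)              # tp_i + fp_i
        labeled_i = np.sum(y_true == i)                # tp_i + fn_i
        precision = tp / predicted_i if predicted_i > 0 else 0.0
        recall = tp / labeled_i if labeled_i > 0 else 0.0
        denom = precision + recall
        f1_scores.append(2 * precision * recall / denom if denom > 0 else 0.0)
    return float(np.mean(f1_scores))
```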
“…In this regard, Tai et al. [33] employed a Long Short-Term Memory (LSTM) network integrated with complex units for sentiment analysis. Kuta et al. [34] proposed a tree-structured gated recurrent neural network, inspired by the tree-structured LSTM, which adapts the Gated Recurrent Unit (GRU) to a recursive model. Besides these networks, a semi-supervised model known as the Recursive Neural Network (ReRNN) has also been employed for sentiment analysis; it takes continuous word vectors as input and exploits a hierarchical structure.…”
Section: B. Deep Neural Network
confidence: 99%
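The exact gating equations of the tree-structured GRU are not quoted on this page. The sketch below only illustrates the general idea of adapting GRU gates to a recursive model over a binary parse tree; the way the two child states are combined, the parameter names, and the initialization are assumptions, not the formulation of Kuta et al. [34].

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class BinaryTreeGRUNode:
    """Illustrative GRU-style composition at a binary parse-tree node.
    Child states are summed before gating; the actual Tree-GRU may gate
    each child separately (assumption, not the authors' equations)."""

    def __init__(self, dim, seed=0):
        rng = np.random.default_rng(seed)
        self.W_z = rng.standard_normal((dim, dim)) * 0.1  # update gate weights
        self.W_r = rng.standard_normal((dim, dim)) * 0.1  # reset gate weights
        self.W_h = rng.standard_normal((dim, dim)) * 0.1  # candidate state weights

    def compose(self, h_left, h_right):
        h_children = h_left + h_right                    # combine child hidden states
        z = sigmoid(self.W_z @ h_children)               # how much of the children to keep
        r = sigmoid(self.W_r @ h_children)               # which parts of the children to reset
        h_tilde = np.tanh(self.W_h @ (r * h_children))   # candidate parent state
        return z * h_children + (1.0 - z) * h_tilde      # gated parent hidden state
```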
“…Autoencoders have been applied to many tasks, such as probabilistic and generative modeling [13] and representation learning [14]. The GRU is a recurrent neural network (RNN) architecture that has been used for tasks such as sentiment analysis [15] and speech recognition [16]. The proposed model is evaluated on three labeled log message datasets, namely BlueGene/L (BGL), OpenStack, and Thunderbird.…”
Section: Introduction
confidence: 99%
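For contrast with the recursive variant above, a plain sequential GRU of the kind used for sentiment analysis [15] and speech recognition [16] processes a token sequence left to right. This NumPy sketch assumes an illustrative parameter layout (W and U each stacking the update, reset, and candidate weights); it is not any cited implementation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_forward(inputs, W, U, dim):
    """Sequential GRU over a list of input vectors; W and U are assumed to be
    3 x dim x dim arrays holding update, reset, and candidate parameters."""
    h = np.zeros(dim)
    for x in inputs:
        z = sigmoid(W[0] @ x + U[0] @ h)                  # update gate
        r = sigmoid(W[1] @ x + U[1] @ h)                  # reset gate
        h_tilde = np.tanh(W[2] @ x + U[2] @ (r * h))      # candidate state
        h = z * h + (1.0 - z) * h_tilde                   # new hidden state
    return h  # final state, e.g. fed to a softmax classifier for sentiment

if __name__ == "__main__":
    dim, rng = 8, np.random.default_rng(0)
    W = rng.standard_normal((3, dim, dim)) * 0.1
    U = rng.standard_normal((3, dim, dim)) * 0.1
    tokens = [rng.standard_normal(dim) for _ in range(5)]  # embedded token sequence
    print(gru_forward(tokens, W, U, dim).shape)            # (8,)
```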