Proceedings of the 11th International Workshop on Semantic Evaluation (SemEval-2017) 2017
DOI: 10.18653/v1/s17-2101
deepSA at SemEval-2017 Task 4: Interpolated Deep Neural Networks for Sentiment Analysis in Twitter

Abstract: In this paper, we describe our system implementation for sentiment analysis in Twitter. This system combines two models based on deep neural networks, namely a convolutional neural network (CNN) and a long short-term memory (LSTM) recurrent neural network, through interpolation. Distributed representations of words as vectors are input to the system, and the output is a sentiment class. The neural network models are trained exclusively with the data sets provided by the organizers of SemEval-2017 Task 4 Subtask…
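The abstract describes combining a CNN and an LSTM through interpolation of their outputs. A minimal sketch of that idea, assuming the models emit class-probability vectors and using an illustrative interpolation weight `alpha` (the paper's actual weights and model details are not given here):

```python
# Hypothetical sketch: linear interpolation of two sentiment models'
# class-probability vectors. `alpha` and the probability values are
# illustrative, not taken from the paper.

def interpolate(p_cnn, p_lstm, alpha=0.5):
    """Return alpha * p_cnn + (1 - alpha) * p_lstm, element-wise."""
    assert len(p_cnn) == len(p_lstm)
    return [alpha * c + (1.0 - alpha) * l for c, l in zip(p_cnn, p_lstm)]

# Example probabilities over (negative, neutral, positive) classes
p_cnn = [0.2, 0.3, 0.5]
p_lstm = [0.1, 0.2, 0.7]
combined = interpolate(p_cnn, p_lstm, alpha=0.6)
label = max(range(len(combined)), key=lambda i: combined[i])  # argmax class index
```

Because the interpolation is a convex combination, the result remains a valid probability distribution whenever both inputs are.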

Cited by 6 publications (4 citation statements)
References 7 publications (12 reference statements)
“…Tzu-Hsuan Yang et al. [7] implemented a system for the sentiment analysis (SA) task on a Twitter dataset. This system combines two deep-neural-network-based models, an LSTM recurrent neural network and a convolutional neural network (CNN), through interpolation.…”
Section: Related Work
confidence: 99%
“…Finally, the high-level vector representation of a tweet 𝑆 is produced as the weighted sum of the word annotations, using the attention weights. S is computed using equation (7).…”
Section: B. Attention Layer for Sentiment Orientation
confidence: 99%
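The snippet above describes a standard attention-weighted sum: the tweet vector S is the sum of word annotations h_i weighted by attention weights a_i. A minimal sketch under that assumption (the names `scores`, `annotations`, and the softmax normalization are illustrative; equation (7) of the cited paper is not reproduced here):

```python
import math

def softmax(scores):
    """Normalize raw attention scores into weights that sum to 1."""
    m = max(scores)  # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    return [e / z for e in exps]

def attend(scores, annotations):
    """Compute S = sum_i a_i * h_i with a = softmax(scores)."""
    a = softmax(scores)
    dim = len(annotations[0])
    return [sum(a[i] * annotations[i][d] for i in range(len(annotations)))
            for d in range(dim)]

scores = [1.0, 2.0, 0.5]                             # one score per word
annotations = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]   # word annotations h_i
S = attend(scores, annotations)                      # tweet representation
```

The resulting S lies in the convex hull of the word annotations, which is what makes the weighted sum a pooling operation rather than a concatenation.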
“…Each deep learning method has its positive and negative aspects; therefore, researchers have proposed ensemble techniques combining CNN, RNN, and LSTM to avoid the drawbacks of each individual model [33], [34]. In the same fashion, Yang et al. [35] interpolated the outputs of a CNN and an LSTM by weighting the predictions of both models. RNNs generally suffer from the vanishing gradient problem (i.e., gradients propagated across many layers vanish after a few steps), so Ding et al. [36] proposed directly connecting all layers to reduce its effects.…”
Section: Deep Learning Sentiment Classification
confidence: 99%
“…We begin with basic pre-processing methods (Yang et al., 2017), e.g. splitting a tweet into words, replacing URLs and user mentions with the normalization patterns <URL> and <USER>, and converting uppercase letters to lowercase.…”
Section: Pre-processing
confidence: 99%
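The pre-processing steps quoted above can be sketched as follows. This is a hedged approximation: the regular expressions and tokenization below are illustrative, and the exact tokenizer used by Yang et al. (2017) may differ.

```python
import re

def preprocess(tweet):
    """Lowercase, normalize URLs and user mentions, then split into words."""
    tweet = tweet.lower()                          # uppercase -> lowercase
    tweet = re.sub(r"https?://\S+", "<URL>", tweet)  # replace URLs
    tweet = re.sub(r"@\w+", "<USER>", tweet)         # replace user mentions
    return tweet.split()                            # naive whitespace split

tokens = preprocess("Check this out @Alice http://example.com GREAT news!")
```

Lowercasing before substitution keeps the `<URL>` and `<USER>` placeholders in their canonical uppercase form.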