2022
DOI: 10.14569/ijacsa.2022.0131048

Comparing LSTM and CNN Methods in Case Study on Public Discussion about Covid-19 in Twitter

Abstract: This study compares two deep learning methods: Long Short-Term Memory (LSTM) and the Convolutional Neural Network (CNN). The comparison aims to assess the performance of two fundamentally different deep learning approaches, one based on convolution (CNN) and one designed to deal with the vanishing gradient problem (LSTM). The purpose of this study is to compare the accuracy of the two methods on a dataset of 4169 tweets obtained by crawling social media using the Twitter …
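As a rough illustration of the two architectures being compared, the following is a minimal Keras sketch of an LSTM classifier and a CNN classifier for short texts. The vocabulary size, sequence length, embedding dimension, and layer widths are illustrative assumptions, not values reported in the paper.

```python
# Minimal sketch of the two text classifiers compared in the study.
# Vocabulary size, sequence length, embedding dimension, and layer widths
# are illustrative assumptions, not the paper's reported configuration.
import tensorflow as tf
from tensorflow.keras import layers, models

VOCAB_SIZE = 10000   # assumed vocabulary size
MAX_LEN = 50         # assumed padded tweet length
EMBED_DIM = 128      # assumed embedding dimension

def build_lstm_model():
    # LSTM-based classifier: embedding -> LSTM -> sigmoid output
    return models.Sequential([
        layers.Input(shape=(MAX_LEN,)),
        layers.Embedding(VOCAB_SIZE, EMBED_DIM),
        layers.LSTM(64),
        layers.Dense(1, activation="sigmoid"),
    ])

def build_cnn_model():
    # CNN-based classifier: embedding -> 1D convolution -> global pooling -> sigmoid output
    return models.Sequential([
        layers.Input(shape=(MAX_LEN,)),
        layers.Embedding(VOCAB_SIZE, EMBED_DIM),
        layers.Conv1D(64, kernel_size=5, activation="relu"),
        layers.GlobalMaxPooling1D(),
        layers.Dense(1, activation="sigmoid"),
    ])

for name, build in [("LSTM", build_lstm_model), ("CNN", build_cnn_model)]:
    model = build()
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    # model.fit(x_train, y_train, validation_data=(x_val, y_val), epochs=5)
```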

Cited by 2 publications (1 citation statement)
References 26 publications (35 reference statements)
“…The LSTM model is a development of the recurrent neural network (RNN) model designed to overcome the vanishing gradient and exploding gradient problems [16]. Its three gates, namely the input gate, the forget gate, and the output gate, control the flow of information into and out of the memory cell [17]. The LSTM layer is followed by an attention mechanism layer, which improves the quality of the predictions by focusing on the parts of the input that most influence the final result [18].…”
Section: Model Building
confidence: 99%
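The model described in the statement above, an LSTM whose per-timestep outputs pass through an attention layer before classification, can be sketched roughly as follows. The dimensions and the dot-product self-attention formulation are assumptions for illustration, not the cited paper's exact configuration.

```python
# Minimal sketch of an LSTM followed by an attention layer, as described in
# the citing statement; sizes and the attention formulation are illustrative
# assumptions rather than the cited paper's configuration.
import tensorflow as tf
from tensorflow.keras import layers, models

VOCAB_SIZE = 10000   # assumed vocabulary size
MAX_LEN = 50         # assumed padded sequence length

inputs = layers.Input(shape=(MAX_LEN,))
x = layers.Embedding(VOCAB_SIZE, 128)(inputs)
# return_sequences=True keeps the hidden state at every time step so the
# attention layer can weight the most influential tokens.
h = layers.LSTM(64, return_sequences=True)(x)
# Dot-product self-attention over the LSTM outputs (Keras' built-in layer).
context = layers.Attention()([h, h])
pooled = layers.GlobalAveragePooling1D()(context)
outputs = layers.Dense(1, activation="sigmoid")(pooled)

model = models.Model(inputs, outputs)
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
```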