2021
DOI: 10.3390/app112311255

Attention-Based CNN and Bi-LSTM Model Based on TF-IDF and GloVe Word Embedding for Sentiment Analysis

Abstract: Sentiment analysis (SA) detects people’s opinions from text using natural language processing (NLP) techniques. Recent research has shown that deep learning models, e.g., Convolutional Neural Networks (CNN), Recurrent Neural Networks (RNN), and Transformer-based models, provide promising results for recognizing sentiment. Nonetheless, although the CNN has the advantage of extracting high-level features through convolutional and max-pooling layers, it cannot efficiently learn sequential correlations. At the same time, Bidirect…
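To make the architecture named in the title concrete, below is a minimal sketch in Keras of a GloVe-initialised embedding feeding a Conv1D/max-pooling block, a Bi-LSTM, and a dot-product attention layer before the classifier. Layer sizes, the attention variant, and all hyper-parameters are illustrative assumptions, not the authors' exact configuration.

```python
# Minimal sketch: GloVe embedding -> CNN -> Bi-LSTM -> attention -> classifier.
# All sizes and the attention variant are assumptions for illustration only.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, Model

VOCAB_SIZE = 20_000   # assumed vocabulary size
MAX_LEN = 100         # assumed maximum sequence length (tokens)
EMBED_DIM = 100       # e.g., GloVe-100d

# Placeholder for a GloVe-initialised embedding matrix (normally loaded from file).
glove_matrix = np.random.normal(size=(VOCAB_SIZE, EMBED_DIM)).astype("float32")

inputs = layers.Input(shape=(MAX_LEN,), dtype="int32")
x = layers.Embedding(VOCAB_SIZE, EMBED_DIM,
                     weights=[glove_matrix], trainable=False)(inputs)

# CNN block: local n-gram features via convolution and max-pooling.
x = layers.Conv1D(128, kernel_size=3, padding="same", activation="relu")(x)
x = layers.MaxPooling1D(pool_size=2)(x)

# Bi-LSTM captures the sequential dependencies the CNN alone misses.
x = layers.Bidirectional(layers.LSTM(64, return_sequences=True))(x)

# Simple dot-product self-attention over the Bi-LSTM states.
attn = layers.Attention()([x, x])
x = layers.GlobalMaxPooling1D()(attn)

outputs = layers.Dense(2, activation="softmax")(x)   # positive / negative
model = Model(inputs, outputs)
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```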

Cited by 54 publications (35 citation statements). References 53 publications (83 reference statements).
“…We collected this dataset about Afghanistan’s ( ) security status from 29/03/2018 to 21/06/2018. The dataset includes 56k tweets with positive, negative, and neutral classification (Kamyab et al., 2018; Kamyab, Liu & Adjeisah, 2021). However, we consider only positive and negative tweets from this dataset, given that this work explores binary classification problems.…”
Section: Methods
confidence: 99%
“…Currently, the sentiment140 dataset is one of the most widely used standard datasets for text classification, with 1,048,576 tweets automatically categorized into 248,576 positive and 80,000 negative. It has been employed in various SOTA studies (Song et al., 2020; Rezaeinia et al., 2019; Dos Santos & Gatti, 2014; Wang et al., 2016a; Dang, Moreno-García & De la Prieta, 2020; Nguyen & Nguyen, 2019; Basiri et al., 2021; Alec & Richa Bhayani, 2009; Kamyab, Liu & Adjeisah, 2021).…”
Section: Methods
confidence: 99%
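For readers who want to work with Sentiment140, a minimal loading sketch follows. It assumes the commonly distributed CSV layout (target, id, date, flag, user, text) with target 0 = negative and 4 = positive; the file name and exact copy may differ from those used in the cited studies.

```python
# Load the Sentiment140 CSV and map its 0/4 targets to binary labels.
# Assumes the commonly distributed column layout; file name is an assumption.
import pandas as pd

cols = ["target", "id", "date", "flag", "user", "text"]
df = pd.read_csv("training.1600000.processed.noemoticon.csv",
                 encoding="latin-1", names=cols)
df["label"] = (df["target"] == 4).astype(int)   # 1 = positive, 0 = negative
print(df["label"].value_counts())
```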
“…In addition, recurrent neural networks (RNNs) do not have long-term memory: they can only work on short-interval data, so training cannot adapt to data with long intervals or long durations [5], [12]. To overcome this long-term problem in RNNs, a variation of the RNN named long short-term memory (LSTM) was introduced [20]. This model achieves the capability of learning long-term dependencies by extending and optimizing the RNN model.…”
Section: Training Phase
confidence: 99%
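For context, the long-term memory the quote refers to comes from the LSTM's gated cell state. These are the standard gate equations in common textbook notation (not reproduced from the cited paper):

$$
\begin{aligned}
f_t &= \sigma(W_f x_t + U_f h_{t-1} + b_f) \\
i_t &= \sigma(W_i x_t + U_i h_{t-1} + b_i) \\
o_t &= \sigma(W_o x_t + U_o h_{t-1} + b_o) \\
\tilde{c}_t &= \tanh(W_c x_t + U_c h_{t-1} + b_c) \\
c_t &= f_t \odot c_{t-1} + i_t \odot \tilde{c}_t \\
h_t &= o_t \odot \tanh(c_t)
\end{aligned}
$$

The forget gate $f_t$ and input gate $i_t$ decide what the cell state $c_t$ retains and adds, so information can persist over long spans where a plain RNN's hidden state would forget it.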