2020 International Conference on Computational Performance Evaluation (ComPE)
DOI: 10.1109/compe49325.2020.9200062
LSTM based Deep RNN Architecture for Election Sentiment Analysis from Bengali Newspaper

Cited by 7 publications (7 citation statements)
References 7 publications
“…Bengali comments, text, etc., are largely informal data and therefore need substantial moderation [12]. Again, when classifying political news, problems arise in dealing with limited terms [13]. Compared with those, our proposed model works more efficiently, as it handles 10 categories, preprocesses the data with GloVe, and applies a filter-based analogy using advanced features of LSTM.…”
Section: Results (mentioning)
confidence: 99%
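The comparison above mentions GloVe-preprocessed input feeding an LSTM classifier over 10 categories. As a rough illustration only, and not the cited authors' actual architecture, a minimal Keras-style sketch of that kind of pipeline follows; the vocabulary size, sequence length, embedding dimension, and layer width are assumed values.

```python
# Illustrative sketch only: an LSTM text classifier over pretrained (e.g. GloVe)
# embeddings with 10 output categories. All sizes below are assumptions,
# not the configuration used in the cited papers.
import numpy as np
from tensorflow.keras import layers, models

vocab_size = 20000   # assumed vocabulary size
seq_len = 100        # assumed padded sequence length
embed_dim = 100      # GloVe vectors are commonly 100-dimensional
num_classes = 10     # the 10 categories mentioned in the citation

# Placeholder for a GloVe embedding matrix (rows indexed by token id).
glove_matrix = np.random.normal(size=(vocab_size, embed_dim)).astype("float32")

model = models.Sequential([
    layers.Embedding(vocab_size, embed_dim,
                     weights=[glove_matrix],   # load pretrained GloVe vectors
                     input_length=seq_len,
                     trainable=False),
    layers.LSTM(128),                          # assumed hidden size
    layers.Dense(num_classes, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```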
“…An RNN can use its internal state (memory) to process sequences of inputs. This applies in natural language processing (NLP) [15], speech recognition [25], music synthesis [27], and financial data processing [28]. The RNN computation is shown in Equations 2 and 3 below.…”
Section: TF-IDF (unclassified)
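The quoted passage points to the RNN computation in its Equations 2 and 3, which are not reproduced in this excerpt. For reference, the standard Elman-style recurrence those equations presumably denote is h_t = tanh(W_xh x_t + W_hh h_{t-1} + b_h) and y_t = W_hy h_t + b_y; the NumPy sketch below, with assumed names and sizes, illustrates one step.

```python
# Minimal sketch of one step of a vanilla (Elman) RNN:
#   h_t = tanh(W_xh @ x_t + W_hh @ h_prev + b_h)
#   y_t = W_hy @ h_t + b_y
# The equation numbers (2 and 3) in the cited paper are assumed to denote this
# standard form; names and dimensions here are illustrative.
import numpy as np

def rnn_step(x_t, h_prev, W_xh, W_hh, W_hy, b_h, b_y):
    h_t = np.tanh(W_xh @ x_t + W_hh @ h_prev + b_h)  # new hidden state
    y_t = W_hy @ h_t + b_y                           # output at this step
    return h_t, y_t

# Example with assumed sizes: 4-dimensional input, 8-dimensional hidden state.
rng = np.random.default_rng(0)
x_t, h_prev = rng.normal(size=4), np.zeros(8)
W_xh, W_hh, W_hy = rng.normal(size=(8, 4)), rng.normal(size=(8, 8)), rng.normal(size=(2, 8))
b_h, b_y = np.zeros(8), np.zeros(2)
h_t, y_t = rnn_step(x_t, h_prev, W_xh, W_hh, W_hy, b_h, b_y)
```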
“…The recall value can be calculated using Equation 7. [24] 82.48%; Naïve Bayes [24] 76.56%; Naive Bayes [25] 75.58%; SVM [25] 63.99%; KNN [25] 73.34%; Naïve Bayes [26] 74.2%; SVM [26] 81.2%; Random Forest [26] 72.5%; RNN [27] 91.9%; RNN [28] 85%; RNN [29] 95%; Naïve Bayes (TF-IDF) 80%; RNN (TF-IDF) 97.7%…”
Section: Performance (mentioning)
confidence: 99%
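The recall value referred to above is presumably the standard definition, recall = TP / (TP + FN), which the cited paper's Equation 7 is assumed to express. A minimal sketch with illustrative names and numbers:

```python
# Standard definition of recall, which Equation 7 in the cited paper presumably
# expresses: recall = TP / (TP + FN). Names and numbers below are illustrative.
def recall(true_positives: int, false_negatives: int) -> float:
    return true_positives / (true_positives + false_negatives)

# Example with arbitrary counts (not taken from the cited work): 90 relevant
# items correctly retrieved, 10 missed -> recall = 0.9.
print(recall(90, 10))  # 0.9
```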
“…Additionally, recurrent neural networks perform well for modeling time series. There have been many achievements in machine translation [37,38], sentiment analysis [39,40], stock prediction [41], and other fields. The significant temporal correlation of ship flooding fits well with recurrent neural networks.…”
Section: Introduction (mentioning)
confidence: 99%