2020 International Conference on Computational Performance Evaluation (ComPE)
DOI: 10.1109/compe49325.2020.9200054
Long Short Term Memory (LSTM) based Deep Learning for Sentiment Analysis of English and Spanish Data

Cited by 16 publications (6 citation statements)
References 0 publications
“…Given that each review of a product or service has a unique language, a single model may not be sufficient to accurately categorize other data. These findings show that sentiment analysis is very context- and subject-dependent [3].…”
Section: LSTM for Sentiment Analysis
mentioning confidence: 69%
“…A separate LSTM-based RNN model was created for each data set, with good results [3]. On top of the embeddings, the model uses an LSTM hidden layer followed by a dense layer containing a single neuron with a sigmoid activation function.…”
Section: LSTM for Sentiment Analysis
mentioning confidence: 99%
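The classifier head described in that citation can be sketched as follows; this is a minimal numpy illustration, assuming arbitrary dimensions and random weights (none of these values come from the paper):

```python
import numpy as np

# Hedged sketch: the final LSTM hidden state fed into a dense layer with
# a single neuron and a sigmoid activation, yielding a positive-sentiment
# probability. All shapes and weights below are illustrative assumptions.

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
hidden_size = 8

h_final = rng.standard_normal(hidden_size)  # final LSTM hidden state
W = rng.standard_normal(hidden_size)        # dense-layer weights (1 neuron)
b = 0.0                                     # dense-layer bias

p_positive = sigmoid(W @ h_final + b)       # sentiment probability in (0, 1)
print(0.0 < p_positive < 1.0)               # prints True
```

The single sigmoid neuron is what makes this a binary (positive/negative) classifier; the sigmoid squashes the dense layer's score into a probability.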
“…LSTM networks, a specialized form of recurrent neural networks (RNNs), were introduced in 1997 [11] and are designed to emulate the learning processes of the human brain by retaining memory of previous inputs and discerning pivotal elements from non-essential ones [12]. LSTMs have found applications in a myriad of fields, including dynamic system modeling [13], image processing, speech recognition, sentiment analysis [14], and time series prediction [15]. An archetypal LSTM network, as depicted in Figure 1 [16], is composed of three distinct gates: the input gate, forget gate, and output gate.…”
Section: Long Short-Term Memory
mentioning confidence: 99%
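One time step of a cell with the three gates named above (input, forget, output) can be sketched in numpy; the shapes, the dictionary-based parameter layout, and the random initialization are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

# Minimal sketch of one LSTM cell step with input, forget, and output
# gates. Real implementations learn the weights; here they are random.

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, b):
    """One time step. W/b map gate name -> parameters over [h_prev; x]."""
    z = np.concatenate([h_prev, x])
    f = sigmoid(W["f"] @ z + b["f"])        # forget gate: what to discard
    i = sigmoid(W["i"] @ z + b["i"])        # input gate: what to write
    o = sigmoid(W["o"] @ z + b["o"])        # output gate: what to expose
    c_tilde = np.tanh(W["c"] @ z + b["c"])  # candidate cell state
    c = f * c_prev + i * c_tilde            # updated cell (long-term) state
    h = o * np.tanh(c)                      # updated hidden (short-term) state
    return h, c

rng = np.random.default_rng(1)
n_in, n_hid = 4, 3
W = {k: rng.standard_normal((n_hid, n_hid + n_in)) * 0.1 for k in "fioc"}
b = {k: np.zeros(n_hid) for k in "fioc"}

h, c = lstm_step(rng.standard_normal(n_in), np.zeros(n_hid),
                 np.zeros(n_hid), W, b)
print(h.shape, c.shape)  # prints (3,) (3,)
```

The forget gate scales the old cell state, the input gate scales the candidate, and the output gate controls how much of the (tanh-squashed) cell state becomes the hidden state — exactly the three-gate structure the citation attributes to Figure 1.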
“…In the second stage, the two values are passed to the Tanh activation function, which maps the data between -1 and 1. The input gate's output, obtained by multiplying the outputs of the Tanh and sigmoid functions, modifies the cell state [43,44].…”
Section: LSTM
mentioning confidence: 99%
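The bounds that statement relies on can be checked numerically: tanh maps into (-1, 1), sigmoid into (0, 1), so their element-wise product (the input-gate contribution to the cell state) also stays within (-1, 1). The sample inputs below are arbitrary:

```python
import numpy as np

# Numeric check of the activation bounds behind the cell-state update.

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

x = np.linspace(-5.0, 5.0, 11)
candidate = np.tanh(x)        # candidate values, bounded in (-1, 1)
gate = sigmoid(x)             # input-gate activations, bounded in (0, 1)
update = gate * candidate     # contribution added to the cell state

print(candidate.min() > -1.0 and candidate.max() < 1.0)  # prints True
print(np.all(np.abs(update) < 1.0))                      # prints True
```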