2023
DOI: 10.1016/j.egyr.2023.09.175

Comparing Long Short-Term Memory (LSTM) and bidirectional LSTM deep neural networks for power consumption prediction

Davi Guimarães da Silva, Anderson Alvarenga de Moura Meneses

Cited by 27 publications (8 citation statements)
References 52 publications
“…Compared to the conventional unidirectional LSTM, the Bi-LSTM’s dual-directional learning framework enables the model to encapsulate a comprehensive temporal context, thereby enhancing its predictive accuracy for sequential data recovery tasks. References substantiating this preference include seminal works that highlight the superiority of Bi-LSTMs over single-direction LSTMs in various sequential data processing tasks [45, 46]. Upon scrutinizing the outcomes presented in Figure 5, the effective impact of the U-net-inspired skip connections and the adaptive Huber loss function in the proposed Bi-LSTM autoencoder is evident.…”
Section: Results
confidence: 99%
“…This is explained by the advantages of the Bi-LSTM, which allows information flow in both directions: a second LSTM layer processes the reversed sequence, and the outputs of both layers are combined, for example by average, sum, multiplication, or concatenation. The possibility of two flow directions enables a better learning process [16].…”
Section: B. Discussion
confidence: 99%
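As a concrete illustration of the bidirectional scheme this excerpt describes, here is a minimal pure-Python sketch. The toy scalar recurrence, its weights, and the `bidirectional` helper are illustrative stand-ins invented for this example, not code from the cited works; a real Bi-LSTM would use a full gated cell with learned weight matrices.

```python
import math

def run_rnn(seq, h0=0.0, w=0.5, u=0.3):
    """Toy recurrent pass: h_t = tanh(w*x_t + u*h_{t-1}).
    A stand-in for one LSTM layer; the scalar weights are arbitrary."""
    h, outs = h0, []
    for x in seq:
        h = math.tanh(w * x + u * h)
        outs.append(h)
    return outs

def bidirectional(seq, combine="concat"):
    """Run the layer forward and over the reversed sequence, then merge
    the two output streams as the excerpt describes."""
    fwd = run_rnn(seq)
    bwd = run_rnn(seq[::-1])[::-1]  # re-align backward outputs to original time order
    if combine == "concat":
        return [(f, b) for f, b in zip(fwd, bwd)]
    if combine == "sum":
        return [f + b for f, b in zip(fwd, bwd)]
    if combine == "avg":
        return [(f + b) / 2 for f, b in zip(fwd, bwd)]
    if combine == "mul":
        return [f * b for f, b in zip(fwd, bwd)]
    raise ValueError(combine)

x = [0.1, 0.4, -0.2, 0.3]
print(len(bidirectional(x, "concat")))  # 4: one (forward, backward) pair per time step
```

At every time step the merged output carries both past context (forward pass) and future context (backward pass), which is the "comprehensive temporal context" credited with the Bi-LSTM's accuracy gains.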
“…It can maintain its state over time and has nonlinear gating units that regulate the flow of information into and out of the unit. 19 Specifically, the LSTM model consists of the following four main components:…”
Section: Experimental and Machine Learning
confidence: 99%
“…Among them, LSTM (Long Short-Term Memory) is a particular type of RNN (Recurrent Neural Network) and a powerful tool that mitigates the long-term dependency and vanishing gradient problems, which are tricky issues in RNNs. 18,19 It has become an effective and scalable model for solving several learning problems related to sequential data. The core idea of the LSTM is to replace the summation unit in the hidden layer by introducing a storage unit.…”
Section: Ensemble Hybrid Model (EH)
confidence: 99%
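The gated storage unit mentioned in the excerpts above can be sketched in a few lines. This is a single LSTM step with arbitrary illustrative scalar weights and zero biases (assumptions for the example, not values from the paper); real implementations use learned weight matrices per gate.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def lstm_step(x, h_prev, c_prev):
    """One LSTM step. The memory cell c replaces the plain summation
    unit of a vanilla RNN hidden layer; scalar weights are illustrative."""
    f = sigmoid(0.4 * x + 0.4 * h_prev)    # forget gate: what to keep in the cell
    i = sigmoid(0.6 * x + 0.6 * h_prev)    # input gate: what new content to admit
    g = math.tanh(0.5 * x + 0.5 * h_prev)  # candidate cell update
    o = sigmoid(0.7 * x + 0.7 * h_prev)    # output gate: what to expose
    c = f * c_prev + i * g                 # additive update helps gradients survive long spans
    h = o * math.tanh(c)                   # gated output, always in (-1, 1)
    return h, c

h, c = 0.0, 0.0
for x in [0.2, -0.1, 0.4]:
    h, c = lstm_step(x, h, c)
print(h, c)
```

The four sigmoid/tanh components correspond to the "four main components" the excerpt refers to; the additive form of the cell update is what mitigates the vanishing gradient problem of plain RNNs.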