2022
DOI: 10.1007/s40095-022-00480-x
Deep learning-based load forecasting considering data reshaping using MATLAB/Simulink

Abstract: Load forecasting is a complex, nonlinear task that plays a key role in power system planning, operation, and control. A recent study proposed a deep learning approach called historical data augmentation (HDA) to improve the accuracy of load forecasting models by dividing the input data into several yearly sub-datasets. When the original data exhibits large time-step changes from one year to the next, the approach was found to be less effective than expected for long-term forecasting because t…

Cited by 6 publications (7 citation statements)
References 44 publications
“…An RNN architecture consists of neural networks with feedback. RNN has the capability of reaching the history value using the present scenario, but there exist gradient issues that vanish and explode [37]. The weight computation and updating in the training process are performed in proportion to the gradient of the error concerning the weight.…”
Section: Implementation Of Recurrent Neural Network
confidence: 99%
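The feedback structure and gradient behavior described in the statement above can be sketched in a few lines of NumPy. This is a minimal illustration, not code from the cited paper; the layer sizes and weight scales are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes and weights, for illustration only.
n_in, n_hidden = 3, 4
W_xh = rng.normal(scale=0.5, size=(n_hidden, n_in))      # input-to-hidden weights
W_hh = rng.normal(scale=0.5, size=(n_hidden, n_hidden))  # hidden-to-hidden (feedback) weights

def rnn_step(x_t, h_prev):
    """One recurrent step: the new hidden state mixes the current input
    with the previous state, so the network can reach history values."""
    return np.tanh(W_xh @ x_t + W_hh @ h_prev)

# Unroll over a toy sequence; the same feedback weights are reused each step.
h = np.zeros(n_hidden)
for t in range(10):
    h = rnn_step(rng.normal(size=n_in), h)

# Backpropagation through time multiplies the error gradient by W_hh
# (and the tanh derivative) once per step, so over T steps the gradient
# scales roughly like ||W_hh||^T: it vanishes when that norm is below 1
# and explodes when it is above 1 -- the gradient issue cited above.
```

The weight update is then proportional to this gradient, which is why very long unrolls make plain RNNs hard to train.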
“…LSTM is capable of holding long-term dependencies for a short duration [37]. The main working ideas of the LSTM are the ability of the delay cells to hold the data and the ability of the information flow gate to permit information flow to and from the memory units.…”
Section: Long Short-term Memory (LSTM)
confidence: 99%
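The gating mechanism described above — memory cells that hold data, with gates controlling flow into and out of them — can be sketched as a single LSTM step. This is a generic textbook formulation in NumPy, not the cited paper's implementation; the sizes and weights are hypothetical, and biases are omitted for brevity.

```python
import numpy as np

rng = np.random.default_rng(1)
n_in, n_hid = 3, 4

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical gate weights; each acts on the concatenated [input, prev hidden].
W_f, W_i, W_o, W_c = (rng.normal(scale=0.3, size=(n_hid, n_in + n_hid)) for _ in range(4))

def lstm_step(x_t, h_prev, c_prev):
    z = np.concatenate([x_t, h_prev])
    f = sigmoid(W_f @ z)  # forget gate: how much of the held memory to keep
    i = sigmoid(W_i @ z)  # input gate: how much new information enters the cell
    o = sigmoid(W_o @ z)  # output gate: how much of the cell is exposed
    c = f * c_prev + i * np.tanh(W_c @ z)  # the memory ("delay") cell holds the data
    h = o * np.tanh(c)
    return h, c

# Run over a toy sequence; the cell state c carries information across steps.
h, c = np.zeros(n_hid), np.zeros(n_hid)
for t in range(5):
    h, c = lstm_step(rng.normal(size=n_in), h, c)
```

Because the cell state is updated additively (gated by `f` and `i`) rather than through repeated matrix multiplication, gradients flow across many steps more easily than in a plain RNN.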