2019
DOI: 10.3390/en12163199

Research on Short-Term Load Prediction Based on Seq2seq Model

Abstract: Electricity load prediction is the primary basis on which power-related departments make logical and effective generation plans and scientific scheduling plans for the most effective power utilization. The perpetual evolution of deep learning has introduced advanced and innovative concepts for short-term load prediction. Taking into consideration the time and nonlinear characteristics of power system load data, and further considering the impact of historical and future information on the current state, thi…

Cited by 44 publications (19 citation statements)
References 21 publications
“…Seq2seq models: The sequence-to-sequence models were initially used in NLP, but their applicability has spread to almost any time-series forecasting problem. Until now, these models have been little applied to STLF, and their results have been good for very short-term forecasts [52].…”
Section: Methods
confidence: 99%
“…The ML and DMD models expect a sequence of scalar values (longitudinal data) as input; the way to transform the input data for these models is to flatten the vectors over all time-steps. The DL models can receive vector-valued inputs, i.e., both LSTM [52] and 1D/2D CNN [51] models can receive a vector-valued sequence (with length p) where each time-step is represented by a vector of values. When the first layer of the DL model is an FC layer, the input must be formatted as a vector (flattened), and when the first layer is a 2D-convolutional (2D-conv) layer, the data must be formatted as a matrix by packing the predictors (in columns) for all past time-steps (in rows).…”
confidence: 99%
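The input layouts described in the excerpt above can be made concrete with a short sketch. This is a minimal illustration, not code from any of the cited papers: the window length p, the number of predictors f, and the array names are assumptions. It only shows how one sliding window is flattened for classical ML/DMD or FC inputs, kept as a (time-steps, features) sequence for an LSTM or 1D CNN, and packed as a time-steps-by-predictors matrix for a 2D-convolutional first layer.

```python
import numpy as np

# Hypothetical sliding window: p past time-steps, each described by f predictors
p, f = 24, 5
window = np.random.rand(p, f)           # shape (p, f): rows = time-steps, cols = predictors

# Classical ML / DMD / FC first layer: flatten the vectors over all time-steps
x_flat = window.reshape(-1)             # shape (p * f,)

# LSTM / 1D CNN first layer: keep the vector-valued sequence, add a batch axis
x_seq = window[np.newaxis, :, :]        # shape (1, p, f): (batch, time-steps, features)

# 2D-conv first layer: predictors in columns, past time-steps in rows,
# with an extra channel axis
x_img = window[np.newaxis, np.newaxis, :, :]  # shape (1, 1, p, f): (batch, channel, rows, cols)

print(x_flat.shape, x_seq.shape, x_img.shape)
```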
“…A sequence-to-sequence RNN with an attention mechanism was developed for electric load forecasting in recent research [15], and a similar sample generation method was designed. In [16], an LSTM-based short-term load forecast model with two mechanisms is built. Their approach is similar to the language translation model in [2], where one LSTM is used to encode the input sequence into a fixed vector and a separate LSTM is then needed to decode that vector into a sequence of outputs, with an attention mechanism to learn the weights.…”
Section: Related Work
confidence: 99%
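The encoder-decoder arrangement paraphrased in the excerpt above can be sketched in a few lines. The following is a minimal illustration, not the model from [16] or [2]: the attention mechanism is omitted, and all names, layer sizes, and the forecast horizon are assumptions. One LSTM compresses the historical load window into its final hidden state, and a second LSTM unrolls the forecast from that fixed summary, feeding each prediction back as the next decoder input.

```python
import torch
import torch.nn as nn

class Seq2SeqLoadForecaster(nn.Module):
    """Encoder-decoder LSTM sketch: encode past load values, decode future steps."""

    def __init__(self, n_features=1, hidden_size=64, horizon=24):
        super().__init__()
        self.horizon = horizon
        self.encoder = nn.LSTM(n_features, hidden_size, batch_first=True)
        self.decoder = nn.LSTM(1, hidden_size, batch_first=True)
        self.proj = nn.Linear(hidden_size, 1)

    def forward(self, x):
        # x: (batch, past_steps, n_features) window of historical load data
        _, (h, c) = self.encoder(x)          # fixed-size summary of the input sequence
        dec_in = x[:, -1:, :1]               # seed the decoder with the last observed load
        outputs = []
        for _ in range(self.horizon):        # unroll the forecast one step at a time
            out, (h, c) = self.decoder(dec_in, (h, c))
            step = self.proj(out)            # (batch, 1, 1) predicted load value
            outputs.append(step)
            dec_in = step                    # feed the prediction back as the next input
        return torch.cat(outputs, dim=1)     # (batch, horizon, 1)

# Usage: forecast 24 future steps from a 168-step (one-week, hourly) load window
model = Seq2SeqLoadForecaster()
past = torch.randn(8, 168, 1)
print(model(past).shape)                     # torch.Size([8, 24, 1])
```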
“…The method successfully captures the dependencies between these sequences. A Seq2seq short-term load forecast model based on LSTM is developed in [16]. Two deep learning methods were proposed for electric load forecasting in [17].…”
Section: Introduction
confidence: 99%