2019 IEEE Milan PowerTech
DOI: 10.1109/ptc.2019.8810645

Sequence to sequence deep learning models for solar irradiation forecasting

Cited by 23 publications (13 citation statements)
References 14 publications
“…As a result, some studies introduced encoder-decoder models for MTSF, consisting of two RNNs for feature encoding and decoding, respectively. For example, Mukhoty et al. [36] used two LSTMs as the encoder and the decoder, respectively, to build a seq2seq model for multi-step solar irradiation forecasting and achieved excellent prediction accuracy. Liang et al. [37] demonstrated the superiority of the encoder-decoder architecture over a single RNN through comparison experiments on two environmental-quality datasets.…”
Section: A. Related Work
confidence: 99%
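
For reference, the LSTM encoder-decoder pattern this statement describes can be sketched as follows. This is a minimal illustrative PyTorch sketch, not the implementation from [36]; the layer sizes, forecast horizon, and the simple recursive decoding loop are assumptions.

import torch
import torch.nn as nn

class Seq2SeqForecaster(nn.Module):
    """Minimal LSTM encoder-decoder for multi-step (seq2seq) forecasting."""

    def __init__(self, n_features=1, hidden_size=64, horizon=24):
        super().__init__()
        self.horizon = horizon
        # Encoder LSTM summarizes the observed history into its final states.
        self.encoder = nn.LSTM(n_features, hidden_size, batch_first=True)
        # Decoder LSTM is unrolled one step per forecast horizon.
        self.decoder = nn.LSTM(n_features, hidden_size, batch_first=True)
        self.proj = nn.Linear(hidden_size, n_features)

    def forward(self, history):
        # history: (batch, input_steps, n_features)
        _, (h, c) = self.encoder(history)            # fixed-length summary of the history
        step = history[:, -1:, :]                    # seed the decoder with the last observation
        outputs = []
        for _ in range(self.horizon):
            out, (h, c) = self.decoder(step, (h, c))
            step = self.proj(out)                    # next-step prediction
            outputs.append(step)
        return torch.cat(outputs, dim=1)             # (batch, horizon, n_features)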
“…For sequence modeling problems, Seq2Seq (Sutskever et al., 2014) is the canonical deep learning framework; although originally applied to neural machine translation (NMT) tasks, it has since been adapted to time series forecasting (Nascimento et al., 2019; Yu et al., 2017; Gasparin et al., 2019; Mukhoty et al., 2019; Wen et al., 2017; Salinas et al., 2020; Wen and Torkkola, 2019). The MQ-Forecaster framework (Wen et al., 2017) solves (1) above by treating each series i as a sample from a joint stochastic process and feeding it into a neural network which predicts Q quantiles for each horizon.…”
Section: Time Series Forecasting
confidence: 99%
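
The per-horizon quantile prediction mentioned for the MQ-Forecaster framework is typically trained with a pinball (quantile) loss. A minimal sketch follows; the quantile levels and tensor shapes are illustrative assumptions, not the settings of Wen et al. (2017).

import torch

def pinball_loss(preds, target, quantiles=(0.1, 0.5, 0.9)):
    # preds:  (batch, horizon, n_quantiles), one output per quantile level
    # target: (batch, horizon)
    losses = []
    for i, q in enumerate(quantiles):
        err = target - preds[..., i]
        # Under-prediction is weighted by q, over-prediction by (1 - q).
        losses.append(torch.maximum(q * err, (q - 1) * err).mean())
    return torch.stack(losses).sum()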
“…Recent work applying deep learning to time-series forecasting focuses primarily on recurrent and convolutional architectures (Nascimento et al., 2019; Yu et al., 2017; Gasparin et al., 2019; Mukhoty et al., 2019; Wen et al., 2017). These are Seq2Seq architectures (Sutskever et al., 2014), which consist of an encoder that takes an input sequence and summarizes it into a fixed-length context vector, and a decoder that produces an output sequence.…”
Section: Introduction
confidence: 99%
“…Sequence-to-sequence (S2S) DL models were used to forecast GHI 24 h ahead. The results revealed the superiority of LSTM over FFNN and gradient-boosted regression tree algorithms [18]. The method is generally easy to implement, as it requires only historical GHI data.…”
Section: Introduction
confidence: 95%
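
As a usage illustration of the encoder-decoder sketch shown earlier, a 24-h ahead GHI forecast from a univariate hourly history could be produced as follows; the data and shapes here are synthetic assumptions, not the experimental setup of [18].

import torch

# Reuses the Seq2SeqForecaster sketch defined above; data is synthetic.
model = Seq2SeqForecaster(n_features=1, hidden_size=64, horizon=24)
history = torch.randn(8, 168, 1)     # batch of 8 series, 168 past hourly GHI values
forecast = model(history)            # -> (8, 24, 1): next 24 hourly GHI predictions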