2019 International Conference on Data Mining Workshops (ICDMW) 2019
DOI: 10.1109/icdmw.2019.00032
Spatiotemporal Attention Networks for Wind Power Forecasting

Abstract: Wind power is one of the most important renewable energy sources, and accurate wind power forecasting is significant for reliable and economical power system operation and control strategies. This paper proposes a novel framework with spatiotemporal attention networks (STAN) for wind power forecasting. The model captures spatial correlations among wind farms and temporal dependencies of wind power time series. First, we employ a multi-head self-attention mechanism to extract spatial correlations amon…
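The abstract's key spatial component can be illustrated with a minimal numpy sketch of multi-head self-attention, treating each wind farm as one token so the attention weights mix features across farms. All names, dimensions, and weight initializations here are illustrative assumptions, not the paper's actual architecture or code.

```python
import numpy as np

def multi_head_self_attention(X, Wq, Wk, Wv, n_heads):
    """Scaled dot-product self-attention over wind farms.

    X: (n_farms, d_model) -- one feature vector per wind farm at a time step.
    Returns (n_farms, d_model): each farm's representation becomes an
    attention-weighted mixture over all farms (spatial correlation).
    """
    n, d = X.shape
    d_head = d // n_heads
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    out = np.empty_like(Q)
    for h in range(n_heads):
        s = slice(h * d_head, (h + 1) * d_head)
        scores = Q[:, s] @ K[:, s].T / np.sqrt(d_head)  # (n_farms, n_farms)
        scores -= scores.max(axis=-1, keepdims=True)    # numerical stability
        w = np.exp(scores)
        w /= w.sum(axis=-1, keepdims=True)              # row-wise softmax
        out[:, s] = w @ V[:, s]                         # mix farm features
    return out

# Toy example: 5 farms, 8-dim features, 2 heads (all hypothetical sizes).
rng = np.random.default_rng(0)
n_farms, d_model, n_heads = 5, 8, 2
X = rng.normal(size=(n_farms, d_model))
Wq, Wk, Wv = (rng.normal(size=(d_model, d_model)) for _ in range(3))
Y = multi_head_self_attention(X, Wq, Wk, Wv, n_heads)
print(Y.shape)  # (5, 8)
```

In a full model this spatial block would be stacked with a temporal module over the wind power time series; here only the farm-to-farm mixing step is shown.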

Cited by 23 publications (11 citation statements)
References 19 publications
“…As shown in Table 8, in multivariate prediction, the prediction performance of Transformer-based methods other than Reformer [56], such as LogTrans [52] and Informer [55], outperformed RNN-based methods such as LSTMa [5]; the performance of TCN [22] further outperformed the Transformer-based methods. Compared with these methods, the SCINet model achieved better performance, because its downsample-convolve-interact architecture enabled multi-resolution analysis, which facilitated extracting temporal relation features with enhanced predictability. Overall, in this paper, as shown in all subtasks with ETT data, the prediction performance using SFINet was better than that using SCINet; see the relative performance improvement given by RIP, shown in green.…”
Section: Prediction Experiments and Results Analysis
confidence: 99%
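The "downsample-convolve-interact" idea quoted above can be sketched in a toy form: the series is split into even- and odd-indexed sub-series (halving the resolution), and each branch is then modulated by a convolution of the other. The function names and the trivial one-tap kernel are hypothetical stand-ins for SCINet's learned convolution modules, not its actual implementation.

```python
import numpy as np

def split_even_odd(x):
    """SCINet-style downsampling: split a series into two half-rate
    sub-series so the next level sees the signal at coarser resolution."""
    return x[0::2], x[1::2]

def interact(even, odd, kernel):
    """Toy 'interact' step: each branch is adjusted by a convolution of
    the other branch (stand-in for SCINet's learned conv modules)."""
    conv = lambda s: np.convolve(s, kernel, mode="same")
    return even + conv(odd), odd - conv(even)

x = np.arange(8, dtype=float)      # a tiny time series 0..7
even, odd = split_even_odd(x)      # [0,2,4,6] and [1,3,5,7]
e2, o2 = interact(even, odd, np.array([0.5]))
print(even.tolist(), odd.tolist())
```

Stacking this split-and-interact step recursively is what gives the multi-resolution view the citing authors credit for SCINet's accuracy.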
“…Therefore, various Transformer-based TSF methods were presented in [51], as shown in Figure 1b. The multi-head self-attention mechanism is used to extract the spatial correlation between wind farms [52]. Models based on convolutional neural networks (CNNs), such as temporal convolutional networks (TCNs), are also used in time series forecasting (TSF) [53,54].…”
Section: Deep Learning-based Wind Power Forecasting
confidence: 99%
“…Attention was first introduced by Bahdanau et al [22], and has become an integral part of sequence modelling using DL. With regards to wind, attention mechanisms have mainly been used to extract temporal tendencies in time-series data, and thereby improve RNN-based forecasting models [23,24,25]. Due to the success of attention-based sequence models, there has been a surge in the application of such techniques to other DL domains, such as the development of Graph Attention Networks [26,27,28,29].…”
Section: Introduction
confidence: 99%
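The Bahdanau-style attention referenced above, as typically used to extract temporal tendencies from RNN hidden states, can be sketched as additive attention over the encoder states of a wind time series. The weight matrices and sizes below are illustrative assumptions, not taken from any of the cited models.

```python
import numpy as np

def additive_attention(H, q, Wh, Wq, v):
    """Bahdanau-style additive attention over RNN hidden states.

    H: (T, d) encoder hidden states over T time steps.
    q: (d,)   decoder query state.
    Returns the context vector (d,) and the attention weights (T,).
    """
    scores = np.tanh(H @ Wh + q @ Wq) @ v  # (T,) alignment scores
    scores -= scores.max()                 # numerical stability
    a = np.exp(scores)
    a /= a.sum()                           # softmax over time steps
    return a @ H, a                        # weighted sum of hidden states

# Toy example: 6 time steps, 4-dim hidden states (hypothetical sizes).
rng = np.random.default_rng(1)
T, d = 6, 4
H = rng.normal(size=(T, d))
q = rng.normal(size=d)
Wh, Wq = rng.normal(size=(d, d)), rng.normal(size=(d, d))
v = rng.normal(size=d)
context, weights = additive_attention(H, q, Wh, Wq, v)
print(context.shape)  # (4,)
```

The weights sum to one over the time axis, so the context vector emphasizes the time steps most relevant to the current prediction; this is the temporal-tendency extraction the citing papers describe.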
“…Since then, this mechanism has been applied to other tasks, including speech emotion recognition [24], music generation [25], and human action recognition [26]. Interestingly, this mechanism can take both temporal and spatial dependencies [23,27] of a sequence of inputs into account.…”
Section: Introduction
confidence: 99%