2021
DOI: 10.1016/j.aej.2020.10.045

Multistep short-term wind speed prediction using nonlinear auto-regressive neural network with exogenous variable selection

Cited by 34 publications (15 citation statements)
References 26 publications

“…The third and last scenario is designed to show the performance of the optimizing ensemble-based AD-PSO-Guided WOA algorithm compared with PSO [18], WOA [22], GA [23], GWO [19], HHO [25], [26], MPA [27], ChOA [28], and SMA [29]. The AD-PSO-Guided WOA algorithm ensemble model is also compared with four deep learning techniques including TDNN [30], DNN [31], SAE [32], and BRNN [33].…”
Section: Comparisons Scenario (mentioning)
confidence: 99%
“…The AD-PSO-Guided WOA algorithm ensemble model is compared with the state-of-the-art optimization techniques including PSO [18],WOA [22], GA [23], GWO [19], Harris Hawks Optimization (HHO) [25], [26], Marine Predators Algorithm (MPA) [27], Chimp Optimization Algorithm (ChOA) [28], and Slime Mould Algorithm (SMA) [29]. The AD-PSO-Guided WOA algorithm ensemble model is also compared with the state-of-the-art deep learning techniques including Time delay neural network (TDNN) [30], Deep Neural Networks (DNN) [31], Stacked Denoising Autoencoder (SAE) [32], and Bidirectional Recurrent Neural Networks (BRNN) [33]. The statistical analysis of different tests is performed to confirm the accuracy of the algorithm, including Wilcoxon's rank-sum and one-way analysis of variance (ANOVA).…”
Section: Introduction (mentioning)
confidence: 99%
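The quoted statement above names Wilcoxon's rank-sum test and one-way ANOVA as the statistical checks on the algorithm comparison. The sketch below shows, under stated assumptions, how such tests can be run with SciPy; the per-run error arrays (rmse_guided_woa, rmse_pso, rmse_gwo) are hypothetical placeholders, not results from the cited papers.

```python
# Minimal sketch (hypothetical data): the statistical tests named in the quote,
# applied to per-run error scores of three optimizers.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
rmse_guided_woa = rng.normal(0.85, 0.05, size=30)  # hypothetical RMSE per run
rmse_pso = rng.normal(0.92, 0.06, size=30)         # hypothetical RMSE per run
rmse_gwo = rng.normal(0.90, 0.05, size=30)         # hypothetical RMSE per run

# Wilcoxon's rank-sum test between two algorithms
stat, p_ranksum = stats.ranksums(rmse_guided_woa, rmse_pso)
print(f"rank-sum statistic = {stat:.3f}, p = {p_ranksum:.4f}")

# One-way ANOVA across all three algorithms
f_stat, p_anova = stats.f_oneway(rmse_guided_woa, rmse_pso, rmse_gwo)
print(f"ANOVA F = {f_stat:.3f}, p = {p_anova:.4f}")
```

A small p-value in either test would indicate that the differences between the algorithms' error distributions are unlikely to arise by chance, which is the role these tests play in the quoted comparison.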
“…• More operations are required with few algorithms employed (refs. 28–40). • Premature and delayed convergence due to the global stuck of the algorithms (refs. 64–67). • Loss of information and henceforth lacks in prediction accuracy (refs. 14, 23, 24, 49).…”
Section: Introduction (mentioning)
confidence: 99%
“…[24][25][26] • Difficult to maintain the stability of the algorithms. [61][62][63][64][65] • More operations are required with few algorithms employed. [28][29][30][31][32][33][34][35][36][37][38][39][40] • Premature and delayed convergence due to the global stuck of the algorithms.…”
(mentioning)
confidence: 99%
“…A major research line has been dedicated to time series analysis. In this line, exponential smoothing models (Gardner Jr and Everette, 2006;Bergmeir et al, 2016), Box and Jenkins models (Box et al, 2015) and nonlinear autoregressive neural networks (Yu et al, 2014;Wang et al, 2019;Noman et al, 2020) are essentially devoted to forecasting. In addition to the forecasting goal, regime-switching autoregressive models (Ubilava and Helmers, 2013;Hamilton, 1990) also allow to discover hidden behaviors of such systems.…”
Section: Introduction (mentioning)
confidence: 99%
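Several of the quoted statements cite the indexed paper for its nonlinear autoregressive neural network with exogenous inputs (NARX). A minimal sketch of that idea follows: the next wind speed value is regressed on lagged wind speeds plus a lagged exogenous variable through a small neural network. The synthetic data and the use of scikit-learn's MLPRegressor are illustrative assumptions, not the authors' implementation.

```python
# Minimal NARX-style sketch (synthetic data, illustrative only): predict the next
# wind speed from lagged wind speeds and a lagged exogenous variable.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
n = 1000
temp = rng.normal(20.0, 5.0, n)                    # hypothetical exogenous series
wind = 5.0 + 0.1 * temp + rng.normal(0.0, 0.5, n)  # hypothetical wind speed series

lags = 3
X, y = [], []
for t in range(lags, n):
    # NARX input at time t: past wind speeds and past exogenous values
    X.append(np.r_[wind[t - lags:t], temp[t - lags:t]])
    y.append(wind[t])
X, y = np.array(X), np.array(y)

# Nonlinear map: a small multilayer perceptron stands in for the NARX network
model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
model.fit(X[:800], y[:800])
print("hold-out MSE:", np.mean((model.predict(X[800:]) - y[800:]) ** 2))
```

Multistep prediction, as in the paper's title, would then feed each one-step forecast back in as a lagged input for the next step.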