2015
DOI: 10.1109/access.2015.2485943

Short-Term Electric Load Forecasting Using Echo State Networks and PCA Decomposition

Abstract: In this paper, we approach the problem of forecasting a time series (TS) of an electrical load measured on the Azienda Comunale Energia e Ambiente (ACEA) power grid, the company managing the electricity distribution in Rome, Italy, with an echo state network (ESN) considering two different leading times of 10 min and 1 day. We use a standard approach for predicting the load in the next 10 min, while, for a forecast horizon of one day, we represent the data with a high-dimensional multi-variate TS, where the nu…
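The abstract describes representing the load data as a high-dimensional multivariate TS and reducing it with PCA. A minimal sketch of that reduction step, on synthetic daily load profiles (the data, sampling rate, and number of retained components here are illustrative assumptions, not values from the paper):

```python
import numpy as np

# Hypothetical setup: one year of load sampled every 10 minutes,
# reshaped into a (days x 144) matrix of daily profiles -- the kind of
# high-dimensional multivariate representation the abstract mentions.
rng = np.random.default_rng(0)
t = np.arange(144) / 144.0
base = 1.0 + 0.5 * np.sin(2 * np.pi * t)           # idealized daily load shape
X = base + 0.05 * rng.standard_normal((365, 144))  # 365 noisy daily profiles

# PCA via SVD of the mean-centered matrix.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)

# Keep the k leading components; each day becomes a k-dim score vector.
k = 3
scores = Xc @ Vt[:k].T                     # (365, k) reduced representation
X_rec = scores @ Vt[:k] + X.mean(axis=0)   # approximate reconstruction

print(scores.shape)  # low-dimensional series to feed the forecaster
```

A forecaster can then be trained on the low-dimensional score series and its predictions mapped back through the retained components, which is the general PCA-decomposition pattern the title refers to.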

Cited by 126 publications (66 citation statements); References 44 publications.
“…These problems make FNN unsuitable for the local STLF task. On the other hand, recurrent neural networks such as echo state networks and long short term memory networks are widely adopted in STLF [4]. However, these architectures are not considered in this work, as we focus on model-based approaches.…”
Section: Introduction
confidence: 99%
“…], [14], [25], [42], [45], [78], [101], [110], [122], [124], [132], [156], [164], [171], [178], [187], [210], [211], [213], [228], [237], [242], [256]), Support Vector Machines (SVM) (21) ( [? ], [36], [53], [57], [65], [78], [79], [106], [115], [117], [122], [157], [159], [166], [187], [193], [203], [227], [240], [253], [256]), autoregressive integrated moving average (ARIMA) (13) ( [6], [19], [32], [42], [53], [78], [90],…”
Section: SMS Results
confidence: 99%
“…Some adjustments done on the ESN itself include: adding white noise into internal neurons [27], introducing leaky integrator neurons [28], performing ridge regression instead of linear regression [29], clustering the internal neurons [1], etc. Meanwhile, several attempts have been made to combine ESN with other algorithms, such as decision tree [15], principal component analysis (PCA) [14], and the Bayesian method [12], and a hierarchical architecture of ESN was brought forward [30]. These improvements have been applied into numerous scenarios and achieve good results.…”
Section: Improvements in ESN
confidence: 99%
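The statement above lists common ESN adjustments, including leaky integrator neurons and a ridge-regression readout in place of plain linear regression. A minimal sketch combining those two adjustments on a toy next-step prediction task (reservoir size, leak rate, regularization, and the sine-wave task are all illustrative assumptions, not settings from any cited paper):

```python
import numpy as np

# Leaky-integrator ESN with a ridge-regression readout (a sketch of the
# adjustments mentioned above, not any specific paper's implementation).
rng = np.random.default_rng(42)
n_res, leak, ridge = 200, 0.3, 1e-6

# Random input and reservoir weights; rescale W to spectral radius < 1
# so the echo state property plausibly holds.
W_in = rng.uniform(-0.5, 0.5, n_res)
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))

# Toy one-step-ahead task on a sine wave.
u = np.sin(0.1 * np.arange(1001))
states, x = [], np.zeros(n_res)
for t in range(1000):
    # Leaky integrator update: blend the old state with the new activation.
    x = (1 - leak) * x + leak * np.tanh(W_in * u[t] + W @ x)
    states.append(x.copy())
states = np.array(states)   # (1000, n_res) reservoir states
y = u[1:1001]               # next-step targets

# Ridge regression readout instead of ordinary least squares.
washout = 100               # discard initial transient states
A, b = states[washout:], y[washout:]
W_out = np.linalg.solve(A.T @ A + ridge * np.eye(n_res), A.T @ b)

pred = A @ W_out
rmse = float(np.sqrt(np.mean((pred - b) ** 2)))
print(rmse)
```

The ridge term keeps the readout weights small when reservoir states are highly correlated, which is the usual motivation for preferring it over plain linear regression in ESN training.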
“…In most existing frameworks, the learning and training of RNN/ESN are processed in a centralized way [6,8,[12][13][14]18,19]. However, in the era of Big Data, centralized data storage becomes technologically unsuitable [20]; as a result, solutions relying on centralized training in ML could face challenges such as single point of failure, communication bottlenecks, training consistency, etc.…”
Section: Introduction
confidence: 99%