Proceedings of the 7th Annual Neuro-Inspired Computational Elements Workshop 2019
DOI: 10.1145/3320288.3320303
Analysis of Wide and Deep Echo State Networks for Multiscale Spatiotemporal Time Series Forecasting

Abstract: Echo state networks are computationally lightweight reservoir models inspired by the random projections observed in cortical circuitry. As interest in reservoir computing has grown, networks have become deeper and more intricate. While these networks are increasingly applied to nontrivial forecasting tasks, there is a need for comprehensive performance analysis of deep reservoirs. In this work, we study the influence of partitioning neurons given a budget and the effect of parallel reservoir pathways across di…
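The reservoir model the abstract refers to can be sketched with the standard leaky-integrator echo state network update. This is a minimal illustration, not the paper's implementation; all names and hyperparameter values below are assumptions chosen for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

n_inputs, n_reservoir = 1, 100
leak_rate = 0.3          # leaky-integrator mixing coefficient (illustrative)
spectral_radius = 0.9    # rescale W below 1 so the echo state property is plausible

# Fixed random input and recurrent weights (never trained in an ESN).
W_in = rng.uniform(-0.5, 0.5, (n_reservoir, n_inputs))
W = rng.uniform(-0.5, 0.5, (n_reservoir, n_reservoir))
W *= spectral_radius / max(abs(np.linalg.eigvals(W)))

def step(x, u):
    """One leaky-integrator update: x' = (1 - a) x + a * tanh(W_in u + W x)."""
    return (1 - leak_rate) * x + leak_rate * np.tanh(W_in @ u + W @ x)

# Drive the reservoir with a toy sinusoidal input; only a linear readout
# over the collected states would be trained in practice.
x = np.zeros(n_reservoir)
for t in range(50):
    x = step(x, np.array([np.sin(0.1 * t)]))

print(x.shape)  # (100,)
```

Because the update is a convex combination of the previous state and a tanh nonlinearity, the state stays bounded, which is what makes the random, untrained reservoir usable as a feature expansion for a simple trained readout.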

Cited by 6 publications (4 citation statements) · References 33 publications
“…Further, the reservoirs in different modules can have cross connections, as shown in Figure 3 (Cross DeepESN). Based on the above connection structures, Carmichael et al. [109,110] designed a DeepESN with a modular architecture (Mod-DeepESN), which allows for varying topologies of deep ESNs.…”
Section: Designs (citation type: mentioning)
Confidence: 99%
“…The structure of the deep ESN attempts to learn multiscale temporal interactions, as we saw in the hidden units for the settling pattern prediction (Carmichael et al., 2019).…”
Section: Discussion (citation type: mentioning)
Confidence: 99%
“…The single-reservoir ESN (1.4) cannot learn instantaneous relationships well (i.e., the relationship between now and the current inputs). However, deepening and widening the ESN improves the learning of important multiscale features, which contributes to its success in forecasting problems (Ma et al., 2017; Carmichael et al., 2019; McDermott and Wikle, 2019b). Widening the ESN provides an ensemble of feedforward reservoirs.…”
Section: Leaky Integrator ESN (citation type: mentioning)
Confidence: 99%
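The "widening" idea in the quote above — splitting a fixed neuron budget across parallel reservoirs whose states are concatenated for a single readout — can be sketched as follows. This is a hypothetical illustration of the general technique; the partition sizes and parameters are not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def make_reservoir(n, n_in, rho=0.9):
    """Fixed random weights for one reservoir, rescaled to spectral radius rho."""
    W_in = rng.uniform(-0.5, 0.5, (n, n_in))
    W = rng.uniform(-0.5, 0.5, (n, n))
    W *= rho / max(abs(np.linalg.eigvals(W)))
    return W_in, W

# Split a 120-neuron budget evenly across 3 parallel reservoir pathways.
budget, n_parallel, n_in = 120, 3, 1
size = budget // n_parallel
reservoirs = [make_reservoir(size, n_in) for _ in range(n_parallel)]
states = [np.zeros(size) for _ in range(n_parallel)]

def update(states, u, leak=0.3):
    """Advance every parallel reservoir independently by one leaky step."""
    return [(1 - leak) * x + leak * np.tanh(W_in @ u + W @ x)
            for x, (W_in, W) in zip(states, reservoirs)]

for t in range(20):
    states = update(states, np.array([np.sin(0.2 * t)]))

# A single linear readout would be trained on the concatenated states,
# so all pathways contribute features jointly.
features = np.concatenate(states)
print(features.shape)  # (120,)
```

Because each pathway has independent random weights, the concatenated state behaves like an ensemble of reservoirs, which is one way to read the "wide" axis studied in the paper.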
“…Going in the same direction, the Deep ESN proposed by (GALLICCHIO; MICHELI, 2016) gives a model that splits the reservoir into smaller networks forming interconnected layers, analogous to the different scales of brain structure. All these works show that better performance can be achieved using a clustered network rather than a random network (DETTORI et al., 2020; CARMICHAEL; SYED; KUDITHIPUDI, 2019). Therefore, in this work, we will explore the flexibility of the reservoir by replacing the random network with a clustered network, since previous works show that an improvement in performance can be achieved with these non-random network topologies.…”
Section: List Of Tables (citation type: mentioning)
Confidence: 99%