2007
DOI: 10.1016/j.neunet.2007.04.003

An experimental unification of reservoir computing methods

Cited by 875 publications (667 citation statements). References 20 publications.
“…This point has been widely discussed and confirmed by studies on the network dynamics proving that a spectral radius close to 1 is an optimal value. However, we share the opinion of Verstraeten et al. [49], who claim that "for spiking neurons it has no influence at all". In [43] Steil shows that a learning rule based on IP has the effect of expanding the eigenvalues away from the center of the unit disk.…”
Section: Article In Press (supporting)
confidence: 85%
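The role of the spectral radius mentioned in the statement above can be made concrete with a short sketch. The following Python example is a hypothetical illustration using numpy (not code from the cited works): it builds a random reservoir weight matrix and rescales it so that its spectral radius sits just below 1, the regime the citing paper describes as optimal for non-spiking reservoirs.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

def make_reservoir(n_units=100, target_radius=0.95, density=0.1):
    """Build a sparse random reservoir matrix and rescale it to a
    target spectral radius (largest absolute eigenvalue)."""
    W = rng.uniform(-0.5, 0.5, size=(n_units, n_units))
    W[rng.random((n_units, n_units)) > density] = 0.0  # sparsify

    current_radius = np.abs(np.linalg.eigvals(W)).max()
    # Eigenvalues scale linearly with the matrix, so this rescaling
    # sets the spectral radius exactly to target_radius.
    return W * (target_radius / current_radius)

W_res = make_reservoir()
print(np.abs(np.linalg.eigvals(W_res)).max())  # ~0.95
```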
“…Then, the learning side is concentrated on the readout part. This approach is based on the empirical observation that, under certain hypotheses, a learning process restricted to the readout weights is often sufficient to obtain excellent performance in many learning tasks [5]-[8].…”
Section: Description Of The Methods Proposed (mentioning)
confidence: 99%
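A minimal sketch of this readout-only training idea, assuming a standard ESN-style setup (the variable names and the ridge-regression readout below are illustrative choices, not taken from the cited papers): the fixed random reservoir is driven by the input sequence, its states are collected, and only the linear readout weights are fitted.

```python
import numpy as np

rng = np.random.default_rng(seed=1)

def run_reservoir(inputs, W_res, W_in, leak=1.0):
    """Drive a fixed random reservoir with an input sequence and
    collect its state at every time step (no training happens here)."""
    x = np.zeros(W_res.shape[0])
    states = []
    for u in inputs:
        pre = W_res @ x + W_in @ np.atleast_1d(u)
        x = (1 - leak) * x + leak * np.tanh(pre)
        states.append(x.copy())
    return np.array(states)  # shape: (time, n_units)

def train_readout(states, targets, ridge=1e-6):
    """Fit only the linear readout weights by ridge regression;
    reservoir and input weights stay untouched."""
    S, y = states, np.asarray(targets)
    return np.linalg.solve(S.T @ S + ridge * np.eye(S.shape[1]), S.T @ y)

# Toy usage: learn to reproduce the input delayed by one step.
n_units, n_steps = 100, 500
W_res = rng.normal(0, 1.0, (n_units, n_units)) * (0.9 / np.sqrt(n_units))
W_in = rng.uniform(-0.5, 0.5, (n_units, 1))
u = rng.uniform(-1, 1, n_steps)
states = run_reservoir(u, W_res, W_in)
W_out = train_readout(states[10:], u[9:-1])  # discard a short washout
```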
“…The first two proposed RC models were Liquid State Machines (LSMs) [6] and Echo State Networks (ESNs) [7], both published almost simultaneously. The two types of models have been successfully applied to many problems, achieving very good results in temporal and non-temporal learning tasks [5], [8], [9].…”
Section: Introduction (mentioning)
confidence: 99%
“…Each of the layers in the proposed hierarchy uses a random dynamical system of which only the readout layer is trained (e.g., Reservoir Computing systems [7]). On the lowest level, the fastest timescale, the passive compliance of the leg interacts with the environment.…”
Section: Introduction (mentioning)
confidence: 99%