2012
DOI: 10.1162/neco_a_00200
Recurrent Kernel Machines: Computing with Infinite Echo State Networks

Abstract: Echo State Networks are large, random recurrent neural networks with a single trained linear readout layer. Despite the untrained nature of the recurrent weights, they are capable of performing universal computations on temporal input data, which makes them interesting for both theoretical research and practical applications. The key to their success lies in the fact that the network computes a broad set of nonlinear, spatiotemporal mappings of the input data, on which linear regression or classification can e…

Cited by 76 publications (79 citation statements)
References 44 publications (42 reference statements)
“…This approach has resulted in an entire class of recursive kernels [5]. In fact, any valid recursive kernel can be applied with the online GP to yield an OIESGP.…”
Section: Online Infinite Echo-state Gaussian Process
Mentioning confidence: 99%
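To make the idea of a recursive kernel concrete, below is a minimal sketch of one plausible RBF-style recursion over input sequences. The functional form and the parameters sigma_in and sigma_rec are illustrative assumptions for exposition, not necessarily the exact kernel derived in [5].

```python
import numpy as np

def recursive_kernel(u, v, sigma_in=1.0, sigma_rec=1.0):
    """Illustrative RBF-style recursive kernel between two input
    sequences u and v, each of shape (T, d).

    Each step combines the instantaneous input similarity with the
    kernel value of the preceding history, mimicking how a reservoir
    mixes the current input with its past state."""
    k = 1.0  # kernel value for the empty history
    for u_t, v_t in zip(u, v):
        input_sim = np.exp(-np.sum((u_t - v_t) ** 2) / (2.0 * sigma_in ** 2))
        # k = 1 (identical histories) leaves the current similarity intact;
        # diverging histories (k < 1) shrink it.
        k = input_sim * np.exp((k - 1.0) / sigma_rec ** 2)
    return k
```

Because each update needs only the previous kernel value, the kernel can be evaluated online as samples arrive, which is what makes such kernels compatible with an online GP.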
“…Leveraging recent developments in recurrent kernel machines [5] (developed by considering reservoirs of infinite size), our second contribution is a novel recursive kernel with automatic relevance determination [6]. When combined with Bayesian online learning, this new algorithm, the online infinite echo-state Gaussian process (OIESGP), obviates the need to create and maintain an explicit reservoir.…”
Section: Introduction
Mentioning confidence: 99%
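Following that description, here is a hedged sketch of how automatic relevance determination could enter such a recursion: one lengthscale per input dimension, so that uninformative inputs can be down-weighted. The exact OIESGP kernel in [6] may differ in form; the names below are illustrative.

```python
import numpy as np

def recursive_ard_kernel(u, v, lengthscales, sigma_rec=1.0):
    """Recursive kernel with per-dimension ARD lengthscales (shape (d,)).
    Learning a large lengthscale for a dimension effectively prunes it."""
    ell = np.asarray(lengthscales)
    k = 1.0
    for u_t, v_t in zip(u, v):
        sq_dist = np.sum(((u_t - v_t) / ell) ** 2)
        k = np.exp(-0.5 * sq_dist) * np.exp((k - 1.0) / sigma_rec ** 2)
    return k
```

In a Bayesian online setting the lengthscales would typically be tuned by maximizing the GP marginal likelihood as data streams in.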
“…This is possible because of the fading memory characteristic of RC networks. A third prominent feature of our approach is that the reservoir, functioning as a temporal non-linear kernel [14], can be used in supervised, unsupervised and reinforcement learning tasks by only changing the training method in the linear output layer, characterizing it as a multi-faceted machine learning method [32].…”
Section: Related Work On Biologically-inspired Navigation Systems
Mentioning confidence: 99%
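As an illustration of this multi-faceted use, here is a minimal echo state network sketch in which the reservoir is fixed and only the readout training step changes between tasks. The sizes, scalings, and the ridge readout are illustrative choices, not the cited papers' exact setups.

```python
import numpy as np

rng = np.random.default_rng(0)

# Fixed, random, untrained reservoir (illustrative sizes and scalings).
n_in, n_res = 1, 200
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.normal(0.0, 1.0, (n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))  # scale to spectral radius 0.9

def run_reservoir(inputs):
    """Collect reservoir states for an input sequence of shape (T, n_in)."""
    x = np.zeros(n_res)
    states = []
    for u in inputs:
        x = np.tanh(W @ x + W_in @ u)
        states.append(x.copy())
    return np.array(states)

def train_readout(states, targets, ridge=1e-6):
    """Supervised readout: ridge regression on the collected states."""
    S = states
    return np.linalg.solve(S.T @ S + ridge * np.eye(S.shape[1]), S.T @ targets)
```

A classification or reinforcement-learning task would replace only train_readout; run_reservoir stays identical.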
“…Many applications of RC exist: online adaptive control of robotic arms [10], [11], optoelectronic applications [12], speech recognition [13], etc. From a machine learning perspective, a reservoir network, usually randomly generated and sparsely connected, functions as a temporal kernel [14], projecting the input to a dynamic high-dimensional space. During simulation, the reservoir states form a trajectory which is dependent on the current external sensory input, but which still contains memory traces of previous stimuli.…”
Mentioning confidence: 99%
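The memory traces described here are easy to visualize numerically. The following hedged sketch perturbs only the first input sample of a sequence and tracks how far apart the two reservoir trajectories stay; all constants are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
n_res = 200
W_in = rng.uniform(-0.5, 0.5, (n_res, 1))
W = rng.normal(0.0, 1.0, (n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))  # contractive reservoir

def states(inputs):
    """Reservoir trajectory for an input sequence of shape (T, 1)."""
    x = np.zeros(n_res)
    out = []
    for u in inputs:
        x = np.tanh(W @ x + W_in @ u)
        out.append(x.copy())
    return np.array(out)

u1 = rng.normal(0.0, 1.0, (100, 1))
u2 = u1.copy()
u2[0] += 1.0                        # perturb only the first input sample

d = np.linalg.norm(states(u1) - states(u2), axis=1)
print(d[0], d[10], d[50])           # the perturbation's trace fades
```

With a contractive reservoir the distance decays over time: recent inputs dominate the state, while older stimuli leave traces that gradually vanish, which is the fading-memory property the quote relies on.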
“…This property is known to be optimal in the critical dynamical regime of the reservoir-a regime in which perturbations to the system's trajectory in its phase space neither spread nor die out. It has been suggested that the reservoir dynamics acts like a spatiotemporal kernel, projecting the input signal onto a high-dimensional feature space [9].…”
Section: Introduction
Mentioning confidence: 99%
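A rough way to probe this critical regime numerically is a Lyapunov-style estimate: scale the recurrent weights to different spectral radii and measure whether an infinitesimal state perturbation grows or dies along the driven trajectory. The sketch below is an illustrative diagnostic under those assumptions, not a procedure from the cited works.

```python
import numpy as np

rng = np.random.default_rng(2)
n_res = 200
W0 = rng.normal(0.0, 1.0, (n_res, n_res))
W0 /= max(abs(np.linalg.eigvals(W0)))   # normalize to spectral radius 1
W_in = rng.uniform(-0.5, 0.5, (n_res, 1))
u = rng.normal(0.0, 1.0, (300, 1))

def perturbation_growth(rho, eps=1e-8):
    """Average log growth rate of a tiny state perturbation, a crude
    Lyapunov-style estimate of the reservoir's dynamical regime."""
    W = rho * W0
    x = np.zeros(n_res)
    x2 = x + eps * rng.normal(0.0, 1.0, n_res)
    logs = []
    for u_t in u:
        x = np.tanh(W @ x + W_in @ u_t)
        x2 = np.tanh(W @ x2 + W_in @ u_t)
        d = np.linalg.norm(x2 - x)
        logs.append(np.log(d / eps))
        x2 = x + eps * (x2 - x) / d     # renormalize the perturbation
    return np.mean(logs)

for rho in (0.5, 1.0, 1.5):
    print(rho, perturbation_growth(rho))
```

Negative values indicate that perturbations die out (ordered regime), positive values that they spread (chaotic regime); values near zero mark the critical regime the quote refers to.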