2019
DOI: 10.1016/j.neunet.2019.03.005
Recent advances in physical reservoir computing: A review

Abstract: Reservoir computing is a computational framework suited for temporal/sequential data processing. It is derived from several recurrent neural network models, including echo state networks and liquid state machines. A reservoir computing system consists of a reservoir for mapping inputs into a high-dimensional space and a readout for pattern analysis from the high-dimensional states in the reservoir. The reservoir is fixed and only the readout is trained with a simple method such as linear regression and classification.
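The abstract's pipeline (fixed random reservoir, trained linear readout) can be sketched in a few lines. This is a minimal echo state network illustration, not the paper's implementation; the reservoir size, scaling factors, ridge parameter, and the toy sine-prediction task are all illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Fixed random reservoir: input weights and recurrent weights are never
# trained. The recurrent matrix is rescaled so its spectral radius is
# below 1 (a common heuristic for stable, fading-memory dynamics).
n_in, n_res = 1, 100
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))  # spectral radius -> 0.9

def run_reservoir(u):
    """Drive the reservoir with input sequence u; collect the states."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W @ x + W_in @ np.atleast_1d(u_t))
        states.append(x.copy())
    return np.array(states)

# Toy task: one-step-ahead prediction of a sine wave.
t = np.linspace(0, 8 * np.pi, 400)
u, y = np.sin(t[:-1]), np.sin(t[1:])

X = run_reservoir(u)
washout = 50  # discard the initial transient states

# Readout: ridge regression on reservoir states -- the only trained part.
A = X[washout:]
W_out = np.linalg.solve(A.T @ A + 1e-6 * np.eye(n_res), A.T @ y[washout:])

pred = A @ W_out
mse = np.mean((pred - y[washout:]) ** 2)
```

Because training reduces to a single linear solve, the whole procedure is orders of magnitude cheaper than backpropagation through time, which is the framework's main appeal for physical substrates.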

Cited by 1,218 publications (877 citation statements)
References 252 publications
“…Our modeling framework is poised to address a broad spectrum of applications in machine learning of natural and artificial signals. With recent advances in reservoir computing (56) and its physical implementations (57), our approach offers an alternative to using external arbitrary time-varying signals to control the dynamics of a recurrent network. Our model may also be extended to neuromorphic hardware, where it may benefit chaotic networks employed in robotic motor control (58).…”
Section: Discussion
confidence: 99%
“…Then, can DynMat be applied to recurrent networks? It could be possible to use DynMat as a reading layer for Echo or Liquid state machines, which have been successfully applied to multiple real-world problems (Jaeger, 2007; Tanaka et al, 2018). Liquid (or Echo) state machines (LSM) generate dynamically changing patterns which are read out by a linear layer to perform computations (Jaeger, 2007; Maass, Natschläger, & Markram, 2003).…”
Section: Potential Extension of DynMat for Recurrent Network
confidence: 99%
“…An echo state network is a type of recurrent neural network (RNN), in a classical approach characterized by a randomly generated hidden layer with untrained connections, where only the output weights are subject to supervised training [21,22]. Such a hidden layer is called a reservoir [23], and it is able to memorize the time series fed to the network. The reservoir should satisfy the echo state property, i.e., the state of the reservoir should be uniquely defined by the fading history of the input signal.…”
Section: Echo State Network Architecture
confidence: 99%
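The echo state property quoted above (the state is uniquely determined by the fading input history) can be checked numerically: drive two copies of the same reservoir from different initial states with a common input and watch them converge. A minimal sketch, assuming a tanh reservoir with spectral radius below 1; all sizes and scalings are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 80
W = rng.uniform(-1, 1, (n, n))
W *= 0.8 / max(abs(np.linalg.eigvals(W)))  # spectral radius -> 0.8
W_in = rng.uniform(-1, 1, n)

def drive(x, u_seq):
    """Iterate the reservoir dynamics from initial state x."""
    for u in u_seq:
        x = np.tanh(W @ x + W_in * u)
    return x

u_seq = rng.standard_normal(200)            # one shared input signal
x_a = drive(rng.standard_normal(n), u_seq)  # two different random
x_b = drive(rng.standard_normal(n), u_seq)  # initial conditions

# If the echo state property holds, the dependence on the initial
# state fades and the two trajectories become indistinguishable.
gap = np.linalg.norm(x_a - x_b)
```

A reservoir that fails this test (e.g., with spectral radius well above 1) retains sensitivity to its initial condition, and the linear readout can no longer be trained on input history alone.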