2019
DOI: 10.1103/PhysRevE.100.062312
Optimal short-term memory before the edge of chaos in driven random recurrent networks

Abstract: The ability of discrete-time nonlinear recurrent neural networks to store time-varying small input signals is investigated by mean-field theory. The combination of a small input strength and mean-field assumptions makes it possible to derive an approximate expression for the conditional probability density of the state of a neuron given a past input signal. From this conditional probability density, we can analytically calculate short-term memory measures, such as memory capacity, mutual information, and Fisher information…
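The setting described in the abstract can be illustrated with a minimal simulation. This is a sketch, not the paper's exact model: the network size, gain, and input strength below are assumed values chosen for illustration, with Gaussian random couplings and a small time-varying drive as in the mean-field setup.

```python
import numpy as np

# Illustrative discrete-time random recurrent network driven by a small
# input signal s(t). All parameter values here are assumptions.
rng = np.random.default_rng(0)
N = 200          # number of neurons (assumed)
g = 1.2          # coupling gain; g > 1 drives the autonomous network toward chaos
sigma_in = 0.1   # small input strength, matching the small-input assumption

W = rng.normal(0.0, g / np.sqrt(N), size=(N, N))  # Gaussian random couplings
w_in = rng.normal(0.0, 1.0, size=N)               # random input weights

T = 1000
s = rng.normal(0.0, 1.0, size=T)  # time-varying input signal
x = np.zeros(N)
states = np.empty((T, N))
for t in range(T):
    # Discrete-time nonlinear state update: x(t+1) = tanh(W x(t) + sigma_in * w_in * s(t))
    x = np.tanh(W @ x + sigma_in * w_in * s[t])
    states[t] = x
```

The recorded `states` array is the kind of trajectory from which short-term memory measures are computed downstream.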


Cited by 21 publications (13 citation statements); references 34 publications.
“…In the time-series prediction task, we observe a remarkable improvement of the RC's performance near the phase transition of its attractor. Although our result for the Gaussian network is essentially equivalent to the one discussed in previous studies [54][55][56][57][58][59][60], it provides us with a new perspective on the boundary between chaotic and polarized ordered phases. Particularly interesting is that the Gamma network, where higher-order statistics play a crucial role, exhibits computational improvement even near the boundary of a non-chaotic phase.…”
Section: Introduction (supporting)
confidence: 71%
“…Especially for the case where J_ij is a Gaussian variate, the mean-field theory for these discrete-time models has already been investigated in several previous works [4,13,60,77]. Qualitative differences between continuous-time and discrete-time models have also been suggested in Ref.…
Section: Effective Equation of Motion (mentioning)
confidence: 99%
“…The typical case is evaluating how well the given reservoir can output the previous input sequence; this measure is called memory capacity [46]. Focusing on ESNs, the behavior of memory capacity and related measures has been studied in detail with a linear activation function [46][47][48][49][50][51] and, recently, with a nonlinear activation function [52][53][54]. This measure is further generalized to evaluate nonlinear memory capacities by decomposing the function φ into combinations of multiple orthogonal polynomials [55], and the trade-off between the expressiveness of φ for linear and nonlinear functions has been investigated [55,56].…”
Section: Prerequisite for a Successful Reservoir (mentioning)
confidence: 99%
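The memory capacity measure quoted above (how well a reservoir's linear readout can reproduce delayed copies of its input) can be sketched as follows. This is an illustrative implementation under assumed parameters, using the standard definition MC = Σ_k r²(ŷ_k, s(t−k)) with a ridge-regression readout; it is not the exact procedure of any cited paper.

```python
import numpy as np

# Drive a small random reservoir with an i.i.d. input (assumed setup).
rng = np.random.default_rng(1)
N, T, washout = 100, 5000, 200
g, sigma_in = 0.9, 0.1                      # sub-chaotic gain, small input
W = rng.normal(0.0, g / np.sqrt(N), (N, N))
w_in = rng.normal(0.0, 1.0, N)

s = rng.uniform(-1.0, 1.0, T)
x = np.zeros(N)
X = np.empty((T, N))
for t in range(T):
    x = np.tanh(W @ x + sigma_in * w_in * s[t])
    X[t] = x

def delay_capacity(X, s, k, washout, ridge=1e-6):
    """Squared correlation between the best linear readout and s(t-k)."""
    Xk, yk = X[washout:], s[washout - k:len(s) - k]
    # Ridge-regularized least squares for the readout weights.
    w = np.linalg.solve(Xk.T @ Xk + ridge * np.eye(Xk.shape[1]), Xk.T @ yk)
    r = np.corrcoef(Xk @ w, yk)[0, 1]
    return r ** 2

# Memory capacity: sum the per-delay capacities over a range of delays.
mc = sum(delay_capacity(X, s, k, washout) for k in range(1, 31))
```

Each per-delay term lies in [0, 1], so `mc` is bounded by the number of delays summed; sweeping the gain `g` across 1 is how the edge-of-chaos behavior discussed in these citations is typically probed.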
“…Still, another domain of complexity, chaos, has been far less understood [16,17]. Even though numerous theory- and data-driven studies on forecasting chaotic time series by means of recurrent NNs have been conducted [18][19][20][21][22][23][24][25], there is still no consensus on which features play the most important roles in these forecasting methods.…”
Section: Introduction (mentioning)
confidence: 99%