2018
DOI: 10.1103/PhysRevX.8.041029

Optimal Sequence Memory in Driven Random Networks

Abstract: Autonomous randomly coupled neural networks display a transition to chaos at a critical coupling strength. We here investigate the effect of a time-varying input on the onset of chaos and the resulting consequences for information processing. Dynamic mean-field theory yields the statistics of the activity, the maximum Lyapunov exponent, and the memory capacity of the network. We find an exact condition that determines the transition from stable to chaotic dynamics and the sequential memory capacity in closed form. …
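For orientation, here is a minimal numerical sketch of the setup described in the abstract — our own assumptions, not code from the paper: a random rate network dx/dt = -x + J tanh(x) + w_in u(t) with Gaussian couplings of variance g^2/N, driven by white noise, whose maximum Lyapunov exponent is estimated by tracking a periodically renormalized perturbation (Benettin's method). All parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed dynamics (standard driven rate network, parameters illustrative):
#   dx/dt = -x + J tanh(x) + w_in * u(t),   J_ij ~ N(0, g^2 / N)
N, g, dt, T = 500, 1.5, 0.05, 200.0
steps = int(T / dt)
J = rng.normal(0.0, g / np.sqrt(N), size=(N, N))
w_in = rng.normal(0.0, 1.0, size=N)
u = rng.normal(0.0, 1.0, size=steps)      # white-noise input sequence

def step(x, u_t):
    """One Euler step of the driven network."""
    return x + dt * (-x + J @ np.tanh(x) + w_in * u_t)

def max_lyapunov(eps=1e-8):
    """Benettin-style estimate of the maximum Lyapunov exponent:
    average log growth rate of a small separation between twin
    trajectories, rescaled back to eps after every step."""
    x = rng.normal(0.0, 1.0, size=N)
    v = rng.normal(0.0, 1.0, size=N)
    y = x + eps * v / np.linalg.norm(v)   # perturbed copy at distance eps
    lam = 0.0
    for t in range(steps):
        x, y = step(x, u[t]), step(y, u[t])
        d = np.linalg.norm(y - x)
        lam += np.log(d / eps)
        y = x + (eps / d) * (y - x)       # renormalize the separation
    return lam / (steps * dt)

print(f"estimated maximum Lyapunov exponent: {max_lyapunov():+.3f}")
```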

Cited by 80 publications (214 citation statements: 12 supporting, 198 mentioning, 4 contrasting). References 81 publications.
“…An online supervised learning algorithm for RNNs proposed by Sussillo and Abbott [33] exhibits its best performance when their autonomous dynamics are adjusted to the chaotic region not far from the critical point, where chaotic dynamics can be suppressed by input signals. Schuecker et al. [23] showed that the memory capacity of continuous-time nonlinear RNNs peaks in the ordered regime with g^2 > 1, which is consistent with our result. They argued that the dynamic suppression of chaos (DSC), which results from the fact that the onset of local instability precedes that of asymptotic instability, contributes to optimal information processing.…”
Section: Discussion (supporting)
confidence: 92%
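For context, the sequential memory capacity at stake here is conventionally measured, in the reservoir-computing literature, as the sum over delays of the squared correlation between the delayed input and its best linear reconstruction from the network state. A minimal sketch under that convention, taking `states` and `inputs` from a simulation like the one above (the function name and ridge parameter are ours):

```python
import numpy as np

def memory_capacity(states, inputs, max_delay=50, washout=100, ridge=1e-6):
    """Sum over delays tau of corr^2( u(t - tau), best linear readout of x(t) ).
    `states`: (T, N) recorded activity; `inputs`: (T,) scalar drive.
    Requires washout >= max_delay so every target index is valid."""
    X = states[washout:]                                  # (T - washout, N)
    G = X.T @ X + ridge * np.eye(X.shape[1])              # regularized Gram matrix
    mc = 0.0
    for tau in range(1, max_delay + 1):
        y = inputs[washout - tau : len(inputs) - tau]     # target u(t - tau)
        w = np.linalg.solve(G, X.T @ y)                   # ridge-regression readout
        mc += np.corrcoef(y, X @ w)[0, 1] ** 2
    return mc
```

Scanning the gain g with such an estimator would be the natural way to see the reported peak of the capacity beyond the autonomous instability, i.e. at g^2 > 1.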
“…In ESNs, the shift of the critical coupling g^2_* toward the chaotic regime induced by input signals is solely due to a mechanism called the static suppression of chaos (SSC), which increases the frequency with which an orbit visits the contracting region of phase space. Unlike SSC, DSC is conjectured to arise from fast switching among different unstable directions caused by input signals [23]. However, ESNs with leaky neurons [34] are expected to exhibit DSC [23].…”
Section: Discussion (mentioning)
confidence: 99%
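For reference, "leaky neurons" in the ESN of [34] means each unit low-pass filters its drive, giving the discrete map continuous-time-like dynamics; a one-function sketch (parameter names are ours):

```python
import numpy as np

def esn_step(x, u_t, W, w_in, leak=0.3):
    """Leaky ESN update: x <- (1 - leak) * x + leak * tanh(W x + w_in * u).
    With leak = 1 this reduces to the standard (memoryless) ESN map;
    leak < 1 makes each unit a low-pass filter of its drive, which is
    the property conjectured above to admit dynamic suppression of chaos."""
    return (1.0 - leak) * x + leak * np.tanh(W @ x + w_in * u_t)
```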
“…Other transitions obtained with this model concern the transient time needed to reach the attractor from random inputs [13-15] and the size of the basins of attraction of the existing limit cycles [14]. Adding noise [16,17], dilution [18], a gain function [19-21], or self-interaction [22] may further expand the array of possible transitions in the system. In particular, adding dilution to fully asymmetric networks results in an increased number of attractors, and therefore in an increased storage capacity [23].…”
Section: Introduction (mentioning)
confidence: 99%
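As a toy illustration of the last point, one can count attractors of a small asymmetric binary network empirically. The sketch below rests on our own assumptions (parallel sign dynamics, Gaussian couplings, random dilution); it is not the analysis of [18] or [23].

```python
import numpy as np

def count_attractors(N=12, dilution=0.5, trials=300, max_steps=400, seed=1):
    """Empirical attractor count for s(t+1) = sign(J s(t)) with a fully
    asymmetric Gaussian coupling matrix J in which a fraction `dilution`
    of entries is set to zero. Each attractor (fixed point or limit
    cycle) is labeled by the lexicographically smallest state on it."""
    rng = np.random.default_rng(seed)
    J = rng.normal(size=(N, N))
    J *= rng.random((N, N)) >= dilution        # randomly dilute couplings
    attractors = set()
    for _ in range(trials):
        s = rng.choice([-1, 1], size=N)
        seen = {}                              # state -> first visit time
        for t in range(max_steps):
            key = tuple(int(v) for v in s)
            if key in seen:                    # orbit closed: cycle found
                cycle = list(seen)[seen[key]:] # states on the cycle, in order
                attractors.add(min(cycle))
                break
            seen[key] = t
            s = np.where(J @ s >= 0, 1, -1)
        # trials that do not close within max_steps are simply skipped
    return len(attractors)

# Compare attractor counts without and with dilution (toy scale only):
print(count_attractors(dilution=0.0), count_attractors(dilution=0.5))
```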