2004
DOI: 10.1103/PhysRevLett.92.148102

Short-Term Memory in Orthogonal Neural Networks

Abstract: We study the ability of linear recurrent networks obeying discrete time dynamics to store long temporal sequences that are retrievable from the instantaneous state of the network. We calculate this temporal memory capacity for both distributed shift register and random orthogonal connectivity matrices. We show that the memory capacity of these networks scales with system size.
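The setup in the abstract is easy to reproduce numerically. Below is a minimal sketch (not the authors' code) of the memory-capacity measurement it describes: a linear network x(t) = W x(t-1) + v s(t) with a random orthogonal W, driven by i.i.d. input, with linear readouts trained to recover delayed inputs. The network size, delay range, input weights, and the slight down-scaling of W are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)
N, T = 100, 20000                      # network size, simulation length

# Random orthogonal connectivity via QR decomposition,
# scaled slightly below 1 for numerical stability (assumed choice).
W, _ = np.linalg.qr(rng.standard_normal((N, N)))
W *= 0.98
v = rng.standard_normal(N) / np.sqrt(N)   # input weights (assumed form)

s = rng.standard_normal(T)                # i.i.d. Gaussian input
X = np.zeros((T, N))
x = np.zeros(N)
for t in range(T):
    x = W @ x + v * s[t]
    X[t] = x

# Memory function m(k): squared correlation between the best linear
# readout of x(t) and the delayed input s(t-k); capacity = sum_k m(k).
def memory_curve(X, s, kmax=150, burn=500):
    m = []
    for k in range(1, kmax + 1):
        Xd, sd = X[burn + k:], s[burn:-k]
        w, *_ = np.linalg.lstsq(Xd, sd, rcond=None)
        m.append(np.corrcoef(Xd @ w, sd)[0, 1] ** 2)
    return np.array(m)

m = memory_curve(X, s)
print("estimated memory capacity:", m.sum())   # grows with N per the paper

Rerunning with different N shows the scaling with system size that the abstract claims.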



Cited by 145 publications (195 citation statements)

References 5 publications
“…During prediction, the true outputs (which are unknown) are replaced with the predicted outputs, which are fed to the recurrent units. ESNs work extremely well for predicting chaotic systems for a large number of timesteps, and their linear version has been rigorously analyzed (White et al, 2004). The major advantage of the ESN is that its recurrent weights are not learned, so learning is extremely fast.…”
Section: Related Work (mentioning)
confidence: 99%
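The closed-loop prediction scheme this statement describes is simple to sketch. Below is a minimal echo state network, assuming standard ESN conventions (tanh reservoir, ridge-regression readout): after training, the true inputs are replaced by the network's own predictions. The reservoir size, spectral radius, regularization, and the sine-wave stand-in for a chaotic series are illustrative choices, not taken from the citing paper.

import numpy as np

rng = np.random.default_rng(1)
N = 200
W = rng.standard_normal((N, N))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))   # spectral radius 0.9 (assumed)
w_in = rng.uniform(-0.5, 0.5, N)

# Teacher signal: a sine wave stands in for a chaotic series.
u = np.sin(0.2 * np.arange(3000))

# Drive the reservoir with the true signal (teacher forcing).
X = np.zeros((len(u) - 1, N))
x = np.zeros(N)
for t in range(len(u) - 1):
    x = np.tanh(W @ x + w_in * u[t])
    X[t] = x
Y = u[1:]                                    # one-step-ahead targets

# Only the readout is learned (ridge regression); training is fast
# because the recurrent weights W stay fixed.
lam = 1e-6
w_out = np.linalg.solve(X.T @ X + lam * np.eye(N), X.T @ Y)

# Closed-loop prediction: feed the predicted output back as input.
preds = []
y = u[-1]
for _ in range(200):
    x = np.tanh(W @ x + w_in * y)
    y = x @ w_out
    preds.append(y)
print("first closed-loop predictions:", np.round(preds[:5], 3))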
“…An extension of the delay ring is the ensemble of orthogonal networks (studied in ref. 7). These are normal networks in which W is a rotation matrix.…”
Section: Examples of Normal Network (mentioning)
confidence: 99%
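The relation drawn above is easy to check numerically. A small sketch, assuming the standard constructions: the delay ring is a cyclic shift (a permutation matrix, hence orthogonal), and a random rotation obtained from a QR decomposition generalizes it; both are orthogonal and therefore normal.

import numpy as np

rng = np.random.default_rng(2)
N = 6

# Delay ring: x_i(t+1) = x_{i-1}(t), a cyclic shift matrix.
ring = np.roll(np.eye(N), 1, axis=0)

# Random rotation: QR of a Gaussian matrix, sign-fixed so det(W) = +1.
Q, R = np.linalg.qr(rng.standard_normal((N, N)))
Q = Q @ np.diag(np.sign(np.diag(R)))
if np.linalg.det(Q) < 0:                 # flip one axis if needed
    Q[:, 0] *= -1

for name, M in [("ring", ring), ("rotation", Q)]:
    # Normality (M M^T = M^T M) holds for any orthogonal matrix.
    print(name, "orthogonal:", np.allclose(M @ M.T, np.eye(N)),
          "normal:", np.allclose(M @ M.T, M.T @ M))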
“…To what extent do these traces degrade in the presence of noise? Previous analytical work has addressed some of these questions under restricted assumptions about input statistics and network architectures (7). To answer these questions in a more general setting, we use Fisher information to construct a measure of memory traces in networks and other dynamical systems.…”
(mentioning)
confidence: 99%
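One way such a Fisher-information measure can be computed, following the Fisher-memory-curve construction the citing work introduces: for the linear network x(t) = W x(t-1) + v s(t) + noise, J(k) quantifies how much information the current state retains about an input pulse k steps in the past. The noise level, scaling of W, and input vector below are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(3)
N, eps = 50, 0.1                      # network size, noise variance (assumed)

Q, _ = np.linalg.qr(rng.standard_normal((N, N)))
W = 0.95 * Q                          # contracting orthogonal W (assumed)
v = rng.standard_normal(N)
v /= np.linalg.norm(v)

# Stationary noise covariance C_n = eps * sum_m W^m (W^m)^T,
# obtained by iterating the discrete Lyapunov recursion to convergence.
C = eps * np.eye(N)
for _ in range(2000):
    C = W @ C @ W.T + eps * np.eye(N)
C_inv = np.linalg.inv(C)

# Fisher memory curve J(k) = v^T (W^k)^T C_n^{-1} W^k v.
J, Wk = [], np.eye(N)
for k in range(100):
    u = Wk @ v
    J.append(u @ C_inv @ u)
    Wk = W @ Wk
print("J(0), J(10), J(50):", J[0], J[10], J[50])

For this contracting orthogonal W the curve decays geometrically, showing how noise erodes the memory trace with delay.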
“…At the beginning of each cycle, a subset of granule cells received a depolarizing input to represent a postsynaptic potential that initiated the recurrent dynamics. The 100 granule cells were interconnected by weights defined by an orthogonal random weight matrix [14] with eigenvalues < 1 [8]. These weights were not varied during adaptation and each synapse contributed to the membrane potential of the postsynaptic granule cell proportionally to the presynaptic spike probability.…”
Section: Results (mentioning)
confidence: 99%
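A loose, rate-based sketch of the recurrent layer described above. The original model is spiking, so this stand-in only illustrates the role of a fixed, scaled orthogonal weight matrix driving granule-cell dynamics. The cell count comes from the text; the input subset size, scaling factor, transfer function, and resting bias are assumptions, and all names are illustrative.

import numpy as np

rng = np.random.default_rng(4)
N = 100                                   # granule cells (from the text)

Q, _ = np.linalg.qr(rng.standard_normal((N, N)))
W = 0.9 * Q           # orthogonal matrix scaled so all |eigenvalues| < 1
W_fixed = W           # recurrent weights are not varied during adaptation

# A random subset of cells gets a depolarizing pulse at cycle onset.
pulse = np.zeros(N)
pulse[rng.choice(N, size=20, replace=False)] = 1.0

def spike_prob(u):
    # Illustrative squashing of membrane potential into a firing
    # probability; the original model's transfer function may differ.
    return 1.0 / (1.0 + np.exp(-u))

u = pulse.copy()                          # initial depolarization
for t in range(50):
    # Each synapse contributes to the postsynaptic membrane potential
    # in proportion to the presynaptic spike probability, as described
    # in the citing paper; the -0.5 resting bias is an assumption.
    u = W_fixed @ spike_prob(u) - 0.5
print("final mean spike probability:", spike_prob(u).mean())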