1998
DOI: 10.1088/0305-4470/31/43/005

Phase diagram and storage capacity of sequence processing neural networks

Abstract: We solve the dynamics of Hopfield-type neural networks which store sequences of patterns, close to saturation. The asymmetry of the interaction matrix in such models leads to violation of detailed balance, ruling out an equilibrium statistical mechanical analysis. Using generating functional methods we derive exact closed equations for dynamical order parameters, viz. the sequence overlap and correlation and response functions, in the thermodynamic limit. We calculate the time translation invariant solutions of…
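The model described in the abstract stores a cyclic sequence of patterns via an asymmetric coupling matrix that maps each pattern onto its successor. A minimal simulation sketch (the pattern count `p`, network size `N`, and synchronous sign dynamics are illustrative choices, not taken from the paper's analysis) shows the network stepping through a stored sequence well below saturation:

```python
import numpy as np

rng = np.random.default_rng(0)
N, p = 400, 10  # network size and sequence length (alpha = p/N well below 0.269)

# Random binary patterns xi[mu] in {-1,+1}^N forming a cyclic sequence.
xi = rng.choice([-1, 1], size=(p, N))

# Asymmetric Hopfield-type couplings mapping pattern mu onto mu+1 (mod p):
# J_ij = (1/N) * sum_mu xi^{mu+1}_i xi^{mu}_j
J = (np.roll(xi, -1, axis=0).T @ xi) / N

# Start at pattern 0 and iterate deterministic parallel (synchronous) dynamics.
S = xi[0].copy()
overlaps = []
for t in range(1, p):
    S = np.sign(J @ S)
    overlaps.append(S @ xi[t % p] / N)  # overlap with the expected next pattern

print(overlaps)
```

At this loading each overlap stays near 1, i.e. the state tracks the stored sequence; the transition to imperfect recall as p/N approaches the critical capacity is what the paper's generating-functional analysis characterizes exactly.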

Cited by 47 publications (88 citation statements) | References 17 publications
“…In the case of L = 1 (c = 1.0), which is fully connected with no delay elements, the recurrent neural network's storage capacity α_C for sequential association is 0.269. This agrees with the results of previous works [29,30,31]. As the length of delay L increases, the storage capacity α_C increases even though the total number of synapses is constant.…”
Section: Random Pruning (supporting, confidence: 92%)
“…Düring et al [6] analyzed the properties of the stationary states (the storage capacity and the phase diagram) of a recurrent network for purely ASP, without pattern reconstruction, and a solution for the transient dynamics of the model was recently discussed [7]. The effects of stochastic noise on the model [4] were only recently analyzed in a feed-forward neural network architecture [1], in which an exact solution for the dynamics and complete phase diagrams of stationary states were obtained.…”
Section: Introduction (mentioning, confidence: 99%)
“…Also for networks that act as memory for sequences, capacities have been calculated in both the perceptron (Nadal, 1991) and the attractor case (Herz, Li, & van Hemmen, 1991). An important result is that the capacity of sequence memory in Hopfield-type networks is about twice as large as that of a static attractor network (Düring, Coolen, & Sherrington, 1998).…”
Section: Introduction (mentioning, confidence: 99%)