1998
DOI: 10.1142/s0129183198000595
On the Phase Transition of Hopfield Networks — Another Monte Carlo Study

Abstract: A Hopfield-type neural network has content addressable memory which emerges from its collective properties. I reinvestigate the controversial question of its critical storage capacity at zero temperature. To locate the discontinuous transition from good retrieval to bad retrieval in infinite systems the decreasing average quality of retrieved information is traced until it falls below a threshold. The cutoff points found for different system sizes are extrapolated towards infinity and yield αc=0.143±0.002.
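The zero-temperature retrieval experiment described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's code: the function names, system size, pattern counts, and sweep count are all assumptions chosen for a quick demonstration of Hebbian storage and deterministic asynchronous updates.

```python
import numpy as np

rng = np.random.default_rng(0)

def hebbian_weights(patterns):
    # patterns: (P, N) array of +/-1 entries; Hebb rule with zero self-coupling
    P, N = patterns.shape
    W = patterns.T @ patterns / N
    np.fill_diagonal(W, 0.0)
    return W

def retrieve(W, state, sweeps=20):
    # zero-temperature (deterministic) asynchronous dynamics:
    # each spin aligns with its local field, in random order
    N = len(state)
    s = state.copy()
    for _ in range(sweeps):
        for i in rng.permutation(N):
            h = W[i] @ s
            s[i] = 1 if h >= 0 else -1
    return s

def mean_overlap(N, P, trials=5):
    # average retrieval quality m = (1/N) * xi . s,
    # starting the dynamics exactly at the stored pattern
    ms = []
    for _ in range(trials):
        xi = rng.choice([-1, 1], size=(P, N))
        W = hebbian_weights(xi)
        s = retrieve(W, xi[0].copy())
        ms.append(xi[0] @ s / N)
    return float(np.mean(ms))

m_low = mean_overlap(N=100, P=5)    # alpha = 0.05, well below the capacity
m_high = mean_overlap(N=100, P=40)  # alpha = 0.40, well above the capacity
```

Below capacity the stored pattern is (with high probability) a fixed point, so the overlap stays near 1; well above capacity the final overlap is markedly smaller, which is the degradation the paper traces against a threshold.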

Cited by 11 publications (5 citation statements)
References 6 publications
“…Posterior computer simulations of larger Hopfield networks and its extrapolation to 1/N → 0 resulted in a value of n_c ≈ 0.14 N (Volk 1998) which agrees with a theoretical value of n_c = 0.138 N obtained within a statistical physics mean-field approach (Amit et al 1987). This memory capacity can be increased assuming some conditions over the network topology and the stored patterns.…”
Section: Memory Capacity (supporting)
confidence: 86%
“…Posterior numerical simulations of larger Hopfield networks and its extrapolation to 1/N → 0 resulted in a value of n_c ≈ 0.14 N (Volk 1998) which agrees with a theoretical value of n_c = 0.138 N obtained within a statistical physics mean-field approach (Amit et al 1987). This memory capacity can be increased assuming some conditions over the network topology and the stored patterns.…”
Section: Memory Capacity (supporting)
confidence: 85%
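The 1/N → 0 extrapolation that these citing papers refer to amounts to a linear fit of the finite-size cutoff α_c(N) against 1/N and reading off the intercept. A sketch of that step, using made-up illustrative cutoff values (these are NOT the paper's measured data), assuming a simple least-squares fit:

```python
import numpy as np

# Hypothetical finite-size cutoff points alpha_c(N); the values below are
# invented for illustration only and are not taken from Volk (1998).
N_vals = np.array([100, 200, 400, 800, 1600])
alpha_N = np.array([0.160, 0.152, 0.148, 0.145, 0.144])

# Linear fit alpha_c(N) = alpha_c + slope / N; the intercept at 1/N = 0
# is the estimate for the infinite-system storage capacity.
slope, alpha_c = np.polyfit(1.0 / N_vals, alpha_N, 1)
```

With plausible inputs of this shape, the intercept lands near the reported α_c ≈ 0.143; in practice one would also propagate the fit uncertainty to obtain an error bar like the paper's ±0.002.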
“…For infinite range, m = N, the usual Hopfield model [12] gives an overlap Ψ close to 1 for P/N < 0.14 and a relatively small overlap Ψ ∼ 0.2 for P/N > 0.14, with a sharp jump at P/N = 0.14. Our simulations, in contrast, show a gradual deterioration as soon as more than one pattern is stored, but the value of Ψ is still of order 0.2 and distinctly larger than for the other (P − 1) patterns.…”
mentioning
confidence: 96%