1985
DOI: 10.1109/tit.1985.1057069

Information capacity of the Hopfield model

Abstract: The information capacity of general forms of memory is formalized. The number of bits of information that can be stored in the Hopfield model of associative memory is estimated. It is found that the asymptotic information capacity of a Hopfield network of N neurons is of the order of N³ bits. The number of arbitrary state vectors that can be made stable in a Hopfield network of N neurons is proved to be bounded above by N.
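
The two results are easiest to ground in the model's notion of stability: a state vector is stable when no neuron would flip under the update rule. Below is a minimal Python sketch, assuming the common Hebbian outer-product storage rule purely for illustration; it shows what "stable state vector" means and does not reproduce the paper's proofs or bounds.

```python
import numpy as np

def hebbian_weights(patterns):
    """Outer-product (Hebbian) weight matrix; a common storage rule,
    used here only to illustrate stability checking."""
    m, N = patterns.shape
    W = patterns.T @ patterns / N
    np.fill_diagonal(W, 0)          # zero self-connections, as in the model
    return W

def is_stable(W, x):
    """x is a stable state iff every neuron already agrees with the sign
    of its local field, i.e. x is a fixed point of the update rule."""
    return np.array_equal(np.where(W @ x >= 0, 1, -1), x)

rng = np.random.default_rng(0)
N, m = 100, 5                       # m far below the paper's upper bound of N
patterns = rng.choice([-1, 1], size=(m, N))
W = hebbian_weights(patterns)
print([is_stable(W, p) for p in patterns])   # typically all True for small m
```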

Cited by 232 publications (62 citation statements).
References 5 publications.

“…In this scheme, the basic ART1 model of Carpenter & Grossberg (1,5) that deals with binary feature vectors is considered as the reference system for developing a workable localized-distributed model for feature recognition/classification. This is outlined as follows.…”
Section: Some Neural Net Classifier Models
confidence: 99%
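
For context, ART1 (Carpenter & Grossberg) clusters binary vectors by resonance between an input and a stored prototype. A rough sketch of the standard fast-learning ART1 recipe follows; the class name and the parameters rho and beta are conventional/illustrative, and this is not the citing paper's localized-distributed extension.

```python
import numpy as np

class ART1:
    """Minimal fast-learning ART1 for binary feature vectors (a sketch)."""

    def __init__(self, rho=0.7, beta=1.0):
        self.rho = rho        # vigilance: higher values force finer categories
        self.beta = beta      # choice parameter (> 0)
        self.weights = []     # one binary prototype per committed category

    def train(self, x):
        x = np.asarray(x, dtype=bool)
        # F2 competition: rank committed categories by the choice function.
        order = sorted(
            range(len(self.weights)),
            key=lambda j: -(x & self.weights[j]).sum()
                          / (self.beta + self.weights[j].sum()),
        )
        for j in order:
            # Vigilance test: does the prototype match enough of the input?
            if (x & self.weights[j]).sum() / max(x.sum(), 1) >= self.rho:
                self.weights[j] &= x          # fast learning: intersection
                return j
        self.weights.append(x.copy())         # no resonance: new category
        return len(self.weights) - 1

net = ART1(rho=0.5)
print(net.train([1, 1, 1, 0]))   # -> 0: first input commits category 0
print(net.train([1, 1, 0, 0]))   # -> 0: resonates; prototype shrinks to 1100
print(net.train([0, 0, 1, 1]))   # -> 1: fails vigilance, commits category 1
```
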
“…We now need to show that the resulting approximations for the normal entropy and for the binomial entropy are good approximations of each other, with an error term of the form C(log n)²/n, where C was defined at the beginning of the proof.…”
Section: The Author Believes That the Analysis Would Begin With The N
confidence: 99%
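
The quoted step claims the Gaussian and binomial entropies approximate each other. A quick numerical sanity check follows; this is my own sketch of the comparison, not the citing paper's derivation, and the constant C and exact error form are not reproduced here.

```python
import math
from math import comb, log2

def binomial_entropy(n, p=0.5):
    """Exact Shannon entropy (in bits) of Binomial(n, p)."""
    H = 0.0
    for k in range(n + 1):
        q = comb(n, k) * p**k * (1 - p)**(n - k)
        if q > 0.0:
            H -= q * log2(q)
    return H

def normal_entropy(n, p=0.5):
    """Differential entropy (in bits) of the moment-matched normal
    N(np, np(1-p)): (1/2) * log2(2*pi*e*variance)."""
    return 0.5 * log2(2 * math.pi * math.e * n * p * (1 - p))

# The gap shrinks as n grows, consistent with an O((log n)^2 / n)-type error.
for n in (10, 100, 1000):
    print(n, round(binomial_entropy(n), 6), round(normal_entropy(n), 6))
```
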
“…These models provide a direction for pattern recognition systems with distinct natural advantages. The capacity of these models, as well as their computing power, are directly related to the number of threshold functions [8]. The ability of multilevel threshold devices to simulate a larger number of functions compared to single-threshold devices is vital for the capacity and capabilities of neural network models based on threshold logic. It is therefore of practical as well as theoretical interest to estimate the number of functions that can be modeled as multilevel threshold functions for a given number of inputs and threshold levels.…”
Section: Introduction
confidence: 99%
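
The link between capacity and the number of threshold functions can be made concrete by brute-force enumeration at tiny size. The sketch below is my own illustration, not taken from [8]: it counts which of the 16 Boolean functions of two inputs a single threshold unit sign(w·x − t) realizes. The two it misses, XOR and XNOR, are exactly the non-linearly-separable ones, which is the gap a multilevel (two-threshold) unit closes.

```python
import itertools
import numpy as np

# All 2-input binary patterns; the 16 truth tables over them are the
# 16 Boolean functions of 2 inputs.
inputs = np.array(list(itertools.product([0, 1], repeat=2)), dtype=float)

# Enumerate sign(w.x - t) over a coarse grid of weights and thresholds,
# collecting every distinct truth table a single threshold unit produces.
grid = np.linspace(-1.0, 1.0, 9)
realizable = set()
for w1, w2, t in itertools.product(grid, repeat=3):
    table = tuple(int(x1 * w1 + x2 * w2 - t >= 0) for x1, x2 in inputs)
    realizable.add(table)

print(f"{len(realizable)} of 16 two-input Boolean functions are "
      "single-threshold realizable")   # prints 14: XOR and XNOR are missing
# A two-threshold (multilevel) unit closes the gap: with w = (1, 1),
# outputting 1 iff 0.5 <= x1 + x2 <= 1.5 realizes XOR.
```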