2017
DOI: 10.1007/s10955-017-1806-y
On a Model of Associative Memory with Huge Storage Capacity

Abstract: In [7] Krotov and Hopfield suggest a generalized version of the well-known Hopfield model of associative memory. In their version they consider a polynomial interaction function and claim that this increases the storage capacity of the model. We prove this claim and take the "limit" as the degree of the polynomial becomes infinite, i.e. an exponential interaction function. With this interaction we prove that the model has an exponential storage capacity in the number of neurons, yet the basins of attraction…
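The generalized energy the abstract refers to can be sketched numerically. The following is a minimal illustration, not the paper's code; the pattern count, neuron count, and function names are arbitrary choices for the example:

```python
import numpy as np

# Minimal sketch (not the paper's code) of the generalized Hopfield
# energy: E(sigma) = -sum_mu F(<xi_mu, sigma>), with F either the
# polynomial interaction of Krotov-Hopfield, F(u) = u**n, or the
# exponential interaction F(u) = exp(u) studied in this paper.

def energy(sigma, patterns, F):
    """Energy of state sigma given stored patterns and interaction F."""
    return -np.sum(F(patterns @ sigma))

rng = np.random.default_rng(1)
patterns = rng.choice([-1, 1], size=(10, 32))   # 10 stored patterns, 32 neurons
stored = patterns[3]                            # one of the stored patterns
random_state = rng.choice([-1, 1], size=32)     # an unrelated random state

poly = lambda u: np.power(u, 3, dtype=float)    # polynomial interaction, n = 3
expo = lambda u: np.exp(u.astype(float))        # exponential interaction

# A stored pattern sits at far lower energy than a random state, since
# its own overlap term contributes -exp(32) to the sum.
print(energy(stored, patterns, expo) < energy(random_state, patterns, expo))
```

With the exponential interaction the energy gap between stored patterns and generic states grows exponentially in the overlap, which is the mechanism behind the exponential storage capacity.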

Cited by 66 publications (58 citation statements). References 18 publications.
“…In particular, we need a model with higher capacity for storing patterns than an MSOM, which is constrained to encode input patterns in relatively localist representations. But such networks are now readily available: for instance, the new generation of Hopfield networks perform the same kind of unsupervised associative learning as an MSOM, but with vastly higher capacity (see e.g., Krotov and Hopfield, 2016 , and especially Demircigil et al, 2017 ). Of course, the question of scalability remains to be addressed empirically—and this is something we plan to pursue in future work.…”
Section: Discussion
confidence: 99%
“…Furthermore, certain RCNNs can be viewed as kernelized versions of the Hopfield network with Hebbian learning [16,17,18]. Finally, RCNNs are closely related to the dense associative memory model introduced by Krotov and Hopfield to establish the duality between associative memories and deep learning [44,45].…”
Section: Quaternion-Valued Recurrent Correlation Neural Network
confidence: 99%
“…In [30] it is proved that for F(u) = exp(u) this memory allows one to retrieve N = exp(αD) randomly distorted vectors (within Hamming distance < D/2 from the stored vectors) by a single step of the sequential dynamics, for some 0 < α < ln 2 / 2, depending on the distortion, with probability converging to 1 for D → ∞.…”
Section: The Generalization Of Krotov-hopfieldmentioning
confidence: 99%
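The single-step retrieval described in the citation above can be illustrated with a small numerical sketch. This is a hedged toy implementation, not the paper's code; the sizes (D = 64 neurons, M = 20 patterns) and the distortion level (D/4 flipped coordinates, below the D/2 threshold) are illustrative choices:

```python
import numpy as np

# Toy sketch of one sequential sweep with the exponential interaction
# F(u) = exp(u): each neuron is set to the sign of the energy difference
# between its two possible states, given the current configuration.

rng = np.random.default_rng(0)
D, M = 64, 20                                   # neurons, stored patterns
patterns = rng.choice([-1, 1], size=(M, D))

def sequential_sweep(sigma, patterns):
    """One pass of sequential dynamics for F(u) = exp(u)."""
    sigma = sigma.astype(float).copy()
    for i in range(len(sigma)):
        plus, minus = sigma.copy(), sigma.copy()
        plus[i], minus[i] = 1.0, -1.0
        # Energy difference between sigma_i = +1 and sigma_i = -1.
        drive = np.sum(np.exp(patterns @ plus) - np.exp(patterns @ minus))
        sigma[i] = 1.0 if drive >= 0 else -1.0
    return sigma.astype(int)

# Distort a stored pattern in D/4 coordinates, then run one sweep.
noisy = patterns[0].copy()
flipped = rng.choice(D, size=D // 4, replace=False)
noisy[flipped] *= -1
recovered = sequential_sweep(noisy, patterns)
print(np.array_equal(recovered, patterns[0]))
```

Because the noisy state still overlaps the correct pattern in 3D/4 coordinates, the term exp(ξ⁰·σ) dominates the drive at every neuron and the sweep restores the stored pattern with high probability, matching the single-step retrieval result quoted above.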