1985
DOI: 10.1051/jphyslet:01985004608035900
Information storage and retrieval in spin-glass like neural networks

Abstract: With a view to building associative memories from neural networks, we study the relation between the structure of a network and its attractor states; we show that, whatever the set of states one wishes to store, it is generally possible to compute all the parameters of the network so as to guarantee the stability of these states. The spin-glass formalism leads to particularly simple results which, in certain cases, make it possible to evaluate analy…

Cited by 310 publications (114 citation statements)
References 13 publications
“… u^(1), …, u^(m) are the eigenvectors of W and Λ is the spectrum of W (Venkatesh & Psaltis, 1985; Personnaz, Guyon, & Dreyfus, 1985; Venkatesh & Psaltis, 1989). Taking λ^(α) = λ > 0, we see that the matrix W is symmetric with nonnegative eigenvalues (i.e., it is nonnegative definite). Therefore there exist Lyapunov functions in this case, and moreover it has been shown that the stored memories form global energy minima (Venkatesh & Psaltis, 1989). …”
Section: The Spectral Algorithm
confidence: 96%
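The stability property this citing paper describes can be checked numerically. The sketch below is my own construction, not code from any of the cited works: it uses the projection (pseudo-inverse) storage rule associated with Personnaz, Guyon & Dreyfus, under which W is the orthogonal projector onto the span of the stored patterns, hence symmetric and nonnegative definite, and every stored pattern is a fixed point of the sign-update dynamics. The sizes n and m are arbitrary illustrative choices.

```python
import numpy as np

# Projection (pseudo-inverse) rule: W = Xi @ pinv(Xi) projects onto the
# span of the stored patterns, so W is symmetric with eigenvalues in {0, 1}.
rng = np.random.default_rng(0)
n, m = 64, 8                                 # n neurons, m patterns (m < n)
Xi = rng.choice([-1.0, 1.0], size=(n, m))    # random ±1 patterns as columns
W = Xi @ np.linalg.pinv(Xi)

# Symmetric and nonnegative definite => a Lyapunov (energy) function exists.
assert np.allclose(W, W.T)
assert np.linalg.eigvalsh(W).min() > -1e-9

# Every stored pattern is stable under the update s <- sign(W s).
for k in range(m):
    assert np.array_equal(np.sign(W @ Xi[:, k]), Xi[:, k])
print(f"all {m} stored patterns are fixed points of sign(W s)")
```

Because W acts as the identity on the stored patterns, stability holds exactly here, whereas the Hebbian outer-product rule only achieves it approximately once crosstalk between patterns grows.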
“…The memory storage capacity for this method is n/(4 log n) (McEliece, Posner, Rodemich, & Venkatesh, 1987; Psaltis & Venkatesh, 1989), whereas the maximal theoretical capacity for any storage algorithm is 2n (Cover, 1965; Venkatesh, 1986b). The spectral algorithm (Kohonen, 1977; Personnaz, Guyon, & Dreyfus, 1985; Venkatesh & Psaltis, 1989) and an algorithm we will refer to as the dual spectral algorithm (Maruani, Chevallier, & Sirat, 1987) are algorithms whose capacities … Acknowledgement: The work of the first two authors was supported in part by an NSF grant. The work at Caltech is supported by DARPA and AFOSR.…”
Section: Introduction
confidence: 99%
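The n/(4 log n) capacity bound quoted above can be seen in a rough simulation. This is a hand-rolled sketch under standard assumptions (Hebbian outer-product weights, zeroed self-couplings, random ±1 patterns), not an implementation from the cited papers; the specific n and pattern counts are illustrative only.

```python
import numpy as np

# Hebbian (outer-product) storage: patterns remain stable only while the
# number of stored patterns m stays roughly below n / (4 log n).
rng = np.random.default_rng(1)
n = 200
limit = n / (4 * np.log(n))          # theoretical Hebbian capacity estimate

def fraction_stable(m):
    """Fraction of m random patterns that are fixed points of sign(W s)."""
    Xi = rng.choice([-1.0, 1.0], size=(n, m))
    W = (Xi @ Xi.T) / n
    np.fill_diagonal(W, 0.0)         # no self-coupling
    ok = sum(np.array_equal(np.sign(W @ Xi[:, k]), Xi[:, k]) for k in range(m))
    return ok / m

f_low = fraction_stable(5)           # well below the limit
f_high = fraction_stable(60)         # well above the limit
print(f"n/(4 log n) ≈ {limit:.1f}")
print(f"stable fraction at m=5:  {f_low:.2f}")
print(f"stable fraction at m=60: {f_high:.2f}")
```

Below the limit essentially all patterns are stable; far above it the crosstalk noise overwhelms the signal term and most patterns cease to be fixed points.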
“…The Hebbian learning paradigm in Hopfield networks provides a biologically somewhat plausible and mathematically simple rule among other choices (see for example [5], [22]). We thus consider a Hopfield network with a training set of random patterns, except that some of these patterns are multiply stored while others, called simple patterns, are stored once as usual.…”
Section: Introduction
confidence: 99%
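The "multiply stored" setup this citing paper describes has a direct Hebbian formulation: each extra copy of a pattern simply adds its outer product to W again, deepening that pattern's basin of attraction. The sketch below is my own minimal illustration of that idea, not the cited authors' code; the multiplicities, network size, and noise level are invented for the example.

```python
import numpy as np

# Hebbian rule with multiplicities: a pattern stored c times contributes
# c * outer(p, p) to the weight matrix, strengthening its basin.
rng = np.random.default_rng(2)
n = 100
patterns = rng.choice([-1.0, 1.0], size=(5, n))
copies = [4, 1, 1, 1, 1]             # first pattern stored 4 times

W = sum(c * np.outer(p, p) for c, p in zip(copies, patterns)) / n
np.fill_diagonal(W, 0.0)             # no self-coupling

# Start from a corrupted copy of the multiply stored pattern (15 bits flipped)
noisy = patterns[0].copy()
flip = rng.choice(n, size=15, replace=False)
noisy[flip] *= -1

# One synchronous update step pulls the state back toward the pattern.
recalled = np.sign(W @ noisy)
overlap = float(recalled @ patterns[0]) / n
print(f"overlap with the multiply stored pattern after one step: {overlap:.2f}")
```

With multiplicity 4 the signal term dominates the crosstalk from the simple patterns, so recall from 15% corruption succeeds in a single step; a simple pattern at the same noise level would have a weaker pull.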
“…However, this problem becomes very simple in a special case, namely if we impose zero temperature (β = 0). In that case we just have to solve the following system of equations (cf. [12], [9] for a different approach at zero temperature)…”
Section: Experimental and Theoretical Results
confidence: 99%