2011
DOI: 10.1103/physreve.83.056114

Inference and learning in sparse systems with multiple states

Abstract: We discuss how inference can be performed when data are sampled from the nonergodic phase of systems with multiple attractors. We take as a model system the finite connectivity Hopfield model in the memory phase and suggest a cavity method approach to reconstruct the couplings when the data are separately sampled from a few attractor states. We also show how the inference results can be converted into a learning protocol for neural networks in which patterns are presented through weak external fields. The protoc…
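As a rough illustration of why inference from single-attractor data is delicate, here is a minimal Python sketch. It is not the paper's cavity-method algorithm: it builds Hebbian couplings for a small Hopfield network, draws samples by Glauber dynamics started inside one attractor, and applies a naive mean-field reconstruction, J ≈ −C⁻¹/β. The system size, number of patterns, inverse temperature, and sampling lengths are all arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hebbian couplings for a small Hopfield network (P patterns, N spins).
N, P = 20, 3
patterns = rng.choice([-1, 1], size=(P, N))
J_true = (patterns.T @ patterns).astype(float) / N
np.fill_diagonal(J_true, 0.0)

# Sample configurations with Glauber dynamics at inverse temperature beta,
# starting from one stored pattern so sampling stays inside that attractor.
beta, n_samples, sweeps_per_sample = 1.2, 1000, 2
s = patterns[0].astype(float).copy()
samples = np.empty((n_samples, N))
for t in range(n_samples):
    for _ in range(sweeps_per_sample):
        for i in rng.permutation(N):
            h = J_true[i] @ s
            p_up = 1.0 / (1.0 + np.exp(-2.0 * beta * h))
            s[i] = 1.0 if rng.random() < p_up else -1.0
    samples[t] = s

# Naive mean-field reconstruction: beta*J ~ -(C^{-1}) off-diagonal,
# where C is the connected correlation matrix of the samples.
m = samples.mean(axis=0)
C = samples.T @ samples / n_samples - np.outer(m, m)
C += 1e-6 * np.eye(N)                 # regularize before inversion
J_est = -np.linalg.inv(C) / beta
np.fill_diagonal(J_est, 0.0)
```

Because the samples come from a single attractor rather than the full Gibbs measure, this naive estimate is typically biased; handling exactly that regime is what the abstract's cavity-method approach is designed for.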

Cited by 19 publications (27 citation statements)
References 25 publications
“…Our first result, equation (8), is the saddle point equation corresponding to the regime where α is of order 1. This equation admits only one solution, which is the only minimum of the free energy, for all values of α.…”
Section: Discussion
confidence: 81%
“…Being idealizations used to model some aspects of the brain's behavior, they represent a new approach to the problem of computation, based on parallel processing of information. As a consequence, neural network research is multidisciplinary; models inspired by biological observations have been used to better understand emergent phenomena [1], pattern recognition and task reproduction [2], associative memory capacity [3], and neural development [4]; several aspects of the learning process have been investigated using recurrent [5] and spiking networks [6]; applications of neural networks have recently been developed for credit assignment [7] and Bayesian inference [8]. This research has also played a complementary role to in vivo studies [9][10][11].…”
Section: Introduction
confidence: 99%
“…The authors of Ref. [29] proposed a method for this task that works much better than Hebb's rule on sparse networks. Their method uses belief propagation to estimate correlations among the samples (patterns) and to learn the couplings.…”
Section: Conclusion and Discussion
confidence: 99%
“…A large amount of work has been devoted to increasing the capacity and retrieval abilities of neural networks by optimizing the learning rule or the topology of the network [23][24][25][26][27][28][29]. In the remainder of the paper we discuss optimizing the topology of the network. We know that a fully connected Hopfield network can store a number of patterns proportional to the number of neurons, but the fully connected topology is not biologically realistic.…”
Section: B. Controlling the Hopfield Network Using the Nonbacktracking
confidence: 99%
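The capacity statement in the quote above is easy to check numerically. The following Python sketch (illustrative parameters, standard textbook Hebb rule — not code from the cited works) stores P random patterns at a loading P/N well below the classic fully connected capacity of roughly 0.14N, then verifies retrieval from a corrupted cue with zero-temperature asynchronous dynamics.

```python
import numpy as np

rng = np.random.default_rng(1)

# Store P random patterns in an N-spin Hopfield network with Hebb's rule.
N, P = 200, 10                          # loading P/N = 0.05, well below capacity
patterns = rng.choice([-1, 1], size=(P, N))
J = (patterns.T @ patterns).astype(float) / N
np.fill_diagonal(J, 0.0)

# Start from a corrupted copy of pattern 0 (15% of spins flipped) and run
# zero-temperature asynchronous dynamics until no spin wants to flip.
s = patterns[0].copy()
flip = rng.choice(N, size=N * 15 // 100, replace=False)
s[flip] *= -1
for _ in range(50):                     # sweeps; usually converges in a few
    changed = False
    for i in rng.permutation(N):
        new = 1 if J[i] @ s >= 0 else -1
        if new != s[i]:
            s[i], changed = new, True
    if not changed:
        break

overlap = (s @ patterns[0]) / N         # overlap 1.0 means perfect retrieval
```

At this low loading the dynamics should fall back into the stored pattern's attractor; pushing P toward ~0.14N makes retrieval degrade, which is the capacity limit the quoted passage refers to.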
“…Previous studies of the inverse Ising problem on the Hopfield model [6][7][8][9][10] lack a systematic analysis for treating sparse networks. Inference of sparse networks also has important and wide applications in modeling vast amounts of biological data.…”
Section: Introduction
confidence: 99%