2017
DOI: 10.1007/s10955-017-1840-9
Neural Networks Retrieving Boolean Patterns in a Sea of Gaussian Ones

Abstract: Restricted Boltzmann machines are key tools in machine learning and are described by the energy function of bipartite spin-glasses. From a statistical mechanical perspective, they share the same Gibbs measure of Hopfield networks for associative memory. In this equivalence, weights in the former play as patterns in the latter. As Boltzmann machines usually require real weights to be trained with gradient-descent-like methods, while Hopfield networks typically store binary patterns to be able to retrieve, the i…
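The equivalence stated in the abstract can be made concrete with a one-line Gaussian integration; the following is a standard sketch (the 1/√N coupling scaling is a conventional choice, not necessarily the paper's exact normalization). Marginalizing Gaussian hidden units z_μ out of the bipartite (RBM) energy yields the Hopfield Hamiltonian over the visible spins:

```latex
\int \prod_{\mu=1}^{P} \frac{dz_\mu}{\sqrt{2\pi}}\, e^{-z_\mu^2/2}\,
\exp\!\Big(\frac{\beta}{\sqrt{N}} \sum_{i,\mu} \xi_i^\mu \sigma_i z_\mu\Big)
= \exp\!\Big(\frac{\beta^2}{2N} \sum_{\mu} \Big(\sum_i \xi_i^\mu \sigma_i\Big)^{2}\Big)
```

so the marginal Gibbs measure of the visible layer is that of a Hopfield network whose patterns ξ^μ are the RBM weights, which is the duality the abstract refers to.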

Cited by 32 publications (87 citation statements). References 59 publications.
“…This result does not depend on the particular pattern distribution P_ξ(ξ) (see also [3]), but it does clearly involve the spin priors. With these priors fixed, the transition takes place at an inverse temperature β_c(α) > 0 that is a function of α.…”
Section: Transition to the Spin Glass Phase
confidence: 79%
“…The maximal theoretical capacity reaches α_c = 2 for asymmetric networks. We stress that the normalization factor N^{-1} in front of the term σ_i^mix σ_j^mix is appropriately chosen if the unlearning algorithm is iterated O(N) times. We will deepen this point (the amplitude of the unlearning or consolidating rates) in Section 3 and in Appendix B.…”
Section: Contents
confidence: 99%
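The point about the N^{-1} amplitude can be illustrated with a minimal Hopfield unlearning sketch (a toy under assumed conventions, not the cited paper's code): start from Hebbian couplings, relax to an attractor from a random state, and subtract that attractor's correlation matrix with amplitude ε/N, iterated O(N) times:

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 100, 5                                    # toy sizes (illustrative choice)
xi = rng.choice([-1.0, 1.0], size=(P, N))        # Boolean patterns
J = xi.T @ xi / N                                # Hebbian couplings, factor 1/N
np.fill_diagonal(J, 0.0)

def relax(J, sigma, sweeps=5):
    """Zero-temperature asynchronous dynamics toward a fixed point."""
    for _ in range(sweeps):
        for i in rng.permutation(len(sigma)):
            sigma[i] = 1.0 if J[i] @ sigma >= 0 else -1.0
    return sigma

eps = 0.01                                       # unlearning rate (assumed value)
for _ in range(N):                               # O(N) unlearning iterations
    mix = relax(J, rng.choice([-1.0, 1.0], size=N))  # reached (possibly spurious) attractor
    J -= (eps / N) * np.outer(mix, mix)          # unlearning step, amplitude eps/N
    np.fill_diagonal(J, 0.0)
```

With amplitude ε/N, the cumulative modification after O(N) iterations stays of the same order as the Hebbian term itself, which is exactly the normalization issue the excerpt stresses.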
“…when dealing with mean-field spin-glasses [39] and mean-field bipartite spin-glasses [40], the coupling distribution is proved not to affect the resulting pressure, provided that it is centered, symmetrical and with finite variance. This property was extended to the Hopfield model in [12], where it was shown that the quenched noise contribution appearing in the expression for the pressure (which stems from the P − 1 non-retrieved patterns and which tends to inhibit retrieval) exhibits the very same shape, regardless of the nature of the pattern entries (that is, digital, e.g. Boolean, or analog, e.g. Gaussian).…”
Section: Definition 4, The Intensive Quenched Pressure of the Classical…
confidence: 97%
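The universality this excerpt describes (the quenched noise from non-retrieved patterns looking the same for Boolean and Gaussian entries) can be eyeballed with a small simulation. This is an illustrative sketch, not the paper's computation: store one Boolean pattern together with P − 1 background patterns drawn either Boolean or Gaussian (both zero mean, unit variance), and compare retrieval of the condensed pattern:

```python
import numpy as np

rng = np.random.default_rng(1)
N, P = 200, 10                                   # load alpha = P/N = 0.05, deep in the retrieval phase

def retrieval_overlap(background):
    """Store one Boolean pattern plus (P-1) background patterns, then
    measure how well zero-temperature dynamics retrieves the Boolean one."""
    xi = rng.choice([-1.0, 1.0], size=N)         # condensed Boolean pattern
    patterns = np.vstack([xi, background])
    J = patterns.T @ patterns / N                # Hebbian couplings
    np.fill_diagonal(J, 0.0)
    sigma = xi.copy()
    sigma[rng.choice(N, size=20, replace=False)] *= -1.0  # corrupt 10% of spins
    for _ in range(10):                          # asynchronous zero-T sweeps
        for i in rng.permutation(N):
            sigma[i] = 1.0 if J[i] @ sigma >= 0 else -1.0
    return abs(sigma @ xi) / N                   # Mattis overlap with the stored pattern

boolean_bg = rng.choice([-1.0, 1.0], size=(P - 1, N))
gaussian_bg = rng.standard_normal((P - 1, N))    # same mean and variance as the Boolean case
m_bool = retrieval_overlap(boolean_bg)
m_gauss = retrieval_overlap(gaussian_bg)
```

At this low load both overlaps come out near 1: the crosstalk from the background patterns has the same variance in the two cases, consistent with the universality property quoted above.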
“…The latter, exhibiting a cost function that is an (infinite and convergent) series of monomials in the microscopic variables (i.e., the neural activities), offers not only a perfect playground for testing our methods, but also an interesting example of dense architectures [28,29]. The translation of the statistical-mechanics problem into a mechanical framework is based on the analogy between the variational principles in statistical mechanics (e.g., maximum entropy and minimal energy) and the least-action principle in analytical mechanics: this route was already paved for ferromagnetic models [30-32], for spin glasses [33-35] and for (simpler) neural networks [12,27]. A main advantage is that it allows one to paint the phase diagrams of the model under study by relying upon tools originally developed in the analytical counterpart (i.e.…”
Section: Introduction
confidence: 99%