2008
DOI: 10.1109/tnn.2008.2003249

Boltzmann Machines Reduction by High-Order Decimation

Abstract: Decimation is a common technique in statistical physics that is used in the context of Boltzmann machines (BMs) to drastically reduce the computational cost at the learning stage. Decimation makes it possible to evaluate analytically quantities that would otherwise have to be estimated statistically by means of Monte Carlo (MC) simulations. However, in its original formulation, this method could only be applied to restricted topologies corresponding to sparsely connected neural networks. In this brief, we present a generalization…
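The pairwise decimation rule that the abstract builds on can be checked numerically. The sketch below is a minimal illustration, assuming Ising ±1 units and the classical series rule tanh(J_eff) = tanh(J1)·tanh(J2); it is not the paper's high-order generalization, and the function names (decimate_series, brute_force_marginal, decimated_marginal) are illustrative only.

```python
import numpy as np

def decimate_series(j1, j2):
    """Effective coupling after summing out one hidden +/-1 unit
    connected in series to two other units (classical pairwise rule)."""
    return np.arctanh(np.tanh(j1) * np.tanh(j2))

def brute_force_marginal(j1, j2):
    """Marginal of (s1, s2) obtained by explicitly summing the hidden
    spin sigma out of exp(j1*s1*sigma + j2*sigma*s2)."""
    states = [-1, +1]
    z = np.zeros((2, 2))
    for a, s1 in enumerate(states):
        for b, s2 in enumerate(states):
            z[a, b] = sum(np.exp(j1 * s1 * sig + j2 * sig * s2) for sig in states)
    return z / z.sum()

def decimated_marginal(j1, j2):
    """Same marginal written with the single effective weight j_eff."""
    j_eff = decimate_series(j1, j2)
    states = [-1, +1]
    z = np.array([[np.exp(j_eff * s1 * s2) for s2 in states] for s1 in states])
    return z / z.sum()

j1, j2 = 0.7, -1.3
# The analytically decimated model reproduces the Monte-Carlo-free
# exact marginal, which is the point of decimation at learning time.
assert np.allclose(brute_force_marginal(j1, j2), decimated_marginal(j1, j2))
print("effective coupling:", decimate_series(j1, j2))
```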

Cited by 2 publications (1 citation statement)
References 13 publications
“…The Parity problem is known to be very difficult to learn with classical neural networks [26,27]. It is easy to understand that this is a very difficult problem to learn in the context of Boltzmann Machines as in a high order model it requires a single weight connecting all units simultaneously [28]. This problem was tested with 8 and 10 input variables (P08 and P10, respectively).…”
Section: Data Sets
confidence: 99%
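The citation statement notes that, in a high-order model, parity requires a single weight connecting all units simultaneously. The snippet below is a hypothetical illustration rather than code from the cited work: it assumes ±1 spins and shows that one N-th order term, E(s) = -w * s_1 * ... * s_N, already separates even-parity from odd-parity patterns.

```python
import itertools
import numpy as np

def parity_energy(s, w):
    """One N-th order weight connecting all units at once:
    E(s) = -w * s_1 * s_2 * ... * s_N for +/-1 spins."""
    return -w * np.prod(s)

n, w = 8, 2.0   # 8 input variables, as in the P08 data set mentioned above
for bits in itertools.product([0, 1], repeat=n):
    s = np.array([1 - 2 * b for b in bits])   # map {0,1} -> {+1,-1}
    even = (sum(bits) % 2 == 0)
    # prod(s) = (-1)^(number of ones), so the single weight w assigns
    # energy -w to every even-parity pattern and +w to every odd one.
    assert (parity_energy(s, w) < 0) == even
print("a single N-th order weight separates even from odd parity")
```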