2006
DOI: 10.1103/physrevlett.96.030201

Learning by Message Passing in Networks of Discrete Synapses

Abstract: We show that a message-passing process allows us to store in binary "material" synapses a number of random patterns which almost saturates the information-theoretic bounds. We apply the learning algorithm to networks characterized by a wide range of different connection topologies and of size comparable with that of biological systems (e.g., [EQUATION: SEE TEXT]). The algorithm can be turned into an online, fault-tolerant learning protocol of potential interest in modeling aspects of synaptic plasticity and in b…
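
To make the storage task concrete, here is a minimal sketch of the problem the abstract refers to: a perceptron with binary (±1) synapses must classify random ±1 patterns with prescribed random labels, and learning succeeds when no pattern is misclassified. The code below is an illustration only, not the authors' message-passing algorithm; the size N and the load alpha are assumptions chosen for the example.

```python
import numpy as np

def make_dataset(n_synapses, n_patterns, seed=0):
    """Random +/-1 input patterns and random +/-1 labels (the classic storage problem)."""
    rng = np.random.default_rng(seed)
    patterns = rng.choice([-1, 1], size=(n_patterns, n_synapses))
    labels = rng.choice([-1, 1], size=n_patterns)
    return patterns, labels

def misclassified(weights, patterns, labels):
    """Number of patterns whose label disagrees with sign(w . xi)."""
    return int(np.sum(np.sign(patterns @ weights) != labels))

if __name__ == "__main__":
    N = 1001        # number of binary synapses (odd, so the +/-1 dot product is never zero)
    alpha = 0.5     # load: patterns per synapse; learning should drive the error count to zero
    patterns, labels = make_dataset(N, int(alpha * N))
    w = np.random.default_rng(1).choice([-1, 1], size=N)   # a random +/-1 candidate weight vector
    print("misclassified patterns:", misclassified(w, patterns, labels))
```

For reference, a binary perceptron can store random patterns only up to a finite critical load (roughly 0.83 patterns per synapse in the classical analysis), which is the information-theoretic scale the abstract says the algorithm nearly saturates.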


Cited by 134 publications (222 citation statements) | References 37 publications
“…Additionally, it is the most commonly used measure in recent publications in this field (e.g., in Ref. [27]). We use the cost function (2) as a measure of the performance of the studied algorithms; however, the algorithms themselves have been derived by statistical physics methods and rely on minimization of an extensive energy given by the number of misclassified patterns…”
Section: Exemplar Problem: The Binary Ising Perceptron (mentioning)
confidence: 99%
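
For concreteness, the extensive energy mentioned in this citation, the number of misclassified patterns, can be written as follows for the binary Ising perceptron with weights W_i ∈ {−1, +1}, patterns ξ^μ, and labels σ^μ; the 1/√N normalization is a conventional choice, and Eq. (2) of the citing paper may differ in detail:

```latex
E(\mathbf{W}) \;=\; \sum_{\mu=1}^{M} \Theta\!\left( -\,\sigma^{\mu}\,
\frac{1}{\sqrt{N}} \sum_{i=1}^{N} W_i\,\xi_i^{\mu} \right),
\qquad W_i \in \{-1,+1\},
```

where Θ is the Heaviside step function, so each misclassified pattern contributes one unit of energy and E = 0 corresponds to perfect storage.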
“…The reason for choosing PT is that it is a well-established parallel algorithm with good performance in searching for solutions in the BIP capacity problem. Other derivatives of BP-based algorithms have been used to solve the BIP capacity problem, for instance, survey propagation [6,27]; the latter also aims to address the fragmentation of solution space but employs a different approach. The results reported in [6,27] show that solutions can be found very close to the theoretical limits even for large systems, but additional practical techniques and considerations should be used to successfully obtain solutions.…”
Section: Performance (mentioning)
confidence: 99%
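
Parallel tempering (PT) itself is a generic Monte Carlo scheme; the sketch below shows one simple way to apply it to the binary Ising perceptron, with the misclassified-pattern count as the energy. It is an illustration with assumed temperatures and sizes, not the benchmark implementation used in the citing work, and the single-spin flips recompute the full energy for clarity rather than speed.

```python
import numpy as np

def energy(w, patterns, labels):
    """Number of misclassified patterns for a +/-1 weight vector w."""
    return int(np.sum(np.sign(patterns @ w) != labels))

def parallel_tempering(patterns, labels, betas, sweeps=200, seed=0):
    """Minimal PT: single-spin-flip Metropolis at each inverse temperature,
    plus replica-exchange attempts between neighbouring temperatures."""
    rng = np.random.default_rng(seed)
    n = patterns.shape[1]
    replicas = [rng.choice([-1, 1], size=n) for _ in betas]
    energies = [energy(w, patterns, labels) for w in replicas]
    best_w, best_e = replicas[0].copy(), energies[0]

    for _ in range(sweeps):
        # Metropolis sweep within each replica.
        for r, beta in enumerate(betas):
            for _ in range(n):
                i = rng.integers(n)
                replicas[r][i] *= -1                      # propose a single spin flip
                e_new = energy(replicas[r], patterns, labels)
                de = e_new - energies[r]
                if de <= 0 or rng.random() < np.exp(-beta * de):
                    energies[r] = e_new                   # accept
                else:
                    replicas[r][i] *= -1                  # reject: undo the flip
            if energies[r] < best_e:
                best_w, best_e = replicas[r].copy(), energies[r]
        # Replica-exchange attempts between neighbouring temperatures.
        for r in range(len(betas) - 1):
            d = (betas[r] - betas[r + 1]) * (energies[r] - energies[r + 1])
            if d >= 0 or rng.random() < np.exp(d):
                replicas[r], replicas[r + 1] = replicas[r + 1], replicas[r]
                energies[r], energies[r + 1] = energies[r + 1], energies[r]
    return best_w, best_e

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    N, M = 101, 30                                   # small instance, assumed sizes
    patterns = rng.choice([-1, 1], size=(M, N))
    labels = rng.choice([-1, 1], size=M)
    w, e = parallel_tempering(patterns, labels, betas=np.linspace(0.2, 2.0, 8))
    print("best energy (misclassified patterns):", e)
```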
“…25-27, which have considered similar graph-theoretic problems††. BP allows one to derive results on ensembles of graphs in the limit |N| → ∞ (18), but it also provides efficient heuristic algorithms to find collaborative equilibria on given graph instances (28, 29). The messages which are exchanged in the BP algorithm are the probabilities μ_{i→j} = P{i ∈ Γ_j ∩ C} that player i collaborates and punishes j, in the collaborative equilibrium‡‡.…”
Section: The Complexity of Collaboration on Network (mentioning)
confidence: 99%
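
As a generic illustration of how such edge messages μ_{i→j} are iterated, the sketch below runs a standard sum-product loop on a small pairwise binary model: one probability table per directed edge, updated until convergence, then combined into marginals. The graph, potentials, and names are assumptions for the example; this is not the collaboration-and-punishment model of the cited work.

```python
import numpy as np

# A small pairwise binary model on a tree: edges 0-1, 1-2, 1-3 (assumed for the example).
edges = [(0, 1), (1, 2), (1, 3)]
n_vars = 4
unary = {i: np.array([1.0, 1.0]) for i in range(n_vars)}   # flat priors
unary[0] = np.array([0.8, 0.2])                            # bias variable 0 toward state 0
coupling = np.array([[2.0, 1.0],                           # pairwise potential favouring agreement
                     [1.0, 2.0]])

neighbours = {i: [] for i in range(n_vars)}
for a, b in edges:
    neighbours[a].append(b)
    neighbours[b].append(a)

# One message per directed edge, initialised uniform: messages[(i, j)][x_j].
messages = {(i, j): np.array([0.5, 0.5]) for i in neighbours for j in neighbours[i]}

for sweep in range(50):
    new = {}
    for (i, j) in messages:
        # Product of i's unary term and all incoming messages except the one from j.
        belief_i = unary[i].copy()
        for k in neighbours[i]:
            if k != j:
                belief_i *= messages[(k, i)]
        # Sum over x_i against the pairwise potential to obtain a function of x_j.
        msg = coupling.T @ belief_i
        new[(i, j)] = msg / msg.sum()
    delta = max(np.abs(new[e] - messages[e]).max() for e in messages)
    messages = new
    if delta < 1e-9:
        break

for i in range(n_vars):
    b = unary[i].copy()
    for k in neighbours[i]:
        b *= messages[(k, i)]
    print(f"marginal of variable {i}: {b / b.sum()}")
```

On a tree, as here, these marginals are exact; on loopy graphs the same loop is used as a heuristic.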
“…It was shown empirically that SP rarely makes any mistakes in its decimation, and SP solves very large 3-SAT instances that are very hard for local search algorithms. Recently, Braunstein and Zecchina (2006) have shown that by modifying BP and SP updates with a reinforcement term, the effectiveness of these algorithms as solvers can be further improved.…”
Section: Related Work on SAT (mentioning)
confidence: 99%
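
One way to picture the reinforcement idea: run BP, and at every sweep fold part of each variable's current local field back in as an external bias, so the marginals progressively polarize onto a single configuration instead of requiring decimation. The sketch below applies one simple variant of this to a toy problem, a small ±J Ising ground-state search using the scalar cavity-field form of BP; the couplings, schedule, and parameter values are assumptions, and it is not the SAT solver discussed in this citation.

```python
import numpy as np

def reinforced_bp_ising(J, beta=2.0, gamma=0.05, sweeps=300, seed=0):
    """Pairwise-Ising BP in cavity-field form, with a reinforcement bias h_ext
    that drifts toward each spin's current local field and so accumulates over
    sweeps. Returns the configuration read off from the polarised fields."""
    rng = np.random.default_rng(seed)
    n = J.shape[0]
    nbrs = {i: [j for j in range(n) if J[i, j] != 0.0] for i in range(n)}
    # Cavity bias u[(k, i)]: effect of spin k on spin i with i's other edges removed.
    u = {(k, i): 0.01 * rng.standard_normal() for i in nbrs for k in nbrs[i]}
    h_ext = np.zeros(n)
    h_loc = np.zeros(n)

    for t in range(sweeps):
        for i in range(n):
            for j in nbrs[i]:
                # Cavity field on i excluding edge (i, j), including the reinforcement bias.
                h_cav = h_ext[i] + sum(u[(k, i)] for k in nbrs[i] if k != j)
                u[(i, j)] = np.arctanh(np.tanh(beta * J[i, j]) * np.tanh(beta * h_cav)) / beta
        # Full local fields from the current messages.
        h_loc = np.array([h_ext[i] + sum(u[(k, i)] for k in nbrs[i]) for i in range(n)])
        # Reinforcement: nudge the external bias toward the current local field,
        # so the bias builds up and the marginals polarise toward one configuration.
        h_ext = (1.0 - gamma) * h_ext + gamma * h_loc
    return np.where(h_loc >= 0, 1, -1)

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    n = 12
    J = np.triu(rng.choice([-1.0, 0.0, 1.0], p=[0.2, 0.6, 0.2], size=(n, n)), 1)
    J = J + J.T                                  # random sparse +/-J couplings, assumed instance
    s = reinforced_bp_ising(J)
    print("energy of returned configuration:", -0.5 * s @ J @ s)
```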