2018
DOI: 10.48550/arxiv.1808.04873
Preprint

Generalization of Equilibrium Propagation to Vector Field Dynamics

Abstract: The biological plausibility of the backpropagation algorithm has long been doubted by neuroscientists. Two major reasons are that neurons would need to send two different types of signal in the forward and backward phases, and that pairs of neurons would need to communicate through symmetric bidirectional connections. We present a simple two-phase learning procedure for fixed point recurrent networks that addresses both these issues. In our model, neurons perform leaky integration and synaptic weights are updated…
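The two-phase procedure described in the abstract is compact enough to sketch directly. Below is a minimal NumPy illustration of one update on a toy network with leaky-integration neurons and distinct (untied) forward and backward weights, in the spirit of the vector-field setting; the layer sizes, the hard-sigmoid activation, and all hyperparameters are illustrative assumptions, not the paper's experimental setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy network: input x is clamped; hidden h and output y relax freely.
n_in, n_hid, n_out = 4, 8, 2
W1 = rng.normal(0.0, 0.5, (n_hid, n_in))   # input -> hidden
W2 = rng.normal(0.0, 0.5, (n_out, n_hid))  # hidden -> output (forward)
B2 = rng.normal(0.0, 0.5, (n_hid, n_out))  # output -> hidden (NOT tied to W2.T)

def rho(s):
    return np.clip(s, 0.0, 1.0)            # hard-sigmoid activation (assumed)

def relax(x, target=None, beta=0.0, steps=200, dt=0.5):
    """Leaky-integration dynamics run to an approximate fixed point.
    beta > 0 weakly nudges the output toward the target (second phase)."""
    h, y = np.zeros(n_hid), np.zeros(n_out)
    for _ in range(steps):
        dh = -h + W1 @ rho(x) + B2 @ rho(y)
        dy = -y + W2 @ rho(h)
        if target is not None:
            dy += beta * (target - y)
        h, y = h + dt * dh, y + dt * dy
    return h, y

x, t = rng.random(n_in), np.array([1.0, 0.0])
beta, lr = 0.5, 0.05

h0, y0 = relax(x)                          # phase 1: free relaxation
hb, yb = relax(x, target=t, beta=beta)     # phase 2: weakly clamped

# Local contrastive updates: each synapse only needs its own pre- and
# post-synaptic activities at the two fixed points.
W1 += lr / beta * np.outer(rho(hb) - rho(h0), rho(x))
W2 += lr / beta * (np.outer(rho(yb), rho(hb)) - np.outer(rho(y0), rho(h0)))
B2 += lr / beta * (np.outer(rho(hb), rho(yb)) - np.outer(rho(h0), rho(y0)))
```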

Cited by 13 publications (23 citation statements)
References 19 publications
“…Our work shares similarities, and its ultimate goal, with a whole field of research that aims to improve current neural networks using techniques from computational neuroscience. In fact, the biological implausibility and limitations of BP highlighted in [39] have fueled the search for new learning algorithms to train ANNs, with the most promising candidates being energy-based models such as equilibrium propagation [40,41], Boltzmann machines [31] (including deep Boltzmann machines [42] and deep belief networks [43]), and PC [7]. The latter is the main field of interest in this paper.…”
Section: Related Work
confidence: 99%
“…We adapt the three-phase procedure detailed above to this setting to compute the common update of the forward and backward weights, thereby defining the gradient estimate ∇^KP-VF_sym. We also define ∇^VF_sym, the gradient estimate obtained by applying the three-phase procedure to the earlier VF approach of Scellier et al. [2018] and Ernoult et al. [2020]. Experimental results.…”
Section: Contributions of This Work: Scaling EP Training
confidence: 99%
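The three-phase recipe quoted above (one free phase, then two phases nudged with opposite signs of β) can be checked on a toy scalar system whose fixed points have closed form. The energy, cost, and constants below are invented for the check; only the symmetric-difference estimator itself reflects the cited procedure.

```python
# Toy check: energy E(theta, s) = 0.5*s**2 - theta*x*s (free fixed point
# s* = theta*x) and cost C(s) = 0.5*(s - t)**2.  Phases 2 and 3 relax
# E + beta*C with opposite signs of beta; here the minimizer is closed-form.
x, t, theta, beta = 1.5, 2.0, 0.7, 1e-3

def nudged_fixed_point(beta):
    # Minimizer of E + beta*C:  (s - theta*x) + beta*(s - t) = 0
    return (theta * x + beta * t) / (1.0 + beta)

def dE_dtheta(s):
    return -x * s                    # partial derivative of E w.r.t. theta

s_plus = nudged_fixed_point(+beta)   # phase 2: nudge toward the target
s_minus = nudged_fixed_point(-beta)  # phase 3: nudge away from the target
grad_sym = (dE_dtheta(s_plus) - dE_dtheta(s_minus)) / (2.0 * beta)

grad_true = (theta * x - t) * x      # exact gradient of L(theta) = C(s*(theta))
print(grad_sym, grad_true)           # both ~ -1.425, agreeing to O(beta^2)
```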
“…We also propose to implement the neural network predictor as an external softmax readout, which subsequently allows us to use the cross-entropy loss, in contrast to previous approaches that used the squared error loss. Finally, based on ideas of Scellier et al. [2018] and Kolen and Pollack [1994], we adapt the learning rule of EP to architectures with distinct forward and backward connections, yielding only a 1.5% accuracy degradation on CIFAR-10 compared to bidirectional connections.…”
Section: Introduction
confidence: 99%
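The Kolen-Pollack idea referenced in this snippet can be sketched schematically: the independently stored forward and backward weights receive the same update plus weight decay, so their misalignment decays geometrically even though neither is ever copied into the other. In the sketch below the common update is a random stand-in for the EP update (the hypothetical `delta`), and all sizes and rates are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)

n_pre, n_post = 6, 3
W = rng.normal(size=(n_post, n_pre))  # forward connections
B = rng.normal(size=(n_pre, n_post))  # backward connections, not W.T at first

lr, decay = 0.1, 0.05
for _ in range(200):
    delta = rng.normal(size=(n_post, n_pre))  # stand-in for the common update
    W = (1 - decay) * W + lr * delta
    B = (1 - decay) * B + lr * delta.T
    # The shared update cancels in W - B.T, so the misalignment contracts
    # by a factor (1 - decay) every step.

print(np.linalg.norm(W - B.T))  # ~1e-4: forward and backward weights aligned
```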
“…EqProp is a general algorithm for computing error gradients in EBMs, inspired by the contrastive learning algorithm for Boltzmann machines [Ackley et al., 1985] and Hopfield networks [Movellan, 1991]. Previous works have mostly studied EqProp in the setting of classification tasks, training the Hopfield model and variants of it [Scellier and Bengio, 2017, Scellier et al., 2018, Khan, 2018, O'Connor et al., 2018, O'Connor et al., 2019, Ernoult et al., 2019, Zoppo et al., 2020]. However, EqProp is a much more general method.…”
Section: Related Work
confidence: 99%
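Since this snippet stresses that EqProp computes error gradients for general EBMs, here is a hedged sketch of the classic one-sided estimate on a small symmetric Hopfield-style energy; the quadratic energy, squared-error cost, and all constants are illustrative choices, not taken from any of the cited papers.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hopfield-style energy E(s) = 0.5*s's - 0.5*s'Js - b's with symmetric J.
n, n_out = 5, 2
J = rng.normal(0.0, 0.3, (n, n))
J = 0.5 * (J + J.T)
np.fill_diagonal(J, 0.0)
J /= 1.1 * np.max(np.abs(np.linalg.eigvalsh(J)))  # spectral radius < 1: E convex
b = rng.normal(0.0, 0.1, n)

def relax(target=None, beta=0.0, steps=500, eta=0.1):
    """Gradient flow on E (free phase) or on E + beta*C (nudged phase),
    where C(s) = 0.5*||s_out - target||^2 acts on the last n_out units."""
    s = np.zeros(n)
    for _ in range(steps):
        grad = s - J @ s - b                                # dE/ds
        if target is not None:
            grad[-n_out:] += beta * (s[-n_out:] - target)   # + beta * dC/ds
        s -= eta * grad
    return s

target, beta, lr = np.array([1.0, -1.0]), 0.01, 0.1
s0 = relax()               # free fixed point
sb = relax(target, beta)   # weakly clamped fixed point

# dE/dJ = -0.5 * outer(s, s); EqProp estimates the loss gradient w.r.t. J by
# a finite difference of this quantity across the two phases.
grad_J = -0.5 * (np.outer(sb, sb) - np.outer(s0, s0)) / beta
J -= lr * grad_J           # descend the estimated gradient
```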