2017
DOI: 10.48550/arxiv.1711.08416
Equivalence of Equilibrium Propagation and Recurrent Backpropagation

Cited by 6 publications (8 citation statements). References 0 publications.
“…In this appendix we study the dynamics itself in the second phase, when the state of the network moves from the free fixed point s^0_θ to the weakly clamped fixed point s^β_θ. The result established in this section is a straightforward generalization of the result proved in Scellier and Bengio [2017b].…”

Section: B Link To Recurrent Backpropagation (supporting)

confidence: 60%
“…and by Theorems 3 and 4 we get S̃_t = S_t and Θ̃_t = Θ_t. This result was stated and proved in Scellier and Bengio [2017b].…”

Section: Appendix (mentioning)

confidence: 63%
“…RBP is naturally applicable here and is demonstrated to save both computation time and memory. A recent investigation (Scellier & Bengio, 2017a) shows that RBP is related to equilibrium propagation (Scellier & Bengio, 2017b), which is motivated from the perspective of biological plausibility. Another recent related work in deep learning is OptNet (Amos & Kolter, 2017), where the gradient of the optimized solution of a quadratic programming problem w.r.t.…”

Section: Related Work (mentioning)

confidence: 99%
“…Only recently have biologically plausible proposals been made to adapt backpropagation to spiking and continuous-time neural networks with reasonable success. Local contrastive Hebbian learning in energy-based models is one class of proposals that has shown a level of equivalence with recurrent backpropagation [6,7,8]. In these learning algorithms, the network minimizes the difference between its fixed points when running freely and when clamped to an external input for error correction.…”

Section: Introduction (mentioning)

confidence: 99%
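The two-phase scheme described above (a free relaxation, then a weakly clamped relaxation, with a contrastive Hebbian weight update) can be sketched as follows. This is a minimal toy illustration, not the cited papers' exact setup: the network sizes, the hard-sigmoid activation, the leaky relaxation dynamics, and the nudging strength `beta` are all illustrative assumptions.

```python
import numpy as np

# Toy sketch of equilibrium propagation's two-phase update
# (in the spirit of Scellier & Bengio, 2017b). All names and
# hyperparameters below are illustrative assumptions.

rng = np.random.default_rng(0)
n_in, n_hid, n_out = 4, 8, 2
W1 = rng.normal(0, 0.1, (n_in, n_hid))   # input  -> hidden weights
W2 = rng.normal(0, 0.1, (n_hid, n_out))  # hidden -> output weights

def rho(s):
    """Hard-sigmoid activation, as commonly used in this setting."""
    return np.clip(s, 0.0, 1.0)

def relax(x, y, beta, steps=200, dt=0.1):
    """Relax the state to a fixed point; beta > 0 weakly clamps the
    output toward the target y (beta = 0 gives the free phase)."""
    h = np.zeros(n_hid)
    o = np.zeros(n_out)
    for _ in range(steps):
        dh = -h + rho(x @ W1) + rho(o @ W2.T)  # leaky drive toward inputs
        do = -o + rho(h @ W2)
        if beta != 0.0:
            do += beta * (y - o)               # weak clamping term
        h, o = h + dt * dh, o + dt * do
    return h, o

x = rng.random(n_in)
y = np.array([1.0, 0.0])
beta, lr = 0.5, 0.1

# Phase 1: free fixed point.  Phase 2: weakly clamped fixed point.
h0, o0 = relax(x, y, beta=0.0)
hb, ob = relax(x, y, beta=beta)

# Contrastive Hebbian update: the difference of correlations at the two
# fixed points, scaled by 1/beta, approximates the loss gradient.
W2 += lr / beta * (np.outer(rho(hb), rho(ob)) - np.outer(rho(h0), rho(o0)))
W1 += lr / beta * (np.outer(rho(x), rho(hb)) - np.outer(rho(x), rho(h0)))
```

Note that only quantities local to each synapse (pre- and post-synaptic activities at the two fixed points) enter the update, which is what makes this class of algorithms attractive from a biological-plausibility standpoint.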