2021
DOI: 10.48550/arxiv.2112.11399
Preprint

Physical learning beyond the quasistatic limit

Menachem Stern, Sam Dillavou, Marc Z. Miskin, et al.

Abstract: Physical networks, such as biological neural networks, can learn desired functions without a central processor, using local learning rules in space and time to learn in a fully distributed manner. Learning approaches such as equilibrium propagation, directed aging, and coupled learning similarly exploit local rules to accomplish learning in physical networks such as mechanical, flow, or electrical networks. In contrast to certain natural neural networks, however, such approaches have so far been restricted to …
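For orientation, the abstract refers to coupled learning, a rule in which each element of a physical network updates itself by comparing two states of the system: a "free" state and a "clamped" state nudged toward the desired output. The following is a minimal sketch of the standard quasistatic version of that rule on a toy resistor network, not the authors' beyond-quasistatic method; the network layout, node and edge choices, nudge amplitude eta, and learning rate alpha are all illustrative assumptions rather than values from the paper.

import numpy as np

# Toy flow/electrical network: 4 nodes, edges as (i, j) pairs; the edge
# conductances k are the local "learning degrees of freedom" (assumed values).
edges = [(0, 2), (1, 2), (2, 3), (0, 3), (1, 3)]
k = np.ones(len(edges))

source_nodes, source_vals = [0, 1], [1.0, 0.0]   # imposed input voltages
target_nodes, target_vals = [3], [0.3]           # desired output voltage

def solve(k, fixed_nodes, fixed_vals, n_nodes=4):
    # Solve the linear network (weighted graph Laplacian) for node voltages,
    # holding the listed nodes at fixed values (Dirichlet conditions).
    L = np.zeros((n_nodes, n_nodes))
    for (i, j), ke in zip(edges, k):
        L[i, i] += ke; L[j, j] += ke
        L[i, j] -= ke; L[j, i] -= ke
    V = np.zeros(n_nodes)
    V[fixed_nodes] = fixed_vals
    free = [n for n in range(n_nodes) if n not in fixed_nodes]
    V[free] = np.linalg.solve(L[np.ix_(free, free)],
                              -L[np.ix_(free, fixed_nodes)] @ V[fixed_nodes])
    return V

eta, alpha = 0.1, 0.2    # nudge amplitude and learning rate (illustrative)
for step in range(2000):
    # Free state: only the inputs are imposed.
    V_F = solve(k, source_nodes, source_vals)
    # Clamped state: outputs are nudged slightly toward their targets.
    clamp = V_F[target_nodes] + eta * (np.array(target_vals) - V_F[target_nodes])
    V_C = solve(k, source_nodes + target_nodes,
                np.concatenate([source_vals, clamp]))
    # Local rule: each edge compares only its own free and clamped voltage
    # drops -- no central processor or global gradient is needed.
    dV_F = np.array([V_F[i] - V_F[j] for i, j in edges])
    dV_C = np.array([V_C[i] - V_C[j] for i, j in edges])
    k += (alpha / eta) * 0.5 * (dV_F**2 - dV_C**2)
    k = np.clip(k, 1e-3, None)   # keep conductances positive (physical)

print("trained output voltage:", solve(k, source_nodes, source_vals)[target_nodes])

The update vanishes once the free output matches the target (the clamped and free states then coincide), and each edge uses only quantities it can measure locally, which is the sense in which such learning is fully distributed. In this quasistatic sketch the network is assumed to fully relax before every update; relaxing that assumption is the subject of the preprint itself.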

Cited by 1 publication (1 citation statement)
References 19 publications
“…We also must explore going beyond the quasistatic limit as time scales also constrain biological learning systems. Efforts towards this goal have recently been made in silico and in experiments using coupled learning [38]. Similar extensions can be implemented using our algorithm.…”
Section: Discussion
Confidence: 99%