Learning to self-fold at a bifurcation

Physical Review E 107, 025001 (2023). DOI: 10.1103/physreve.107.025001

Cited by 8 publications (2 citation statements). References 49 publications.
“…[46][47][48] Recently, employing the potential energies of the two states of the system as the two different inputs, contrastive Hebbian learning has inspired the development of completely distributed and physics-driven learning machines. [49][50][51][52][53] In those designs, learning degrees of freedom are updated based only on local conditions; consequently, the method can be more scalable. The active bond contains a strain sensor, a controller, and an actuating element that modulates the stiffness of the bond.…”
Section: Introduction (mentioning)
confidence: 99%
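The "completely distributed, physics-driven" character of such schemes is easiest to see in a toy example. Below is a minimal sketch of a coupled-learning-style contrastive update, one member of the contrastive family cited above, for a chain of two springs between fixed walls; the geometry, variable names, and hyperparameters are all illustrative assumptions, not taken from the paper. Each spring adjusts its own stiffness using only its own extension in the free and clamped states.

```python
# Minimal sketch of a coupled-learning-style contrastive update (assumed
# toy system: two linear springs in series between walls at 0 and L, one
# free node at position u). All names and parameters are illustrative.
import numpy as np

L = 1.0                   # distance between the walls
k = np.array([1.0, 1.0])  # stiffnesses = the learning degrees of freedom
u_target = 0.7            # desired equilibrium position of the free node
alpha, eta = 0.05, 0.1    # learning rate and nudge amplitude

def equilibrium(k):
    """Free-state equilibrium of E = 0.5*k1*u^2 + 0.5*k2*(L-u)^2."""
    return k[1] * L / (k[0] + k[1])

def extensions(u):
    """Each spring's extension: a purely local quantity."""
    return np.array([u, L - u])

for step in range(2000):
    u_free = equilibrium(k)                       # free state: physics relaxes
    u_clamp = u_free + eta * (u_target - u_free)  # clamped state: nudged output
    eps_F, eps_C = extensions(u_free), extensions(u_clamp)
    # Local rule: each bond updates from its own strain in the two states,
    # Delta k_i = (alpha / (2*eta)) * (eps_F_i**2 - eps_C_i**2).
    k += (alpha / (2 * eta)) * (eps_F**2 - eps_C**2)
    k = np.clip(k, 1e-3, None)                    # keep stiffnesses physical

print(f"trained equilibrium: {equilibrium(k):.4f} (target {u_target})")
```

No spring ever sees the global error, only its own strain in the two states, which is why hardware realizations need just a strain sensor, a controller, and a stiffness-modulating actuator per bond.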
“…Local rules have also been exploited to train laboratory non-biological mechanical networks to exhibit auxetic behavior [8,9] or protein-inspired functions [10][11][12]. Other local rules have been proposed for associative memory [13][14][15][16][17]; one of them has even been demonstrated in the lab [18]. Here we will focus on a powerful set of local rules based on the framework of Contrastive Hebbian Learning, which perform approximate gradient descent of a cost function [19][20][21][22][23][24][25][26].…”
Section: Introduction (mentioning)
confidence: 99%
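To make the "approximate gradient descent of a cost function" claim concrete, here is a small numerical check in the style of equilibrium propagation, reusing the same assumed two-spring toy system from the sketch above (again, all names and the setup are illustrative, not the cited works' implementations): the per-spring contrastive estimate, built only from local extensions in the free and nudged states, is compared against a finite-difference gradient of the global cost.

```python
# Sketch: verify numerically that a contrastive (equilibrium-propagation
# style) estimate matches the true cost gradient as the nudge beta -> 0.
# The two-spring toy system and all names are assumptions for illustration.
import numpy as np

L, u_t = 1.0, 0.7
k = np.array([1.3, 0.8])

def u_star(k, beta=0.0):
    """Minimizer of E + beta*C, with E = 0.5*k1*u^2 + 0.5*k2*(L-u)^2
    and cost C = 0.5*(u - u_t)^2."""
    return (k[1] * L + beta * u_t) / (k[0] + k[1] + beta)

def cost(k):
    return 0.5 * (u_star(k) - u_t) ** 2

def contrastive_grad(k, beta=1e-4):
    """(1/beta) * [dE/dk at the nudged state - dE/dk at the free state];
    dE/dk_i = 0.5 * extension_i^2, a local quantity per spring."""
    eps0 = np.array([u_star(k), L - u_star(k)])
    epsb = np.array([u_star(k, beta), L - u_star(k, beta)])
    return (epsb**2 - eps0**2) / (2 * beta)

def fd_grad(k, h=1e-6):
    """Finite-difference gradient of the global cost, for comparison."""
    g = np.zeros_like(k)
    for i in range(len(k)):
        kp, km = k.copy(), k.copy()
        kp[i] += h; km[i] -= h
        g[i] = (cost(kp) - cost(km)) / (2 * h)
    return g

print("contrastive:", contrastive_grad(k))
print("finite diff:", fd_grad(k))
```

The two printed gradients agree to within the nudge amplitude, which is the sense in which these local contrastive rules perform approximate gradient descent: each bond's update needs only quantities measurable at that bond, yet collectively the updates follow the gradient of the global cost.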