2020
DOI: 10.3390/e22030256

Thermodynamic Neural Network

Abstract: This work describes a thermodynamically motivated neural network model that self-organizes to transport charge associated with internal and external potentials while in contact with a thermal bath. Isolated networks show multiscale dynamics and evidence of phase transitions, and externally driven networks evolve to efficiently connect external positive and negative potentials. The model implements techniques for rapid, global, reversible, conservative equilibration of node states followed by slow, local, irrev…
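The two-timescale dynamics the abstract describes — fast, reversible equilibration of node states at fixed couplings, followed by slow, local, irreversible adaptation of the couplings — can be illustrated with a toy sketch. This is not the paper's model: the network size, the Glauber-style heat-bath updates, the Hebbian-like coupling rule, and all parameter values below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy network: N binary node states s in {-1, +1}, symmetric couplings W.
N = 8
W = rng.normal(0.0, 0.5, (N, N))
W = (W + W.T) / 2
np.fill_diagonal(W, 0.0)
beta = 2.0  # inverse temperature of the thermal bath

def energy(s, W):
    # Ising-like energy of the node configuration.
    return -0.5 * s @ W @ s

def equilibrate(s, W, sweeps=50):
    # Fast phase: reversible heat-bath (Glauber) updates at fixed W.
    for _ in range(sweeps):
        for i in range(N):
            h = W[i] @ s  # local field on node i
            p_up = 1.0 / (1.0 + np.exp(-2.0 * beta * h))
            s[i] = 1.0 if rng.random() < p_up else -1.0
    return s

def adapt(W, s, eta=0.01):
    # Slow phase: local, irreversible coupling update (Hebbian-like),
    # reinforcing the equilibrated configuration.
    W = W + eta * np.outer(s, s)
    np.fill_diagonal(W, 0.0)
    return (W + W.T) / 2

s = rng.choice([-1.0, 1.0], size=N)
e0 = energy(s, W)
for _ in range(20):
    s = equilibrate(s, W)   # rapid, global, conservative equilibration
    W = adapt(W, s)         # slow, local, irreversible adaptation
print(f"final energy: {energy(s, W):.3f}")
```

Alternating the two phases lets the slow variables (couplings) adapt to statistics produced by the fast variables (states), which is the generic structure the abstract points at, not the specific charge-transport mechanism of the paper.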

Cited by 11 publications (17 citation statements); references 31 publications.
“…It is no coincidence that these algorithms share strong similarities with Machine Learning. The formal analogy between thermodynamics and learning has a long history (Gori, Maggini et al 2016; Goldt and Seifert 2017; Hylton 2020). The literature describes several models that link thermodynamics to the conservation of information (the first law) and the relative entropy decrease (second law).…”
Section: Introduction
confidence: 99%
“…The literature describes several models that link thermodynamics to the conservation of information (the first law) and the relative entropy decrease (second law). These models range from disordered Ising spin-glass lattices to Markovian processes and Neural Networks (Hylton 2020). In the latter, kinetic and potential energy concepts interpret learning as a dissipation-driven adaptation mechanism.…”
Section: Introduction
confidence: 99%