2023
DOI: 10.1016/j.physrep.2023.07.001
The free energy principle made simpler but not too simple

Cited by 55 publications (13 citation statements)
References 120 publications
“…To formulate and solve the problem of the evolutionarily optimal simplest neural network, it is convenient to use the toolkit for the analysis of stochastic equations of the form (1), developed in the context of nonequilibrium statistical physics and also previously applied to the analysis of functioning neural networks (but not their evolutionary optimization). 18,27,28,32,33,35-42…”
Section: Methods (mentioning, confidence: 99%)
“…Recently, a powerful technique to analyze such dynamic systems has been proposed, inspired by methods from nonequilibrium statistical physics. [4-16] Instead of considering individual stochastic trajectories that can be obtained from equations (1)-(4) by numerical integration, consider an ensemble of such systems characterized by a probability distribution function. In general, this function should be time-dependent; however, in many cases, a stationary probability distribution function Pstat(x,r) exists.…”
Section: Model Definition (mentioning, confidence: 99%)
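The ensemble view described in the quoted passage can be illustrated with a short numerical sketch: integrate many noisy trajectories of a toy two-variable system and approximate the stationary distribution Pstat(x,r) from the late-time ensemble histogram. The drift terms, noise amplitudes, and parameter values below are illustrative assumptions, not the cited paper's equations (1)-(4).

```python
# Minimal sketch: estimate a stationary probability distribution from an
# ensemble of stochastic trajectories instead of inspecting single runs.
# Drift functions and noise levels are placeholder assumptions.
import numpy as np

rng = np.random.default_rng(0)

def drift(x, r):
    # Hypothetical coupled dynamics for a state variable x and a variable r.
    dx = -x + np.tanh(r)
    dr = -0.5 * r - 0.1 * x
    return dx, dr

n_systems = 5000            # size of the ensemble
dt, n_steps = 1e-2, 5000    # Euler-Maruyama step size and horizon
sigma_x, sigma_r = 0.3, 0.2 # assumed noise amplitudes

x = rng.normal(size=n_systems)
r = rng.normal(size=n_systems)

for _ in range(n_steps):
    dx, dr = drift(x, r)
    x += dx * dt + sigma_x * np.sqrt(dt) * rng.normal(size=n_systems)
    r += dr * dt + sigma_r * np.sqrt(dt) * rng.normal(size=n_systems)

# After a long transient, the ensemble histogram approximates P_stat(x, r).
P_stat, x_edges, r_edges = np.histogram2d(x, r, bins=50, density=True)
print(P_stat.shape, P_stat.max())
```

Here the late-time histogram over the ensemble plays the role that the analytical stationary distribution plays in the formalism the quoted passage refers to.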
“…The formalism outlined in the previous subsection is commonly used in the literature to analyze neural network dynamics, given that the functions fi (and thus, u and Q) are predetermined. [4-16] In this work, we use this formalism, for the first time to the best of our knowledge, to address a new problem, namely, the problem of evolutionary optimization of neural networks. A traditional approach to this problem would be to define the specific forms for f1 and f2, for example, as in equations (2) and (3), integrate them to get an ensemble of trajectories, and then calculate the evolutionary fitness as defined in a given model as a temporal average over all trajectories.…”
Section: Model Definition (mentioning, confidence: 99%)
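As a sketch of the "traditional approach" the quoted passage contrasts itself with, the toy code below picks explicit forms for f1 and f2, integrates an ensemble of noisy trajectories, and scores evolutionary fitness as a temporal average over all trajectories. The functions f1, f2, the payoff, and all parameters are hypothetical stand-ins, not the cited paper's equations (2) and (3) or its fitness definition.

```python
# Minimal sketch: trajectory-based estimate of evolutionary fitness as a
# temporal average over an ensemble of noisy trajectories.
# All functional forms and parameters are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)

def f1(x1, x2, w):
    return -x1 + w * np.tanh(x2)        # hypothetical dynamics of unit 1

def f2(x1, x2, w):
    return -x2 + w * np.tanh(x1)        # hypothetical dynamics of unit 2

def payoff(x1, x2):
    return -(x1 - 1.0) ** 2 - x2 ** 2   # hypothetical instantaneous payoff

def fitness(w, n_traj=200, n_steps=5000, dt=1e-2, sigma=0.2, burn_in=1000):
    x1 = rng.normal(size=n_traj)
    x2 = rng.normal(size=n_traj)
    total, count = 0.0, 0
    for step in range(n_steps):
        d1 = f1(x1, x2, w)
        d2 = f2(x1, x2, w)
        x1 = x1 + d1 * dt + sigma * np.sqrt(dt) * rng.normal(size=n_traj)
        x2 = x2 + d2 * dt + sigma * np.sqrt(dt) * rng.normal(size=n_traj)
        if step >= burn_in:             # temporal average over all trajectories
            total += payoff(x1, x2).sum()
            count += n_traj
    return total / count

print(fitness(w=0.8))
```

Maximizing such a Monte Carlo estimate over parameters is the brute-force, trajectory-based route; the quoted work instead evaluates the optimization problem through the stationary-distribution formalism.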
“…The notion that the brain is a predictive or generative machine has been formalized in neuroscience as predictive coding and active inference (or the free-energy principle) [8-12].…”
Section: Introduction (mentioning, confidence: 99%)