In vitro neural networks minimise variational free energy
2018
DOI: 10.1038/s41598-018-35221-w

Abstract: In this work, we address the neuronal encoding problem from a Bayesian perspective. Specifically, we ask whether neuronal responses in an in vitro neuronal network are consistent with ideal Bayesian observer responses under the free energy principle. In brief, we stimulated an in vitro cortical cell culture with stimulus trains that had a known statistical structure. We then asked whether recorded neuronal responses were consistent with variational message passing based upon free energy minimisation (i.e., evi…
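The abstract's key quantity can be stated explicitly. As a point of reference (this is the standard definition used in the free energy principle literature, not a reproduction of the paper's specific generative model for the stimulus trains), with observations o, hidden states or causes s, approximate posterior q(s) and generative model p(o, s):

F \;=\; \mathbb{E}_{q(s)}\!\left[\ln q(s) - \ln p(o, s)\right] \;=\; \underbrace{D_{\mathrm{KL}}\!\left[q(s) \,\Vert\, p(s \mid o)\right]}_{\ge 0} \;-\; \ln p(o)

Because the KL term is non-negative, F upper-bounds the negative log evidence -\ln p(o); minimising F therefore pulls q(s) towards the true posterior over hidden causes while implicitly maximising model evidence, which is the sense in which recorded responses can be compared with ideal Bayesian observer responses.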

Cited by 44 publications (49 citation statements). References 55 publications.
“…How this plays out in psychopathology are main themes of this article. Much of our focus will be on what Friston and collaborators call “structure learning” (Tervo et al., 2016; Friston et al., 2017; Gershman, 2017; Isomura and Friston, 2018), namely, learning the repertoire or narratives that constitute our prior beliefs–or hypotheses–about how our world works, and how these might be influenced therapeutically. Although the FEP applies to these structural priors, getting them right can be a tricky business.…”
Section: Introduction (mentioning)
confidence: 99%
“…Importantly, the EGHR only requires such a signal that conveys global information to neurons to achieve learning. Furthermore, a study using in vitro neural networks suggested that neurons perform simple BSS using a plasticity rule that is different from the most basic form of Hebbian plasticity, by which synaptic strengths are updated purely as a product of pre- and postsynaptic activity [75,76]. A candidate implementation of the EGHR can be made for cortical pyramidal cells and inhibitory neurons; the former constituting the EGHR output neurons and encoding the expectations of hidden sources, and the latter constituting the third scalar factor and calculating the nonlinear sum of activity in surrounding pyramidal cells.…”
Section: Discussion (mentioning)
confidence: 99%
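This statement describes a three-factor update: a single global scalar, computed from the whole population, gates an otherwise local, Hebbian-like product of pre- and postsynaptic terms to achieve blind source separation. The toy sketch below illustrates that general shape only; the variable names, the choice of nonlinearity and the constants are assumptions made for illustration, not the exact EGHR equations of the cited work.

import numpy as np

# Toy sketch of a three-factor, Hebbian-like blind source separation rule:
# the weight update is the product of (i) a global scalar (E0 - E) shared
# by all neurons, (ii) a postsynaptic nonlinearity g(u), and (iii)
# presynaptic activity x. All names and constants are illustrative.

rng = np.random.default_rng(0)

def G(u):
    # Per-neuron "energy"; log-cosh is a common choice for sparse sources.
    return np.log(np.cosh(u))

def g(u):
    # Derivative of G, used as the postsynaptic factor.
    return np.tanh(u)

n_sources, n_inputs, T = 2, 10, 20000
S = rng.laplace(size=(T, n_sources))        # hidden sources
A = rng.normal(size=(n_inputs, n_sources))  # unknown mixing matrix
X = S @ A.T                                 # observed mixtures (inputs)

W = 0.1 * rng.normal(size=(n_sources, n_inputs))  # synaptic weights
eta, E0 = 1e-4, 2.0                               # learning rate, target energy

for x in X:
    u = W @ x                                 # postsynaptic responses
    E = G(u).sum()                            # global scalar factor
    W += eta * (E0 - E) * np.outer(g(u), x)   # three-factor update

# After learning, u = W @ x should recover the sources up to permutation
# and scaling (the usual BSS ambiguity); convergence depends on eta and E0.

The point of contrast with a purely Hebbian rule is the global factor (E0 - E): it is not a product of local pre- and postsynaptic activity, which is why the quoted statement assigns it to a separate population (inhibitory neurons computing a nonlinear sum over surrounding pyramidal cells).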
“…From the formal perspective of minimizing free energy – or maximizing model evidence, the hierarchical assembly of parsimonious models of the (prosocial) world through our development can be considered in the light of Bayesian model selection or what is called “structure learning” (Gershman and Niv, 2010; Tervo et al, 2016; Isomura and Friston, 2018). In other words, one can build more comprehensive (deep) generative models that have greater evidence (i.e., accuracy minus complexity) by adding layers or rearranging part objects into more complex (or deeper) wholes.…”
Section: The Emergence Of a Dominant Platonic Person (mentioning)
confidence: 99%
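The "accuracy minus complexity" gloss on model evidence in this statement is the standard evidence bound. A minimal statement in the same notation as above, where m indexes a candidate model or structure:

\ln p(o \mid m) \;\ge\; \underbrace{\mathbb{E}_{q(s)}\!\left[\ln p(o \mid s, m)\right]}_{\text{accuracy}} \;-\; \underbrace{D_{\mathrm{KL}}\!\left[q(s) \,\Vert\, p(s \mid m)\right]}_{\text{complexity}}

Under Bayesian model selection (or structure learning), adding layers or rearranging part objects into deeper wholes is warranted only when the gain in accuracy exceeds the extra complexity incurred by the richer model.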