2009
DOI: 10.1142/s1793005709001209
Attractors in Song

Abstract: This paper summarizes our recent attempts to integrate action and perception within a single optimization framework. We start with a statistical formulation of Helmholtz's ideas about neural energy to furnish a model of perceptual inference and learning that can explain a remarkable range of neurobiological facts. Using constructs from statistical physics it can be shown that the problems of inferring the causes of our sensory inputs and learning regularities in the sensorium can be resolved using exactly the …

Cited by 9 publications (12 citation statements); references 41 publications.
“…The effects of stimulus probability, on the other hand, might rely on hierarchically higher expectations about the sequence structure or likelihood of stimuli, induced by learning of the statistical regularities of the sequence. This pattern of results is consistent with a cascade of prediction errors that update predictions at progressively higher levels of the processing hierarchy, as reflected in the hierarchically distinct generators of the early and late components of the evoked response ( Friston and Kiebel, 2009 , Garrido et al., 2009b ). Interestingly, at even later latencies (200–500 msec) repetition and stimulus probability showed an interaction effect, replicating the modulatory effects of expectation observed in fMRI.…”
Section: Empirical Studies Of Repetition Suppression In the Context O… (supporting)
confidence: 76%
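The cascade described in this statement — ascending prediction errors updating estimates at a higher level of the hierarchy — can be sketched in a few lines. This is an illustrative two-level linear model, not the authors' generative model; the mapping `W`, the learning rate, and the function name are all assumptions for illustration.

```python
import numpy as np

def two_level_update(x, mu2, W, lr=0.05, steps=100):
    """Infer a level-2 cause mu2 from a level-1 observation x.

    W (assumed known here) maps the level-2 estimate to a
    level-1 prediction.
    """
    for _ in range(steps):
        eps1 = x - W @ mu2             # level-1 prediction error (ascends)
        mu2 = mu2 + lr * (W.T @ eps1)  # gradient step on squared error
    return mu2

rng = np.random.default_rng(0)
W = rng.standard_normal((4, 2))   # fixed level-2 -> level-1 mapping
x = W @ np.array([1.0, -0.5])     # observation generated by a true cause
mu2 = two_level_update(x, np.zeros(2), W)
```

Repetition suppression falls out naturally in such schemes: once `mu2` matches the cause of a repeated stimulus, the ascending error `eps1` — and with it the evoked response it is taken to model — shrinks.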
“…Many known characteristics of repetition suppression emerge in simulations of predictive coding. This has been previously shown in a formal model of an artificial brain perceiving sequences of sensory events, simulated using attractor dynamics ( Friston & Kiebel, 2009 ) ( Fig. 3 ).…”
Section: Models Of Repetition Suppression Based On Predictive Coding (supporting)
confidence: 63%
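The attractor dynamics at issue can be illustrated with a classical Lorenz system integrated by forward Euler — a minimal sketch of an autonomous dynamical system generating structured, recurrent trajectories. The parameter values are the textbook chaotic regime and the simple integrator is an assumption for illustration; the paper's hierarchical setup is considerably richer.

```python
import numpy as np

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz system."""
    x, y, z = state
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return state + dt * np.array([dx, dy, dz])

state = np.array([1.0, 1.0, 1.0])
trajectory = [state]
for _ in range(2000):           # 20 time units of dynamics
    state = lorenz_step(state)
    trajectory.append(state)
trajectory = np.array(trajectory)
```

A trajectory like this can serve as the hidden dynamics that generate a sensory sequence; perception then amounts to inverting the mapping from attractor states to sensations.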
“…Although this represents a considerable challenge using conventional paradigms and neurophysiological techniques, methodological advances in high-resolution fMRI, optogenetics, calcium imaging, and serial single-unit recordings at multiple levels of the processing hierarchy are providing powerful new opportunities to trace neural markers of hierarchical PP dynamics. 41,128,158,196,209,210 Pairing these increasingly sophisticated neural assays with anatomical models, computational modeling, and simulations 137,211 will enable researchers to derive fine-grained a priori hypotheses and compare model evidence for variant architectures and also for near-variant ones that share much with the core PP picture but differ in their conceptions of the encoding, flow, or use of prediction errors (e.g., Refs. 2, 10 and 11; see Ref.
Section: Discussion (mentioning)
confidence: 99%
“…This representation is not motivated by sparseness, but by computational efficiency: it replaces the problem of computing the (potentially very high-dimensional) posterior probability density with the simpler problem of optimizing the free energy with respect to a small set of sufficient statistics. This variational Bayesian optimization rests on free-energy minimization [37] and proposes the minimization of prediction error as an explanation for stimulus-evoked transient neuronal responses such as the MMN [3], [63], [81]. The work presented in this paper is a step towards linking models of probabilistic neural coding and inference to neuronal signals that can be measured non-invasively in humans.…”
Section: Discussion (mentioning)
confidence: 99%
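The idea of trading a full posterior density for a handful of sufficient statistics can be made concrete with a toy conjugate model. This is a sketch under stated assumptions, not the cited paper's scheme: for x ~ N(theta, s2) with prior theta ~ N(m0, t2), the approximate posterior q(theta) is summarized by just a mean mu and variance v, found by gradient descent on the (here analytic) free energy. All names and numerical values are illustrative.

```python
import numpy as np

def free_energy_grads(mu, log_v, x, s2, t2, m0=0.0):
    """Gradients of the variational free energy for a Gaussian q(theta).

    F(mu, v) = 0.5*((x-mu)**2 + v)/s2 + 0.5*((mu-m0)**2 + v)/t2
               - 0.5*log(v) + const.
    """
    v = np.exp(log_v)
    d_mu = (mu - x) / s2 + (mu - m0) / t2    # dF/dmu
    d_v = 0.5 / s2 + 0.5 / t2 - 0.5 / v      # dF/dv
    return d_mu, d_v * v                     # chain rule: dF/dlog_v

x, s2, t2, m0 = 2.0, 1.0, 4.0, 0.0
mu, log_v = 0.0, 0.0                         # initial sufficient statistics
for _ in range(500):
    g_mu, g_lv = free_energy_grads(mu, log_v, x, s2, t2, m0)
    mu -= 0.05 * g_mu
    log_v -= 0.05 * g_lv
```

Because this model is conjugate, the free-energy minimum coincides with the exact posterior: mu approaches (x/s2 + m0/t2)/(1/s2 + 1/t2) = 1.6 and v approaches 1/(1/s2 + 1/t2) = 0.8 — two numbers in place of a density.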