2022
DOI: 10.1101/2022.06.23.497415
Preprint

Dynamic Predictive Coding: A Model of Hierarchical Sequence Learning and Prediction in the Neocortex

Abstract: We introduce dynamic predictive coding, a new hierarchical model of spatiotemporal prediction and sequence learning in the cortex. The model assumes that higher cortical levels modulate the temporal dynamics of lower levels, correcting their predictions of dynamics using precision-weighted prediction errors. We tested this model using a two-level neural network, where the top-down modulation is implemented as a low-dimensional mixture of possible temporal dynamics. When trained on natural videos, the first-lev…
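For orientation, here is a minimal, hypothetical sketch of the mechanism the abstract describes: a higher-level state sets low-dimensional mixture weights over a small set of transition matrices, and the resulting mixture drives the lower-level state forward in time. All names, shapes, and nonlinearities below are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of "top-down modulation as a low-dimensional mixture of
# possible temporal dynamics". Shapes and nonlinearities are assumptions.
import numpy as np

rng = np.random.default_rng(0)
state_dim, higher_dim, n_dynamics = 32, 8, 4

# V: small basis of candidate transition matrices for the lower level.
# W: maps the higher-level state to mixture weights over that basis.
V = rng.normal(size=(n_dynamics, state_dim, state_dim)) * 0.1
W = rng.normal(size=(higher_dim, n_dynamics)) * 0.1

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def step_lower_state(s_lower, s_higher):
    """Advance the lower-level state one step under top-down modulated dynamics."""
    w = softmax(s_higher @ W)            # low-dimensional mixture weights
    A = np.tensordot(w, V, axes=1)       # weighted mixture of transition matrices
    return np.tanh(A @ s_lower)

s_lower = rng.normal(size=state_dim)
s_higher = rng.normal(size=higher_dim)
s_next = step_lower_state(s_lower, s_higher)
```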

Cited by 15 publications (38 citation statements)
References 122 publications (355 reference statements)
“…The lower-level state neurons (at level i) maintain an estimate of the lower-level state s^i_{t,τ}, where τ denotes a time step at the lower level within the higher-level time interval indexed by t. In the two-level network, this lower-level state makes a prediction of the input via a "decoder" network D. In the simplest case where D is a linear matrix U, this lowest level of APC is equivalent to the generative model used in sparse coding (I = Us, where s is sparse [49]). More generally, D can be a 1-layer ReLU network [48] or a multi-layer decoder [11,12].…”
Section: Modulation Of State Network By Feedback To Model Complex Env… (mentioning)
confidence: 99%
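As a concrete illustration of the decoder options mentioned in the quoted passage, the sketch below contrasts a linear decoder I = Us (the sparse-coding case) with a 1-layer ReLU decoder. Dimensions, sparsity level, and initialization are assumptions made here for illustration.

```python
# Sketch of the two decoder choices described above (illustrative shapes).
import numpy as np

rng = np.random.default_rng(0)
input_dim, state_dim = 256, 64

# a sparse lower-level state vector (about 10% of units active, illustrative)
s = rng.normal(size=state_dim) * (rng.random(state_dim) < 0.1)

# (a) linear decoder: the sparse-coding case I = U s
U = rng.normal(size=(input_dim, state_dim)) * 0.1
I_linear = U @ s

# (b) 1-layer ReLU decoder
W1 = rng.normal(size=(input_dim, state_dim)) * 0.1
b1 = np.zeros(input_dim)
I_relu = np.maximum(0.0, W1 @ s + b1)
```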
“…This episodic memory can later be retrieved when given an internal or external cue, e.g., a partial input corresponding to the beginning of the episodic sequence. Figure 10B (from [48]) provides an example of such recall by a two-level memory-augmented dynamic predictive coding model (APC without actions), which was shown 5 episodes of a sequence depicting the digit "5" moving from left to right (top panel in Figure 10B). By storing the sequence information as a vector m in its associative memory, the model was later able to retrieve the entire sequence given only the starting image as a cue (Figure 10B, lower panels): this is reflected in the similarity between the activation pattern of the network's lower-level neurons during recall ("Start" condition) and the activation pattern observed during training ("Conditioning").…”
Section: Cortical Predictive Coding, Hippocampal Binding, and Episodic … (mentioning)
confidence: 99%
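The recall behaviour described above can be illustrated with a generic outer-product (Hebbian) associative memory that stores transitions between successive activity patterns and replays a sequence from its first pattern. This is a simplified stand-in, not the paper's specific memory module that stores the sequence as a vector m.

```python
# Generic heteroassociative memory sketch: store transitions, replay from a cue.
import numpy as np

rng = np.random.default_rng(0)
dim, seq_len = 128, 5

# toy "lower-level activity" sequence (e.g., responses to a moving digit)
seq = [np.sign(rng.normal(size=dim)) for _ in range(seq_len)]

M = np.zeros((dim, dim))
for prev, nxt in zip(seq[:-1], seq[1:]):
    M += np.outer(nxt, prev) / dim          # Hebbian outer-product storage

# recall: cue with the starting pattern only, then iterate the memory
recalled = [seq[0]]
for _ in range(seq_len - 1):
    recalled.append(np.sign(M @ recalled[-1]))

# overlap between recalled and stored patterns (close to 1.0 for each step)
print([np.mean(r == s) for r, s in zip(recalled, seq)])
```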
“…To acquire transformation-tolerance from temporal continuity, input sequences are required. Most predictive coding models so far, however, either operate on static inputs (5,18,41) or use non-local learning rules (42) such as backpropagation (43, 44) and biologically implausible LSTM units (16, 45). Here, we train multilayered predictive coding networks on transformation sequences with purely Hebbian learning (based on (5) and (18)).…”
Section: Introduction (mentioning)
confidence: 99%
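To make the contrast with backpropagation concrete, the sketch below shows a purely local, Hebbian-style predictive coding update in the spirit of Rao & Ballard: latent activity is inferred by gradient steps on the prediction error, and the weight change is the outer product of error and activity, so no error signal is propagated across layers. Learning rates, shapes, and the single-layer setting are assumptions for illustration.

```python
# Local, Hebbian-style predictive coding update (illustrative single layer).
import numpy as np

rng = np.random.default_rng(0)
input_dim, latent_dim = 100, 20

U = rng.normal(size=(input_dim, latent_dim)) * 0.1
lr_r, lr_U = 0.1, 0.01

def pc_step(I, U, n_inference=20):
    """Infer latent r for input I, then apply one Hebbian weight update."""
    r = np.zeros(latent_dim)
    for _ in range(n_inference):
        err = I - U @ r                       # prediction error
        r += lr_r * (U.T @ err)               # local inference update on activity
    U_new = U + lr_U * np.outer(I - U @ r, r) # Hebbian: error x activity
    return r, U_new

I = rng.normal(size=input_dim)
r, U = pc_step(I, U)
```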