2022
DOI: 10.1101/2022.05.19.492731
Preprint

Predictive Sequence Learning in the Hippocampal Formation

Abstract: The interaction between hippocampus and cortex is key to memory formation and representation learning. Based on anatomical wiring and transmission delays, we proposed a self-supervised recurrent neural network (PredRAE) with a predictive reconstruction loss to account for the cognitive functions of hippocampus. This framework extends predictive coding in the time axis and incorporates predictive features in Bayes filters for temporal prediction. In simulations, we were able to reproduce characteristic place ce…

Cited by 1 publication (1 citation statement)
References 97 publications
“…This past work has shown that: (1) Aspects of spatial tuning in the hippocampus are well-explained by predictive representations 11,55 , which can be learned from spatially tuned inputs with predictive Hebbian learning 79 , TD learning 80 , or spike-timing dependent plasticity 80,81 ; (2) Learning to predict memory embeddings from actions can link allocentric and egocentric representations 82 ; (3) Prediction can link spatially tuned inputs and relational structures 83 ; (4) Cloned hidden-Markov models that use discrete states to predict observations from actions can recapitulate many of the features of the hippocampus and support offline evaluations 84,85 ; (5) Engaging in path integration (i.e. predicting spatial locations from sequences of actions) leads to the emergence of grid cells and continuous attractor dynamics 48,50 ; (6) Training recurrent networks to predict hippocampal spiking data or linear place cell sequences can reproduce various features of hippocampal activity, including spike cross-correlations and sequence replay 86 . Our work demonstrates that these findings can be brought together under the umbrella of sequential predictive learning of sensory data in RNNs, and applied to ego-centric, high-dimensional, continuous sensory inputs.…”
Section: Discussion
confidence: 99%
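Point (1) of the quoted passage notes that hippocampal spatial tuning is well captured by predictive representations learnable via TD learning. A minimal sketch of that idea (not the cited paper's model, and a toy environment of our own choosing) is tabular TD learning of a successor representation (SR) on a linear track, where each row of the learned matrix comes to encode expected discounted future state occupancy:

```python
import numpy as np

# Minimal sketch, assuming a toy 1-D track with deterministic rightward
# movement; this is an illustration of TD-learned predictive (successor)
# representations, not the PredRAE model itself.
n_states = 10         # states on the linear track (assumed toy size)
gamma = 0.9           # discount factor
alpha = 0.1           # learning rate
M = np.eye(n_states)  # SR initialized to identity (self-occupancy only)

rng = np.random.default_rng(0)
for _ in range(5000):
    s = rng.integers(0, n_states - 1)
    s_next = s + 1                    # deterministic step to the right
    onehot = np.eye(n_states)[s]
    # TD update toward the SR fixed point M(s) = 1(s) + gamma * M(s')
    M[s] += alpha * (onehot + gamma * M[s_next] - M[s])

# Each row of M now decays geometrically over upcoming states,
# i.e. predictive, place-field-like tuning skewed toward the future.
```

Row `M[0]` converges toward `(1, 0.9, 0.81, ...)`, the geometric discounting over future states that gives SR-based accounts their characteristic forward-skewed place fields.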