2023
DOI: 10.1101/2023.01.19.524711
Preprint

A Generative Model of Memory Construction and Consolidation

Abstract: Human episodic memories are (re)constructed, combining unique features with schema-based predictions, and share neural substrates with imagination. They also show systematic schema-based distortions that increase with consolidation. Here we present a computational model in which hippocampal replay (from an autoassociative network) trains generative models (variational autoencoders) in neocortex to (re)create sensory experiences via latent variable representations. Simulations using large image datasets reflect…
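The abstract's central mechanism is a variational autoencoder that (re)creates sensory experiences from compressed latent variables. The following is a minimal numpy sketch of that latent-variable step (encode, reparameterized sample, decode); the dimensions and random, untrained weights are illustrative assumptions, not details from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(8)            # a toy "sensory experience" (8 features)

# Illustrative untrained weights: encoder to a 2-d latent space, decoder back to 8-d.
W_mu = rng.standard_normal((2, 8))
W_logvar = rng.standard_normal((2, 8))
W_dec = rng.standard_normal((8, 2))

mu, logvar = W_mu @ x, W_logvar @ x   # encode: parameters of a latent Gaussian
eps = rng.standard_normal(mu.shape)
z = mu + np.exp(0.5 * logvar) * eps   # reparameterization: z = mu + sigma * eps
x_hat = W_dec @ z                     # decode: (re)construct the input from the latent code
```

In a trained model, `x_hat` would approximate `x`, and sampling different `z` values from the latent prior would generate novel, schema-consistent experiences.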


Cited by 4 publications (6 citation statements)
References 141 publications
“…There exists a range of computational models that simulate replay at different levels of biological detail [91, 92, 19, 93, 94], account for different features of replay [51, 43, 57, 63, 19, 68], and posit distinct functions for replay [95, 96, 63, 19, 60, 97, 66]. Our theory follows in a lineage of memory-focused replay models, showing the power of this perspective in accounting for data that have been assumed to require optimization of value-based predictions.…”
Section: Discussion
confidence: 99%
“…An alternative explanation for the memory errors that we found is that they are partially due to retrieval mechanisms. Under the assumption that memory retrieval is reconstructive (Bartlett, 1932; Schacter, 2012) and reflects a combination of event-specific details as well as more schematic, gist-like information (Bromis et al., 2021; Spens & Burgess, 2023), plausible endings for the videos in the incomplete condition might be generated on the fly during the recall tests. While the specific content of the falsely recalled endings would be based on familiarity with the particular situations depicted in each video, the over-arching driver of the memory errors would be the expectation that events typically have coherent endpoints.…”
Section: Discussion
confidence: 99%
“…Another approach has been a hybrid one [71], using a relatively standard, biologically plausible neuronal-network model of the hippocampus that includes an autoassociation network [3, 15, 17], implemented with a modern Hopfield network [72], but combining this with generative models (variational autoencoders) for the connectivity between the neocortex and the hippocampus. A problem again with this approach is that the generative autoencoder requires deep learning, and it is opaque exactly what is computed at different levels of the network [2, 73].…”
Section: AI-based Approaches To Understanding Hippocampal Memory Func...
confidence: 99%
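The hybrid model quoted above pairs VAEs with a modern Hopfield network as the hippocampal autoassociative store. Retrieval in a modern Hopfield network amounts to softmax attention over stored patterns. The following is a minimal numpy sketch of that retrieval dynamic; the stored patterns, cue, and inverse temperature `beta` are illustrative assumptions:

```python
import numpy as np

def hopfield_retrieve(patterns, probe, beta=8.0, steps=3):
    """Modern Hopfield retrieval: repeated softmax attention over stored memories.

    patterns: (N, d) array, one stored memory per row.
    probe:    (d,) partial or noisy cue.
    Returns the state after `steps` updates, which converges toward the
    stored pattern most similar to the cue (pattern completion).
    """
    xi = probe.astype(float)
    for _ in range(steps):
        scores = beta * (patterns @ xi)        # similarity to each stored memory
        weights = np.exp(scores - scores.max())
        weights /= weights.sum()               # softmax over memories
        xi = weights @ patterns                # attention-weighted recombination
    return xi

# Two stored memories; cue is a corrupted copy of the first (last element zeroed).
patterns = np.array([[1., 1., -1., -1.],
                     [-1., 1., -1., 1.]])
probe = np.array([1., 1., -1., 0.])
retrieved = hopfield_retrieve(patterns, probe)
```

With a sufficiently large `beta`, the update collapses onto the single best-matching memory, which is what makes this formulation behave like a classical autoassociative (pattern-completing) store.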