2020
DOI: 10.1523/eneuro.0374-19.2020

An Indexing Theory for Working Memory Based on Fast Hebbian Plasticity

Abstract: Working memory (WM) is a key component of human memory and cognition. Computational models have been used to study the underlying neural mechanisms, but neglected the important role of short-term memory (STM) and long-term memory (LTM) interactions for WM. Here, we investigate these using a novel multiarea spiking neural network model of prefrontal cortex (PFC) and two parietotemporal cortical areas based on macaque data. We propose a WM indexing theory that explains how PFC could associate, maintain, and upda…


Cited by 30 publications (16 citation statements)
References 104 publications (153 reference statements)
“…The network model used here features two reciprocally connected networks, the so-called Item and Context networks. The architecture of each network follows our previous spiking implementations of attractor memory networks (Lansner, 2009; Tully et al, 2014, 2016; Lundqvist et al, 2011; Fiebig and Lansner, 2017; Chrysanthidis et al, 2019; Fiebig et al, 2020), and is best understood as a subsampled cortical layer 2/3 patch with nested hypercolumns (HCs) and minicolumns (MCs; Fig. 1A, see STAR⋆METHODS for details).…”
Section: Results
confidence: 99%
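The excerpt above describes a modular attractor network of nested hypercolumns (HCs), each containing competing minicolumns (MCs). In such models, each HC typically normalizes activity among its MC units via a soft winner-take-all, so competition is local to each module. A minimal rate-based sketch of that normalization (the function name, shapes, and the softmax choice are illustrative assumptions, not code from the paper):

```python
import numpy as np

def hypercolumn_softmax(support, n_hc, n_mc):
    """Soft winner-take-all within each hypercolumn (HC): the n_mc
    minicolumn (MC) activities inside every HC are normalized to sum
    to 1, so MCs compete locally within their HC, not globally."""
    s = np.asarray(support, dtype=float).reshape(n_hc, n_mc)
    e = np.exp(s - s.max(axis=1, keepdims=True))  # numerically stable softmax
    return (e / e.sum(axis=1, keepdims=True)).ravel()

# Two HCs with three MCs each: the first HC has a clear winner,
# the second receives uniform support.
act = hypercolumn_softmax([2.0, 0.0, 0.0, 1.0, 1.0, 1.0], n_hc=2, n_mc=3)
```

Because normalization is per hypercolumn, a strongly driven MC suppresses only its local competitors, which is what lets distributed patterns (one active MC per HC) form stable attractors.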
“…Recently we demonstrated that the same model, enhanced with a Bayesian-Hebbian learning rule (Bayesian Confidence Propagation Neural Network, BCPNN) to model synaptic and intrinsic plasticity, was able to quantitatively reproduce key behavioral observations from human word-list learning experiments (Fiebig and Lansner, 2017), such as serial order effects during immediate recall. This model performed one-shot memory encoding and was further expanded into a two-area cortical model used to explore a novel indexing theory of working memory, based on fast Hebbian synaptic plasticity (Fiebig et al, 2020). In this context, it was suggested that the underlying naive Bayes view of association would make the associative binding between two items weaker if one of them is later associated with additional items.…”
Section: Introduction
confidence: 99%
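The BCPNN rule mentioned in the excerpt above derives a weight from a naive Bayes view of association: the weight between units i and j is the log-odds ratio log(p_ij / (p_i · p_j)) of their estimated coactivation versus independent activation, with the probabilities tracked by exponentially decaying traces. A minimal rate-based sketch under those assumptions (the trace constant `alpha` and regularizer `eps` are illustrative; this is not the paper's full spiking implementation):

```python
import numpy as np

def bcpnn_traces(pre, post, alpha=0.05, eps=1e-4):
    """Estimate a BCPNN weight and bias from paired binary activations.
    Exponential moving averages track activation probabilities p_i, p_j
    and the coactivation probability p_ij; the weight is the log-odds
    ratio log(p_ij / (p_i * p_j)) and the bias is log(p_j).  eps
    regularizes the logs (an illustrative choice, not the exact rule)."""
    p_i = p_j = p_ij = eps
    for x, y in zip(pre, post):
        p_i = (1 - alpha) * p_i + alpha * x
        p_j = (1 - alpha) * p_j + alpha * y
        p_ij = (1 - alpha) * p_ij + alpha * (x * y)
    w = np.log((p_ij + eps) / (p_i * p_j + eps))  # >0 correlated, <0 anti-correlated
    bias = np.log(p_j + eps)                       # intrinsic excitability term
    return w, bias

rng = np.random.default_rng(0)
x = rng.random(2000) < 0.3          # binary unit active ~30% of the time
w_corr, _ = bcpnn_traces(x, x)      # unit paired with itself: correlated
w_anti, _ = bcpnn_traces(x, ~x)     # paired with its complement: anti-correlated
```

This log-odds form also makes the weakening effect described in the excerpt intuitive: if unit i later coactivates with additional items, p_i grows relative to p_ij for the original pairing, shrinking that weight.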
“…Short-term synaptic plasticity has been supported as a candidate mechanism for storing information in WM [ 24 , 25 ]. Moreover, fast-expressing Hebbian synaptic plasticity may modify network connectivity momentarily enough to support information storage in WM [ 74 76 ]. Arguably, Hebbian forms of synaptic plasticity are incompatible with the flexible functionality of WM, because they induce long-lasting changes in synaptic connections that generally outlast the duration of persistent activity.…”
Section: Discussion
confidence: 99%
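The flexibility concern raised in the excerpt above is commonly addressed by making the Hebbian change fast-expressing but volatile: coactivation potentiates a weight quickly, and a leak term lets the change fade on a working-memory timescale instead of persisting like classical LTP. A toy illustration of that dynamic (all constants are arbitrary, not fitted to any model discussed here):

```python
def fast_hebbian_step(w, pre, post, eta=0.5, tau=10.0, dt=1.0):
    """One Euler step of a fast but decaying Hebbian weight:
    the eta*pre*post term expresses potentiation rapidly on
    coactivation, while the -w/tau leak erases it over ~tau steps."""
    return w + dt * (eta * pre * post - w / tau)

w = 0.0
w = fast_hebbian_step(w, pre=1.0, post=1.0)  # one pairing: rapid potentiation
w_peak = w
for _ in range(30):                           # silence: the trace fades
    w = fast_hebbian_step(w, pre=0.0, post=0.0)
```

After a single pairing the weight jumps to eta*dt, and thirty silent steps shrink it well below a tenth of its peak, so the network can be rapidly rebound to new content.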
“…Importantly, both reduced non-spiking and biologically detailed spiking realizations of BCPNN perform similar functions. They have been extensively used to model brain-like cognitive capabilities such as associative memory (Johansson and Lansner, 2007 ; Lundqvist et al, 2011 ), episodic memory (Chrysanthidis et al, 2021 ), and working memory (Fiebig and Lansner, 2017 ; Fiebig et al, 2020 ), which play a key role in human intelligence. In a broader perspective, we suggest that these advancements in simulating different aspects of human cognitive function within a system framework of brain-like BCPNN constitute a promising direction in the development of artificial general intelligence (AGI).…”
Section: Introduction
confidence: 99%