A Computational Model of Working Memory Based on Spike-Timing-Dependent Plasticity

2021 | DOI: 10.3389/fncom.2021.630999
Abstract: Working memory is closely involved in various cognitive activities, but its neural mechanism is still under exploration. The long-standing mainstream view holds that persistent activity is the neural basis of working memory, but recent experiments have shown that activity-silent memories can also be recalled correctly. The mechanism underlying activity-silent memory is considered an alternative scheme that rejects the persistent-activity theory. We propose a working memory model based on spike-timing-de…

Cited by 8 publications (3 citation statements) | References 50 publications
“…Hebbian changes in synaptic strengths can form memory associations in the brain that are transient and, importantly, can be overwritten in a use-dependent manner (21). Accordingly, a number of computational models of WM implement Hebbian synaptic plasticity to simulate a range of cognitive memory effects (22–25). To date, little direct neural evidence has supported these rapidly associative neural codes, partly because methods have not yet been developed to detect and model them.…”
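The transient, use-dependent Hebbian association described in the quote above can be illustrated with a minimal sketch. The learning rate and decay constant below are arbitrary illustrative values, not parameters from any of the cited models; the point is only that a decaying Hebbian trace is overwritten when a new pattern is rehearsed while an old one goes unused.

```python
import numpy as np

def hebbian_step(w, pre, post, lr=0.5, decay=0.1):
    """One update of a transient Hebbian weight matrix.

    w     : (n, n) synaptic weights
    pre   : (n,) presynaptic activity (0 or 1)
    post  : (n,) postsynaptic activity (0 or 1)
    lr    : Hebbian learning rate (illustrative value)
    decay : passive decay toward zero, making the trace transient
    """
    w = (1.0 - decay) * w            # unused associations fade
    w += lr * np.outer(post, pre)    # co-active pairs are strengthened
    return np.clip(w, 0.0, 1.0)

n = 4
w = np.zeros((n, n))

# Store pattern A (neurons 0 and 1 co-active): an association forms.
a = np.array([1, 1, 0, 0], dtype=float)
for _ in range(5):
    w = hebbian_step(w, a, a)

# Use-dependent overwriting: rehearsing pattern B (neurons 2 and 3)
# builds the B association while the unused A association decays.
b = np.array([0, 0, 1, 1], dtype=float)
for _ in range(30):
    w = hebbian_step(w, b, b)
```

After the second loop, `w[2, 3]` dominates `w[0, 1]`: the earlier association has not been actively erased, it has simply decayed for lack of use, which is the sense in which such traces are "overwritten in a use-dependent manner."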
“…Moreover, this is an important property when the network needs to learn an entirely new input or when the network needs to recover from a partial lesion. Computational models with synaptic plasticity avoid this problem either by all-to-all connectivity (Rolls et al., 2013; Huang and Wei, 2021)—which is computationally demanding on the one hand and not comparable to the brain, where connectivity is only sparse, on the other—or they require the creation of a fixed number of synapses between random neurons (Szatmáry and Izhikevich, 2010; Savin and Triesch, 2014; Fiebig and Lansner, 2017) at the beginning, drastically limiting the ability to form memories between neuron pairs. Our approach overcomes this limitation, allowing connections to form between all neurons and letting the range over which neurons connect be tuned through our Gaussian probability parameter σ.…”
Section: Comparison Between Homeostatic Engram Formation and Synaptic...
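The distance-dependent wiring rule mentioned in the quote above (a connection probability controlled by a Gaussian parameter σ) can be sketched as follows. The layout, neuron count, and σ values here are hypothetical choices for illustration; the citing paper's actual rule may differ in form.

```python
import numpy as np

rng = np.random.default_rng(0)

def gaussian_connect(positions, sigma, rng):
    """Draw a random connectivity matrix in which neuron pairs connect
    with probability exp(-d^2 / (2 * sigma^2)), d = Euclidean distance.
    A smaller sigma restricts wiring to nearby neurons (sparser network);
    a larger sigma approaches all-to-all connectivity."""
    d = np.linalg.norm(positions[:, None, :] - positions[None, :, :], axis=-1)
    p = np.exp(-d**2 / (2.0 * sigma**2))
    np.fill_diagonal(p, 0.0)  # no self-connections
    n = len(positions)
    return rng.random((n, n)) < p

# 100 neurons placed uniformly on a unit square (hypothetical layout).
pos = rng.random((100, 2))
local = gaussian_connect(pos, sigma=0.05, rng=rng)  # mostly local wiring
broad = gaussian_connect(pos, sigma=0.5, rng=rng)   # near all-to-all
local_edges = local.sum()
broad_edges = broad.sum()
```

Tuning σ thus interpolates between the two extremes criticized in the quote: sparse local wiring and computationally expensive all-to-all connectivity.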