2019
DOI: 10.1101/846675
Preprint

Flexible working memory through selective gating and attentional tagging

Abstract: Working memory is essential for intelligent behavior as it serves to guide behavior of humans and nonhuman primates when task-relevant stimuli are no longer present to the senses. Moreover, complex tasks often require that multiple working memory representations can be flexibly and independently maintained, prioritized, and updated according to changing task demands. Thus far, neural network models of working memory have been unable to offer an integrative account of how such control mechanisms are implemented…

Cited by 8 publications (9 citation statements)
References 106 publications
“…A hallmark of WM in the real world is the ability to flexibly respond to unpredictable changes in environmental exigencies. Thus, an important future goal will be to extend the present work to a network with separate modules with different connectivity patterns and governed by different learning rules (e.g., Kruijne et al, 2020; O’Reilly & Frank, 2006), and to a task that requires truly flexible behavior.…”
Section: Discussion
confidence: 99%
“…A simpler version of this task, where X and Y were relevant only if they directly followed A or B, respectively, and where fewer irrelevant letters occurred in the input, was solved in O'Reilly and Frank, 2006; Martinolli et al., 2018; and Kruijne et al., 2020 through biologically inspired artificial neural network models that were endowed with special working memory modules. Note that for this simpler version no lower-order working memory is needed, because one just has to wait for an immediate transition from A to X in the input sequence, or for an immediate transition from B to Y.…”
Section: Results
confidence: 99%
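The simpler task variant quoted above can be sketched in a few lines: a symbol is a target exactly when it immediately follows its cue. This is a minimal illustrative sketch of the task rule only, not the cited models' implementation; the function name and target-index framing are assumptions.

```python
def targets(sequence):
    """Return indices of target symbols in the simpler task variant:
    X is a target only when it directly follows A, and Y only when it
    directly follows B. No deeper working memory is required, since a
    single-step lookback suffices (an illustrative sketch)."""
    hits = []
    for i in range(1, len(sequence)):
        prev, cur = sequence[i - 1], sequence[i]
        if (prev, cur) in {("A", "X"), ("B", "Y")}:
            hits.append(i)
    return hits
```

Because the rule depends only on the immediately preceding symbol, the sketch makes concrete why this variant needs no maintained context beyond one time step.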
“…Each memory slot in the buffer is represented by m_i ∈ R^{M_d × 1} (M_d is the dimensionality of the memory slot). This component of the action-motor model is inspired by the working memory model proposed in [29]. Concretely, the self-recurrent slot buffer operates according to the following dynamics: where Q_i ∈ R^{M_d × D_z} is the ith random projection matrix (sampled from a centered Gaussian distribution in this paper), which means there is one projection matrix per working memory slot.…”
Section: Neural Generative Motor Control
confidence: 99%
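The per-slot projection structure described in the statement above can be sketched as follows. The excerpt specifies one centered-Gaussian projection matrix Q_i per slot and slot states m_i of shape (M_d, 1), but the recurrence equation itself is not reproduced in the excerpt; the leaky-integrator update below is an illustrative assumption, not the published dynamics, and all names (SlotBuffer, leak) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

class SlotBuffer:
    """Sketch of a self-recurrent slot buffer with one random projection
    matrix per working memory slot (structure from the excerpt; the
    update rule is an assumed leaky integrator)."""

    def __init__(self, n_slots, m_d, d_z, leak=0.9):
        # One centered-Gaussian projection matrix Q_i in R^{M_d x D_z}
        # per slot, as described in the excerpt.
        self.Q = [rng.standard_normal((m_d, d_z)) for _ in range(n_slots)]
        # Slot states m_i in R^{M_d x 1}, initialized to zero.
        self.m = [np.zeros((m_d, 1)) for _ in range(n_slots)]
        self.leak = leak

    def step(self, z):
        # z: input of shape (D_z, 1). Each slot mixes its recurrent
        # state with its own projection of the input (assumed dynamics).
        for i, Q_i in enumerate(self.Q):
            self.m[i] = self.leak * self.m[i] + (1 - self.leak) * (Q_i @ z)
        return self.m
```

Drawing an independent Q_i per slot keeps the slots' contents decorrelated even when they receive the same input, which is the design point the excerpt emphasizes.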