2021
DOI: 10.1016/j.isci.2021.102919

A geometric framework for understanding dynamic information integration in context-dependent computation

Abstract: The prefrontal cortex (PFC) plays a prominent role in performing flexible cognitive functions and working memory, yet the underlying computational principle remains poorly understood. Here, we trained a rate-based recurrent neural network (RNN) to explore how context rules are encoded, maintained across a seconds-long mnemonic delay, and subsequently used in a context-dependent decision-making task. The trained networks replicated key experimentally observed features in the PFC of rodent a…
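For context, the rate-based RNN referred to in the summary is typically governed by leaky firing-rate dynamics of the form τẋ = −x + W_rec φ(x) + W_in u with a linear read-out. The sketch below is a generic illustration under that assumption; the weight names, tanh nonlinearity, and time constants are not taken from the paper.

```python
# Minimal sketch (a generic rate RNN, not the paper's exact model):
#   tau * dx/dt = -x + W_rec @ phi(x) + W_in @ u,   read-out z = W_out @ phi(x)
# where u carries the context cue and sensory evidence for the decision task.
import numpy as np

def simulate_rate_rnn(W_rec, W_in, W_out, u_seq, dt=0.02, tau=0.1):
    """Euler-integrate the rate RNN over an input sequence u_seq of shape (T, n_in)."""
    phi = np.tanh
    x = np.zeros(W_rec.shape[0])            # hidden state (e.g., synaptic currents)
    outputs = []
    for u in u_seq:
        dx = (-x + W_rec @ phi(x) + W_in @ u) / tau
        x = x + dt * dx                      # Euler step
        outputs.append(W_out @ phi(x))       # linear read-out of firing rates
    return np.array(outputs)
```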

Cited by 12 publications (27 citation statements)
References 80 publications

“…Fixed-point analysis. Similar to our previous analysis 21,22, we identified fixed points or slow points of the performance-optimized PFC-MD model by numerically solving the optimization problem (https://github.com/mattgolub/fixed-point-finder) min_𝐱ⱼ q(𝐱ⱼ), where q(𝐱ⱼ) = ‖−𝐱ⱼ + 𝐖_rec φ(𝐱ⱼ) + [𝐖_in, 𝐕_in]·𝐮‖².…”
Section: Modeling Phasic Changes In MD Neuronal Firing (supporting)
confidence: 87%
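The quoted passage searches for fixed or slow points by minimizing the squared speed of the recurrent dynamics. A minimal sketch of that idea is given below, assuming the same leaky-tanh rate dynamics as in the sketch above; it illustrates the objective q(x) only and is not the fixed-point-finder package's actual interface.

```python
# Minimal sketch (an assumption, not the authors' code or the fixed-point-finder API):
# locate candidate fixed/slow points of  tau * dx/dt = -x + W_rec @ phi(x) + W_in @ u
# by gradient descent on q(x) = || -x + W_rec @ phi(x) + W_in @ u ||^2.
import numpy as np

def find_fixed_point(W_rec, W_in, u, x0, lr=0.01, n_steps=5000, tol=1e-10):
    """Return a candidate fixed/slow point and its final q-value."""
    phi = np.tanh
    x = x0.copy()
    for _ in range(n_steps):
        r = phi(x)
        F = -x + W_rec @ r + W_in @ u                # dx/dt (up to 1/tau) at constant input u
        q = F @ F
        if q < tol:                                  # slow enough -> accept as fixed point
            break
        # dq/dx = 2 * J^T F, with Jacobian J = -I + W_rec * diag(phi'(x))
        J = -np.eye(len(x)) + W_rec * (1.0 - r**2)   # column j scaled by phi'(x_j)
        x -= lr * (2.0 * J.T @ F)
    return x, q

# Usage: seed the search from states visited during the mnemonic delay, e.g.
# x_fp, q_val = find_fixed_point(W_rec, W_in, u=np.zeros(n_inputs), x0=x_delay_sample)
```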
“…4a-f). From a neural trajectory p(t), we further defined the time-varying neural velocity as the rate of change of the neural trajectory 21: v(t) = ‖ṗ(t)‖ (Fig. 4g).…”
Section: Methods (mentioning)
confidence: 99%
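For concreteness, the velocity defined in the quoted passage can be estimated directly from a sampled trajectory; the sketch below is an illustrative finite-difference version, not the cited paper's code.

```python
# Minimal sketch (illustrative, not the cited paper's code): neural velocity
# v(t) = ||p-dot(t)|| computed from a sampled neural trajectory p(t).
import numpy as np

def neural_velocity(p, dt=1.0):
    """p: (T, N) array, trajectory of N units (or principal components) over T bins.
    Returns a (T-1,) array of trajectory speeds."""
    dp = np.diff(p, axis=0) / dt           # finite-difference estimate of p-dot
    return np.linalg.norm(dp, axis=1)      # Euclidean norm at each step

# Toy usage: a unit circle traversed at constant speed gives v(t) ~= 1
# t = np.linspace(0, 2 * np.pi, 500)
# v = neural_velocity(np.stack([np.cos(t), np.sin(t)], axis=1), dt=t[1] - t[0])
```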
“…Our demonstration of visual grid patterns produced by a CNN-RNN model suggests that the computational principles of grid computation extend beyond the velocity-driven path-integration task. RNNs have provided a dynamical-systems viewpoint on motor movement (Sussillo et al. 2015; Michaels et al. 2016), path integration (Cueva and Wei 2018; Sorscher et al. 2020), information integration (Zhang et al. 2021), and predictive representations (Recanatesi et al. 2021). We envision that the recurrent dynamics of RNNs can be biologically implemented by neural substrates in a wide range of cortical networks outside the traditional hippocampus-entorhinal system.…”
Section: Discussion (mentioning)
confidence: 99%
“…A biological brain is a large-scale neuronal network with recurrent connections that performs computations underlying complex task behaviors. In recent years, recurrent neural networks (RNNs) have been widely used to model a wide range of neural circuits, such as the prefrontal cortex (PFC), parietal cortex, and primary motor cortex (M1), in various cognitive and motor tasks [1][2][3][4][5][6][7][8][9]. Different models make different assumptions: leaky integrate-and-fire (LIF) vs. conductance-based compartmental models, and rate-based vs. spiking models.…”
Section: Introduction (mentioning)
confidence: 99%