2018
DOI: 10.1016/j.neuron.2018.07.003
Linking Connectivity, Dynamics, and Computations in Low-Rank Recurrent Neural Networks

Abstract: Large-scale neural recordings have established that the transformation of sensory stimuli into motor outputs relies on low-dimensional dynamics at the population level, while individual neurons exhibit complex selectivity. Understanding how low-dimensional computations on mixed, distributed representations emerge from the structure of the recurrent connectivity and inputs to cortical networks is a major challenge. Here, we study a class of recurrent network models in which the connectivity is a sum of a random…
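The abstract's central object, a low-rank recurrent connectivity, can be illustrated with a minimal sketch. This is our own toy example, not code from the paper: a rank-one matrix J = m nᵀ / N driving standard rate dynamics dx/dt = −x + J φ(x), with all parameter values chosen arbitrarily for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 500                      # network size
m = rng.normal(size=N)      # left connectivity vector
n = rng.normal(size=N)      # right connectivity vector
J = np.outer(m, n) / N      # rank-one recurrent connectivity

def simulate(T=200, dt=0.1):
    """Euler-integrate the rate dynamics dx/dt = -x + J @ tanh(x)."""
    x = rng.normal(size=N)
    for _ in range(T):
        x = x + dt * (-x + J @ np.tanh(x))
    return x

x = simulate()
# With rank-one connectivity, the recurrent input to every neuron is
# proportional to m, so the collective state reduces to one scalar latent:
kappa = n @ np.tanh(x) / N
```

The key point the sketch makes concrete is that the N-dimensional dynamics are driven through a single direction (m), which is why such networks produce low-dimensional population activity.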


Cited by 286 publications (501 citation statements)
References 55 publications (129 reference statements)
“…Asymmetric couplings have been previously used to generate specific temporal dynamics, though in the absence of noise. Examples include models of temporal sequences [27,28] or recurrent networks within the echo-state/reservoir computing framework [78-81]. All these models are fundamentally different from ours, as their low-rank structure is fixed and time-independent, hence their temporal dynamics entail no trial-to-trial variability.…”
Section: Correlated Variability in Sensory vs Motor Processing
confidence: 99%
“…To investigate which factors influence learning within and outside of the neural manifold, we used a recurrent neural network trained with a machine learning algorithm. Although this method of modelling neural dynamics lacks biological details, using RNNs has been surprisingly useful for understanding neural phenomena [Sussillo et al, 2015, Barak, 2017, Mastrogiuseppe and Ostojic, 2018, Michaels et al, 2019, Masse et al, 2019]. Using this approach, we could identify feedback learning as a potential bottleneck differentiating between learning within versus outside the original manifold.…”
Section: Discussion
confidence: 99%
“…One potential reason is that most experimental designs are inherently low-dimensional, and therefore bias the observed neural activity [Gao et al, 2017]. Alternatively, it has been shown that low-dimensional dynamics can arise from structured connectivity within the network [Sussillo and Abbott, 2009, Laje and Buonomano, 2013, Hu et al, 2014, Hennequin et al, 2014, Aljadeff et al, 2016, Rivkind and Barak, 2017, Mastrogiuseppe and Ostojic, 2018, DePasquale et al, 2018, Recanatesi et al, 2019].…”
Section: Introduction
confidence: 99%
“…where the u^(i) and v^(i) are N-dimensional vectors. Connectivity matrices of this form are called low-rank connectivities (Mastrogiuseppe and Ostojic, 2018; Hopfield, 1982). They belong to the broader class of non-normal matrices (Trefethen and Embree, 2005; Murphy and Miller, 2009; Goldman, 2009; Hennequin et al, 2012; see Methods) and can produce amplified transient dynamics in response to specific stimuli.…”
Section: A Recurrent Network Model for OFF Responses: A Dynamical System
confidence: 99%
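The excerpt above notes that low-rank connectivities are non-normal and can transiently amplify responses to specific stimuli despite being stable. A minimal numeric sketch of that effect, using our own toy 2×2 matrix rather than anything from the cited work: both eigenvalues are −1, yet the feedforward coupling k makes the activity norm grow transiently before decaying.

```python
import numpy as np

k = 10.0
# Non-normal but stable: both eigenvalues are -1, so activity must
# eventually decay, but the feedforward weight k feeds the second
# mode into the first and transiently amplifies the norm.
A = np.array([[-1.0, k],
              [0.0, -1.0]])

x = np.array([0.0, 1.0])    # unit stimulus along the "source" mode
dt = 0.01
norms = [np.linalg.norm(x)]
for _ in range(500):         # Euler-integrate dx/dt = A x up to t = 5
    x = x + dt * (A @ x)
    norms.append(np.linalg.norm(x))

peak = max(norms)            # transient amplification above the initial norm
```

A normal matrix with the same eigenvalues (e.g. −I) would give a monotonically decaying norm; the transient peak here is entirely due to non-normality.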
“…This feature is preserved even when the fitting is performed on a smaller subset of units (subsampling 50% and 20% of the units). In A.-C. we used a recurrent connectivity J = J_low-rank + gχ, given by the sum of a low-rank connectivity (with rank 30) and a Gaussian connectivity with mean zero and standard deviation g/√N (with g = 0.2), which acts as connectivity noise (Mastrogiuseppe and Ostojic, 2018; Bondanelli and Ostojic, 2018). 20 OFF responses were generated by setting the state before stimulus offset along the first 20 amplified initial conditions.…”
Section: Recurrent Model
confidence: 99%
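The construction quoted above, a rank-30 term plus Gaussian "connectivity noise" of standard deviation g/√N with g = 0.2, can be sketched as follows. The 1/N normalization of the low-rank term and the choice of N are our assumptions; the excerpt does not specify them.

```python
import numpy as np

rng = np.random.default_rng(1)
N, rank, g = 400, 30, 0.2

# Rank-30 structured part: sum of 30 outer products of random vectors
# (normalized by N -- an assumption, not stated in the excerpt).
U = rng.normal(size=(N, rank))
V = rng.normal(size=(N, rank))
J_low_rank = (U @ V.T) / N

# Full-rank connectivity noise: i.i.d. Gaussian entries with mean zero
# and standard deviation 1/sqrt(N), scaled by g as in the excerpt.
chi = rng.normal(scale=1.0 / np.sqrt(N), size=(N, N))
J = J_low_rank + g * chi
```

The resulting J is full rank because of the noise term, but for small g its dynamics remain dominated by the 30 structured directions, which is what allows the fitted low-rank description to survive subsampling.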