2022
DOI: 10.48550/arxiv.2201.09916
Preprint

Input correlations impede suppression of chaos and learning in balanced rate networks

Abstract: Neural circuits exhibit complex activity patterns, both spontaneously and evoked by external stimuli. Information encoding and learning in neural circuits depend on how well time-varying stimuli can control spontaneous network activity. We show that in firing-rate networks in the balanced state, external control of recurrent dynamics, i.e., the suppression of internally-generated chaotic variability, strongly depends on correlations in the input. A unique feature of balanced networks is that, because common ex…

Cited by 2 publications (2 citation statements)
References 30 publications (106 reference statements)
“…Noise-induced enhancement of chaos. Previous theoretical work found a noise-induced suppression of chaos in random neural networks driven by time-varying inputs, both in discrete time [28] and continuous time [27, 29, 24, 22, 30]. In previous cases, featuring a mean synaptic input centered in the middle of the high-gain region of the transfer function, suppression of chaos occurs because an increase in the variance drives the network away from the chaotic regime.…”
Section: Discussion
confidence: 99%
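The suppression mechanism referenced in this citation statement can be illustrated numerically. Below is a minimal sketch, assuming the classical random rate model dx/dt = -x + g·J·tanh(x) + I(t) (the standard non-balanced setting of the cited prior work, not the balanced network studied in this preprint), with a common sinusoidal drive and a Benettin-style estimate of the largest Lyapunov exponent; all parameter values (g, N, input amplitude and frequency) are illustrative choices, not taken from the paper.

```python
import numpy as np

def lyapunov_estimate(g=1.5, N=200, T=200.0, dt=0.1,
                      input_amp=0.0, input_freq=0.1, seed=0):
    """Benettin-style estimate of the largest Lyapunov exponent of the
    rate network dx/dt = -x + g*J*tanh(x) + I(t), driven by the common
    input I(t) = input_amp * sin(2*pi*input_freq*t)."""
    rng = np.random.default_rng(seed)
    J = rng.normal(0.0, 1.0 / np.sqrt(N), (N, N))  # random coupling matrix
    x = rng.normal(0.0, 1.0, N)

    # discard an initial transient before measuring divergence
    n_trans = int(50.0 / dt)
    for k in range(n_trans):
        t = k * dt
        I = input_amp * np.sin(2 * np.pi * input_freq * t)
        x = x + dt * (-x + g * (J @ np.tanh(x)) + I)

    # perturbed companion trajectory at exact distance d0
    d0 = 1e-7
    y = x + rng.normal(0.0, 1.0, N)
    y = x + (y - x) * (d0 / np.linalg.norm(y - x))

    steps = int(T / dt)
    log_growth = 0.0
    for k in range(steps):
        t = (n_trans + k) * dt  # continue time from the transient phase
        I = input_amp * np.sin(2 * np.pi * input_freq * t)
        x = x + dt * (-x + g * (J @ np.tanh(x)) + I)
        y = y + dt * (-y + g * (J @ np.tanh(y)) + I)
        d = np.linalg.norm(y - x)
        log_growth += np.log(d / d0)
        y = x + (y - x) * (d0 / d)  # renormalize the perturbation
    return log_growth / (steps * dt)

# chaotic autonomous network (g > 1) vs. the same network under common drive
lam_free = lyapunov_estimate(input_amp=0.0)
lam_driven = lyapunov_estimate(input_amp=2.0)
print(lam_free, lam_driven)
```

In this classical setting, the exponent of the driven network drops below that of the autonomous one: the common input pushes units out of the high-gain region of tanh, reducing the effective gain and suppressing chaos. The preprint's point is that in balanced networks this picture changes, because correlated input components are dynamically cancelled by the recurrent balance.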