2022
DOI: 10.1101/2022.03.11.483899
Preprint

Synapse-type-specific competitive Hebbian learning forms functional recurrent networks

Abstract: Cortical networks exhibit complex stimulus-response patterns. Previous work has identified the balance between excitatory and inhibitory currents as a central component of cortical computations, but has not considered how the required synaptic connectivity emerges from biologically plausible plasticity rules. Using theory and modeling, we demonstrate how a wide range of cortical response properties can arise from Hebbian learning that is stabilized by the synapse-type-specific competition for synaptic resource…

Cited by 4 publications (15 citation statements)
References 193 publications (404 reference statements)
“…The point at which the feedforward weights converge is fully determined by the covariance matrix of the presynaptic neurons’ activities. Specifically, it has been demonstrated [6, 20, 24] that the fixed points of the weight dynamics are eigenvectors of the modified covariance matrix, where E and I are the firing rates of the presynaptic excitatory and inhibitory neurons, respectively, and ⟨·⟩ stands for the average over time, Fig. 1b.…”
Section: Results
confidence: 99%
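The fixed-point result quoted above — Hebbian weight dynamics converging to an eigenvector of the input covariance matrix — can be illustrated with a minimal sketch using Oja's normalized Hebbian rule. The synthetic inputs, single postsynaptic unit, and learning rate below are illustrative assumptions; they stand in for, and are much simpler than, the preprint's synapse-type-specific model with excitatory and inhibitory presynaptic populations.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic presynaptic activity: isotropic noise plus one strong shared
# component along direction v, so the input covariance has a clear leading
# eigenvector (v itself). These inputs are illustrative, not the preprint's.
n_inputs, n_samples = 5, 20000
v = np.ones(n_inputs) / np.sqrt(n_inputs)
x = rng.normal(size=(n_samples, n_inputs))
x += 3.0 * rng.normal(size=(n_samples, 1)) * v   # covariance ≈ I + 9 v vᵀ

# Oja's rule: a normalized Hebbian update dw = eta * y * (x - y * w) whose
# stable fixed point is the leading eigenvector of the input covariance.
w = rng.normal(size=n_inputs)
w /= np.linalg.norm(w)
eta = 1e-3
for xi in x:
    y = w @ xi                      # postsynaptic activity
    w += eta * y * (xi - y * w)     # Hebbian growth + implicit normalization

alignment = abs(w @ v)
print(f"|w·v| = {alignment:.3f}, |w| = {np.linalg.norm(w):.3f}")
```

After training, the weight vector aligns with the leading eigenvector and settles near unit norm, matching the fixed-point characterization in the quoted passage.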
“…Input selectivity is a universal attribute of brain networks that is maintained across brain hierarchies, including in brain areas that only receive input from highly recurrent networks. Here, we demonstrated that two ubiquitous features of biological networks, namely internal noise and recurrent connectivity between different sub-networks, can impact the statistics of inputs coming from a population in ways that completely prevent known plasticity mechanisms [6, 20, 21, 24] from forming any kind of input selectivity in neurons found in higher areas.…”
Section: Discussion
confidence: 99%
“…These properties make SSNs particularly susceptible to dynamical instabilities resulting in run-away excitation, thus rendering their training highly challenging. Indeed, in the few cases in which the training of SSNs was attempted, either noiseless neurons were used [27, 28], or the network was so heavily under-parameterized that its expressivity was substantially limited [29].…”
Section: Introduction
confidence: 99%
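The run-away excitation mentioned in this statement stems from the SSN's supralinear (power-law) transfer function, which must be tamed by fast inhibition. A minimal two-unit E/I sketch below uses parameter values loosely following those commonly used in the SSN literature — they are assumptions for illustration, not taken from [27–29] or from the preprint.

```python
import numpy as np

# Minimal two-unit (E, I) stabilized supralinear network (SSN) sketch:
#   tau * dr/dt = -r + k * relu(W r + h)**n
# The supralinear exponent n > 1 makes excitation self-amplifying; the
# inhibitory unit (faster time constant) stabilizes the dynamics.
# All parameter values here are illustrative assumptions.
k, n = 0.04, 2.0
tau = np.array([0.020, 0.010])          # E and I time constants (s)
W = np.array([[1.25, -0.65],            # E←E, E←I
              [1.20, -0.50]])           # I←E, I←I

def simulate(h, T=0.5, dt=1e-4):
    """Euler-integrate the SSN to (approximate) steady state for input h."""
    r = np.zeros(2)
    for _ in range(int(T / dt)):
        drive = np.clip(W @ r + h, 0.0, None)   # rectified net input
        r = r + dt / tau * (-r + k * drive**n)
    return r

r_weak = simulate(np.array([2.0, 2.0]))      # weak external drive
r_strong = simulate(np.array([20.0, 20.0]))  # strong external drive
print("weak:", r_weak, "strong:", r_strong)
```

With inhibition fast and strong enough, both inputs settle to finite rates rather than diverging; removing or slowing the inhibitory unit in this sketch is what produces the run-away excitation the quoted passage warns about.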