2016
DOI: 10.3389/fncom.2016.00093

Structural Plasticity Denoises Responses and Improves Learning Speed

Abstract: Despite an abundance of computational models for learning of synaptic weights, there has been relatively little research on structural plasticity, i.e., the creation and elimination of synapses. In particular, it is not clear how structural plasticity works in concert with spike-timing-dependent plasticity (STDP) and what advantages their combination offers. Here we present a fairly large-scale functional model that uses leaky integrate-and-fire neurons, STDP, homeostasis, recurrent connections, and structural plasticity. …
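The mechanism the abstract describes, STDP acting on synaptic weights while structural plasticity creates and eliminates synapses, can be sketched in a few lines. The sketch below is a minimal illustration, not the paper's implementation: the pruning threshold W_PRUNE, growth probability P_GROW, and the pair-based STDP constants are all assumed values.

```python
import numpy as np

rng = np.random.default_rng(0)

N_PRE, N_POST = 50, 20
A_PLUS, A_MINUS = 0.01, 0.012   # STDP amplitudes (assumed)
TAU = 20.0                      # STDP time constant in ms (assumed)
W_PRUNE, W_INIT, W_MAX = 0.02, 0.05, 1.0
P_GROW = 0.001                  # per-step synapse-creation probability (assumed)

# Weight matrix; NaN marks "no synapse exists".
w = np.full((N_PRE, N_POST), np.nan)
w[rng.random(w.shape) < 0.1] = W_INIT   # sparse initial connectivity

def stdp_update(w, dt):
    """Pair-based STDP on existing synapses; dt[i, j] = t_post_j - t_pre_i (ms)."""
    exists = ~np.isnan(w)
    dw = np.where(dt > 0,
                  A_PLUS * np.exp(-dt / TAU),    # pre before post: potentiate
                  -A_MINUS * np.exp(dt / TAU))   # post before pre: depress
    w[exists] = np.clip(w[exists] + dw[exists], 0.0, W_MAX)
    return w

def structural_plasticity(w):
    """Eliminate weak synapses; create new candidates at random."""
    w[w < W_PRUNE] = np.nan                      # elimination
    vacant = np.isnan(w)
    grow = vacant & (rng.random(w.shape) < P_GROW)
    w[grow] = W_INIT                             # creation
    return w

for step in range(1000):
    dt = rng.normal(0.0, 15.0, size=w.shape)     # toy spike-time differences
    w = stdp_update(w, dt)
    w = structural_plasticity(w)

print(f"synapses after learning: {np.sum(~np.isnan(w))}")
```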

Cited by 20 publications (24 citation statements) | References 39 publications (59 reference statements)
“…The other design tension we encounter arises between learning flexibility and the desirability of pruning candidate synapses for the sake of noise reduction. The noise reduction theme arises quite consistently across computational studies of structural plasticity: we found that candidate synapses are a noise source ( Table 2, rows 10 and 11), Spiess et al (30) found that well-pruned networks are less noisy and therefore learn faster than their unpruned counterparts, and Knoblaugh et al (29) found that networks have greater information capacity per synapse when some synapses can be pruned altogether rather than simply weakened. In our system, the allowance of a limited population of weak exploratory candidates is sufficient to allow for learning flexibility, but not enough to enable rapid learning.…”
Section: Discussion (supporting)
confidence: 60%
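As an illustration of the trade-off this statement describes, the sketch below maintains a small pool of weak exploratory candidate synapses, consolidates the few that repeatedly see correlated activity, and replaces the rest. The pool size, thresholds, and correlation probabilities are all hypothetical, chosen only to make the dynamics visible.

```python
import numpy as np

rng = np.random.default_rng(1)

N_CANDIDATES = 10            # size of the exploratory pool (assumed)
W_CAND = 0.05                # weak initial candidate weight
W_CONSOLIDATE = 0.3          # consolidation threshold (assumed)
LEARN_RATE = 0.02
DECAY = 0.01

# Each candidate has some per-step probability of seeing correlated
# pre/post activity; "useful" contacts see it often.
p_corr = np.full(N_CANDIDATES, 0.005)
p_corr[:2] = 0.2             # two candidates happen to be useful

candidates = np.full(N_CANDIDATES, W_CAND)
n_consolidated = 0

for step in range(5000):
    correlated = rng.random(N_CANDIDATES) < p_corr
    # Strengthen on correlation, otherwise relax toward the weak baseline.
    candidates += np.where(correlated, LEARN_RATE,
                           -DECAY * (candidates - W_CAND))

    # Consolidate strong candidates and replace them with fresh weak ones,
    # keeping the exploratory pool (and its noise contribution) bounded.
    done = candidates >= W_CONSOLIDATE
    n_consolidated += int(done.sum())
    candidates[done] = W_CAND
    p_corr[done] = 0.005     # replacement candidates start out "unproven"

print(f"consolidated synapses: {n_consolidated}")
print(f"residual candidate noise (summed weak weights): {candidates.sum():.3f}")
```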
“…The major outcome of our comparisons of the LASG, high-candidate, and low-candidate regimes is that candidate synapses in general, and their LASG-driven generation in particular, are detrimental to response specificity. This is the result we would expect, considering biologically observed pruning and previous theoretical work demonstrating pruning's noise-reduction benefits (29, 30). Specifically, in PING's presence, for all four input orderings, applied to both ISNs, response specificity is higher for both no-LASG conditions than for the LASG condition (Table 2, rows 3 and 4).…”
Section: Learning-accelerated Spine Generation Overcomes Anterograde (supporting)
confidence: 56%
“…In [20], Zambrano et al. presented a network of adaptive spiking neurons in which the neurons encode information in spike trains using a form of Asynchronous Pulsed Sigma-Delta coding, and the authors demonstrated that the proposed network responds an order of magnitude faster and uses an order of magnitude fewer spikes. The structural plasticity mechanism proposed in [21], which was shown to improve the learning speed of SNNs, also takes neural conductance into account: the authors designed conductance variation models for excitatory and inhibitory neurons, respectively. In [21], the conductance variation models are defined as negative exponential relationships with time only, and are nonlinearly fused with the spiking neuron model (the leaky integrate-and-fire, LIF, model).…”
Section: Introduction (mentioning)
confidence: 99%
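The "negative exponential relationship with time" described above corresponds to an exponentially decaying synaptic conductance, and the "nonlinear fusion" arises because the conductance multiplies the voltage-dependent driving force in the LIF equation. Below is a minimal sketch under assumed constants; none of these values are taken from [21].

```python
import numpy as np

# Illustrative constants (assumptions, not the values from [21]).
DT = 0.1                          # integration step (ms)
TAU_E, TAU_I = 5.0, 10.0          # conductance decay time constants (ms)
E_E, E_I, E_L = 0.0, -80.0, -70.0 # excitatory/inhibitory reversal and leak (mV)
V_TH, V_RESET = -54.0, -70.0      # spike threshold and reset (mV)
TAU_M = 20.0                      # membrane time constant (ms)
DG_E, DG_I = 0.5, 0.1             # conductance jump per input spike (re: leak)

v, g_e, g_i = E_L, 0.0, 0.0
rng = np.random.default_rng(2)
spikes = []

for step in range(int(500 / DT)):            # simulate 500 ms
    # Negative-exponential conductance decay: g(t) ~ exp(-t / tau).
    g_e -= DT * g_e / TAU_E
    g_i -= DT * g_i / TAU_I
    # Poisson input spikes bump the conductances.
    if rng.random() < 0.02:
        g_e += DG_E
    if rng.random() < 0.01:
        g_i += DG_I
    # Conductance-based LIF: the synaptic terms multiply (v - E), which is
    # where the nonlinear coupling to the neuron model comes from.
    dv = (-(v - E_L) - g_e * (v - E_E) - g_i * (v - E_I)) / TAU_M
    v += DT * dv
    if v >= V_TH:
        spikes.append(step * DT)
        v = V_RESET

print(f"{len(spikes)} spikes in 500 ms")
```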