2019 (preprint)
DOI: 10.1101/787911

A general principle of dendritic constancy – a neuron’s size and shape invariant excitability

Abstract highlights:
• A simple equation that predicts voltage in response to distributed synaptic inputs.
• Responses to distributed and clustered inputs are largely independent of dendritic length.
• Spike rates in various Hodgkin–Huxley (HH)-like or leaky integrate-and-fire (LIF) models are largely independent of morphology.
• Precise spike timing (firing pattern) depends on dendritic morphology.
• NeuroMorpho.Org database-wide analysis of the relation between dendritic morphology and electrophysiology.
• Our equations set…

Cited by 10 publications (30 citation statements); references 88 publications.
“…Cuntz et al (2019) [ 23 ] have shown that neuronal excitability in response to distributed synaptic input is invariant of size. This invariance is exact for a homogeneous passive cable and holds approximately for realistic heterogeneities in dendritic diameter, topology, input dynamics, and active properties.…”
Section: Results (mentioning; confidence: 99%)
“…Conversely, larger cells typically have lower input resistances, due to their increased spatial extent and membrane surface area, meaning that larger synaptic currents are necessary to induce the same voltage response and so bring a neuron to threshold (Rall, 1957; Mainen & Sejnowski, 1996). It has recently been shown by Cuntz et al (2019) that these two phenomena cancel each other exactly: the excitability of neurons receiving distributed excitatory synaptic inputs is largely invariant to changes in size and morphology. In addition, neurons possess several active mechanisms to help maintain firing-rate homeostasis, through both synaptic plasticity regulating inputs (Abbott & Nelson, 2000; Royer & Paré, 2003) and changes in membrane conductance regulating responses (Gorur-Shandilya et al, 2019).…”
(mentioning; confidence: 99%)
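The cancellation described in the statement above can be made concrete with a toy calculation: if synapse count grows with membrane area while input resistance shrinks inversely with area, the steady-state depolarisation from distributed inputs is independent of cell size. This is an illustrative sketch only, not code from the cited papers; the parameter values (`synapse_density`, `i_syn_pA`, `rho_MOhm_um2`) are assumed for demonstration.

```python
def steady_state_depolarisation(area_um2,
                                synapse_density=1.0,   # synapses per um^2 (assumed)
                                i_syn_pA=0.05,         # current per synapse (assumed)
                                rho_MOhm_um2=1e5):     # R_in * area, held constant (assumed)
    """Toy estimate of mean depolarisation from uniformly distributed synapses.

    Synapse number N scales with membrane area, while input resistance
    R_in scales as 1/area, so their product (the voltage response) is
    independent of cell size.
    """
    n_synapses = synapse_density * area_um2        # N grows with area
    r_input_MOhm = rho_MOhm_um2 / area_um2         # R_in shrinks with area
    total_current_pA = n_synapses * i_syn_pA
    return total_current_pA * r_input_MOhm * 1e-3  # pA * MOhm = uV; convert to mV

# The same depolarisation results for small, medium, and large cells:
for area in (200.0, 1000.0, 5000.0):
    print(area, steady_state_depolarisation(area))
```

The exact cancellation here mirrors the homogeneous passive cable case; the quoted statements note it holds only approximately once realistic heterogeneities in diameter, topology, and active properties are included.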
“…We have shown that normalising afferent synaptic weights by their number (L0-normalisation), in a manner similar to real neurons (Cuntz et al, 2019), improves the learning performance of sparse artificial neural networks with a variety of different structures. Such dendritic normalisation constrains the weights and expected inputs to be within relatively tight bands, potentially making better use of available neuronal resources.…”
(mentioning; confidence: 99%)