2020 · Preprint
DOI: 10.1101/2020.11.10.350876

Learning function from structure in neuromorphic networks

Abstract: The connection patterns of neural circuits in the brain form a complex network. Collective signaling within the network manifests as patterned neural activity, and is thought to support human cognition and adaptive behavior. Recent technological advances permit macro-scale reconstructions of biological brain networks. These maps, termed connectomes, display multiple non-random architectural features, including heavy-tailed degree distributions, segregated communities and a densely interconnected core. Yet, how…
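The architectural features named in the abstract are standard graph measures. The sketch below is illustrative only, not the preprint's analysis: it uses networkx on a synthetic scale-free graph as a stand-in for an empirical connectome adjacency matrix to probe a heavy-tailed degree distribution, community structure, and a densely interconnected core.

```python
# Illustrative sketch (not the preprint's analysis): quantifying the
# architectural features named in the abstract on a synthetic graph that
# stands in for an empirical connectome adjacency matrix.
import numpy as np
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# Stand-in "connectome": a scale-free graph, so it has hub nodes by construction.
G = nx.barabasi_albert_graph(n=200, m=3, seed=0)

# Heavy-tailed degree distribution: inspect the degree sequence.
degrees = np.array([d for _, d in G.degree()])
print("mean degree:", degrees.mean(), "| max degree:", degrees.max())

# Segregated communities: modularity-based community detection.
communities = greedy_modularity_communities(G)
print("number of communities:", len(communities))

# Densely interconnected core: rich-club coefficient per degree threshold
# (in practice this is compared against degree-preserving null models).
rc = nx.rich_club_coefficient(G, normalized=False)
print("rich-club coefficient at k=10:", rc.get(10))
```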

Cited by 19 publications (19 citation statements) · References 165 publications
“…This points out once again to the importance of random wiring diagrams for ESNs' performance. This fact is as well in line with a recent study using human connectivity as reservoir of ESNs, which showed that random connectivity indeed achieved globally maximal performances across almost all tested hyperparameters, provided the wiring cost is not considered [24]. The functional importance of randomness is also consistent with the fact that stochastic processes play a fundamental role in brain connectivity formation, both at a micro and meso/macro-scale, as supported by empirical [25], and computational modeling studies [26,27].…”
Section: Discussion (supporting)
confidence: 89%
“…This points out once again to the importance of random wiring diagrams for ESNs’ performance. This fact is as well in line with a recent study using human connectivity as reservoir of ESNs, which showed that random connectivity indeed achieved globally maximal performances across almost all tested hyperparameters, provided the wiring cost is not considered [24].…”
Section: Discussion (supporting)
confidence: 86%
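To make the comparison in these citation statements concrete: an echo state network keeps its recurrent (reservoir) weights fixed and trains only a linear readout, so the reservoir can be seeded either with an empirical connectivity matrix or with a random matrix and the two can be compared on the same task. The sketch below is a simplified, numpy-only illustration; the "connectome" matrix, task, and hyperparameters are placeholders, not the data or settings used in the cited studies.

```python
# Minimal echo state network sketch: the reservoir's recurrent weights W can be
# either an empirical connectome adjacency matrix or a random matrix.
# Everything here (matrices, task, hyperparameters) is illustrative, not the
# setup used in the cited studies.
import numpy as np

rng = np.random.default_rng(42)

def make_reservoir(W, spectral_radius=0.9):
    """Rescale a connectivity matrix to a controlled spectral radius."""
    eigs = np.linalg.eigvals(W)
    return spectral_radius * W / np.max(np.abs(eigs))

def run_esn(W, u, leak=0.5, w_in_scale=1.0):
    """Drive the (fixed) reservoir with input u; return the state trajectory."""
    n = W.shape[0]
    w_in = w_in_scale * rng.uniform(-1, 1, size=n)
    x = np.zeros(n)
    states = []
    for t in range(len(u)):
        pre = W @ x + w_in * u[t]
        x = (1 - leak) * x + leak * np.tanh(pre)
        states.append(x.copy())
    return np.array(states)

def readout_mse(states, y, ridge=1e-6):
    """Fit a ridge-regression readout (the only trained part of an ESN)."""
    X = states
    w = np.linalg.solve(X.T @ X + ridge * np.eye(X.shape[1]), X.T @ y)
    return np.mean((X @ w - y) ** 2)

# Toy memory task: reconstruct the input delayed by 5 steps.
T, delay = 2000, 5
u = rng.uniform(-1, 1, size=T)
y = np.roll(u, delay)

n = 100
W_random = rng.uniform(-1, 1, size=(n, n))                      # random reservoir
W_structured = np.abs(W_random) * (rng.random((n, n)) < 0.1)    # placeholder for an empirical matrix

for name, W in [("random", W_random), ("structured (placeholder)", W_structured)]:
    states = run_esn(make_reservoir(W), u)
    print(name, "readout MSE:", readout_mse(states[delay:], y[delay:]))
```

In the cited work, performance differences between empirical and randomized wiring are assessed across sweeps of hyperparameters such as the spectral radius; the sketch fixes these values purely for brevity.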
“…Building on this work, we recently proposed that the functional network architecture of the brain can be used to build network coding models -models of brain function that describe the encoding and decoding of task-relevant brain activity constrained by connectivity 49 . Related proposals have also been put forward in the electron microscopy connectomics literature, suggesting that structural wiring diagrams of the brain can inform functional models of biological systems (e.g., the drosophila's visual system or the human brain's intrinsic memory capacity) 47,50,51 . In addition, work in mean-field network models have revealed a direct link between connectivity and computations, finding that low-dimensional connectivity patterns (which also exist in fMRI data 52 ) are useful for performing tasks 53 .…”
Section: Discussion (mentioning)
confidence: 99%
“…In sum, the network topology pertaining to a plethora of BNNs are in contrast to the handcrafted engineering-driven architecture of ANNs. A recent study examines the effect of constructing RNNs with the empirically discerned topology of the human brain network (Suarez et al, 2020). This study, however, examines a different class of RNNs (echo state networks) than we examine in our approach (Elman networks), and in addition, echo state networks are trained with a different algorithm as the one that is used here (backpropagation-through-time).…”
Section: Introduction (mentioning)
confidence: 99%
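For readers unfamiliar with the distinction drawn in this last statement: in an echo state network the recurrent weights stay fixed and only a readout is fit, whereas in an Elman network the recurrent weights themselves are optimized with backpropagation-through-time (BPTT). The sketch below shows the Elman/BPTT side in PyTorch; the task, network size, and training settings are toy placeholders, not those of the cited study.

```python
# Illustrative Elman network trained with backpropagation-through-time (BPTT),
# in contrast to an echo state network, where only the readout is trained.
# Task and hyperparameters are toy placeholders.
import torch
import torch.nn as nn

torch.manual_seed(0)

class ElmanNet(nn.Module):
    def __init__(self, n_in=1, n_hidden=64, n_out=1):
        super().__init__()
        # nn.RNN with a tanh nonlinearity is an Elman-style recurrent layer.
        self.rnn = nn.RNN(n_in, n_hidden, nonlinearity="tanh", batch_first=True)
        self.readout = nn.Linear(n_hidden, n_out)

    def forward(self, u):
        h, _ = self.rnn(u)          # (batch, time, hidden)
        return self.readout(h)      # (batch, time, out)

# Toy memory task: output the input delayed by 5 steps.
delay, T, batch = 5, 200, 32
u = torch.rand(batch, T, 1) * 2 - 1
y = torch.roll(u, shifts=delay, dims=1)

model = ElmanNet()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for step in range(200):
    opt.zero_grad()
    pred = model(u)
    loss = loss_fn(pred[:, delay:], y[:, delay:])
    loss.backward()   # gradients flow backward through time: this is BPTT
    opt.step()

print("final training MSE:", loss.item())
```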