2016
DOI: 10.1103/PhysRevX.6.021023

Effect of Heterogeneity on Decorrelation Mechanisms in Spiking Neural Networks: A Neuromorphic-Hardware Study

Abstract: High-level brain function such as memory, classification or reasoning can be realized by means of recurrent networks of simplified model neurons. Analog neuromorphic hardware constitutes a fast and energy efficient substrate for the implementation of such neural computing architectures in technical applications and neuroscientific research. The functional performance of neural networks is often critically dependent on the level of correlations in the neural activity. In finite networks, correlations are typica…

Cited by 23 publications (28 citation statements)
References 109 publications (201 reference statements)
“…An extension to networks of leaky integrate-and-fire neurons and a theoretical framework for their dynamics and statistics followed in Petrovici et al (2013) and Petrovici et al (2016). The compensation of shared-input correlations through inhibitory feedback and learning was discussed in Bytschok et al (2017), Jordan et al (2017), and Dold et al (2019), inspired by the early study of asynchronous irregular firing in Brunel (2000) and by preceding correlation studies in theoretical (Tetzlaff et al, 2012) and experimental (Pfeil et al, 2016) work.…”
Section: Discussion (mentioning)
confidence: 99%
“…It solely relies on the capability of emulating recurrent neural networks, the functionality most neuromorphic-hardware systems are designed for. On the analog neuromorphic system Spikey [67], for example, it has already been demonstrated that decorrelation by inhibitory feedback is effective and robust, despite large heterogeneity in neuron and synapse parameters and without the need for time-consuming calibrations [68]. While a full neuromorphic-hardware implementation of the framework proposed here is still pending, the demonstration on Spikey shows that our solution is immediately implementable and feasible.…”
Section: Discussion (mentioning)
confidence: 99%
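The decorrelation mechanism referenced in this statement, suppression of shared-input correlations by inhibitory feedback, can be illustrated with a toy model. The sketch below uses a generic linear rate model with shared and private noise and a global negative-feedback loop; this model and all of its parameters are illustrative assumptions, not the leaky integrate-and-fire networks or the Spikey hardware of the cited works.

# Minimal sketch (illustrative assumption, not the hardware experiment):
# correlation suppression by negative population feedback in a linear rate model.
import numpy as np

rng = np.random.default_rng(0)

def simulate(feedback_gain, n_units=50, n_steps=20000, dt=0.1, tau=10.0):
    # Euler integration of n_units leaky rate units. Each unit receives one
    # input shared by all units (the source of correlations) plus private
    # noise; a global loop subtracts feedback_gain * population mean,
    # mimicking inhibitory feedback.
    x = np.zeros(n_units)
    traces = np.empty((n_steps, n_units))
    for t in range(n_steps):
        shared = rng.normal()
        private = rng.normal(size=n_units)
        drive = shared + private - feedback_gain * x.mean()
        x = x + dt / tau * (-x + drive)
        traces[t] = x
    return traces[n_steps // 10:]          # discard initial transient

def mean_pairwise_correlation(traces):
    c = np.corrcoef(traces.T)              # units as variables
    return c[~np.eye(c.shape[0], dtype=bool)].mean()

for gain in (0.0, 20.0):
    r = mean_pairwise_correlation(simulate(feedback_gain=gain))
    print(f"feedback gain {gain:4.1f}: mean pairwise correlation {r:.3f}")

With the feedback gain at zero, the shared input drives sizable pairwise correlations; with a strong gain, the feedback cancels the common fluctuation and the measured mean correlation drops substantially.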
“…For example, performing first the transformation U_X yields ρ′_2 = −ρ_2, ρ′_3 = −ρ_3. Applying subsequently the transformation U_31, ρ″_1 = −ρ′_3 = ρ_3, ρ″_3 = ρ′_1 = ρ_1, ρ″_2 = ρ′_2 = −ρ_2, results in the transformation (12), as reflected by the corresponding matrix multiplication yielding Eq. (11),…”
Section: Quantum Operations (mentioning)
confidence: 99%
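A quick numerical check of the composition quoted above: writing the two transformations as 3×3 matrices acting on the component vector (ρ_1, ρ_2, ρ_3), their product reproduces the stated combined map. The matrix forms below are assumptions chosen to match the quoted component maps; they are not Eqs. (11) or (12) of the cited work.

# Check that applying U_X first and then U_31 yields the quoted combined map.
import numpy as np

U_X = np.diag([1, -1, -1])                 # rho_2 -> -rho_2, rho_3 -> -rho_3
U_31 = np.array([[0, 0, -1],               # rho_1 -> -rho_3
                 [0, 1,  0],               # rho_2 ->  rho_2
                 [1, 0,  0]])              # rho_3 ->  rho_1

combined = U_31 @ U_X                      # U_X applied first, then U_31
expected = np.array([[0,  0, 1],           # rho_1'' =  rho_3
                     [0, -1, 0],           # rho_2'' = -rho_2
                     [1,  0, 0]])          # rho_3'' =  rho_1
assert np.array_equal(combined, expected)
print(combined)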
“…It is straightforward to show that the CNOT-gate cannot be realized on the operator level. A realization on the operator level would require relations of the type (39) on the level of the classical operators B^(µν) and A^(µν), which amounts to [S, A^(30)] = [S, A^(01)] = [S, A^(31)] = 0, S A^(11) = A^(10) S, S A^(10) = A^(11) S, S A^(21) = A^(20) S, S A^(20) = A^(21) S, S A^(33) = A^(03) S, S A^(03) = A^(33) S, S A^(32) = A^(02) S, S A^(02) = A^(32) S, S A^(22) = −A^(13) S, S A^(13) = −A^(22) S, S A^(12) = A^(23) S, S A^(23) = A^(12) S.…”
Section: CNOT-Gate in Probabilistic Computing (mentioning)
confidence: 99%
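The relations quoted above all have the form S A^(µν) = ±A^(µ′ν′) S. The classical operators A^(µν), B^(µν) and the step operator S of the cited work are not reconstructed here; the sketch below instead verifies the analogous quantum statement, CNOT (σ_µ ⊗ σ_ν) CNOT = ±σ_µ′ ⊗ σ_ν′, assuming indices 0..3 denote I, X, Y, Z with the first index acting on the control qubit. This index convention is an assumption for illustration only.

# Verify the quantum-CNOT analogue of the quoted operator relations.
import numpy as np

I = np.eye(2)
X = np.array([[0, 1], [1, 0]])
Y = np.array([[0, -1j], [1j, 0]])
Z = np.array([[1, 0], [0, -1]])
sigma = [I, X, Y, Z]

CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)   # control = first qubit

def A(mu, nu):
    # Pauli product: sigma_mu on the control qubit, sigma_nu on the target.
    return np.kron(sigma[mu], sigma[nu])

# (mu, nu) -> (sign, mu', nu') as listed in the quoted relations;
# entries mapping to themselves encode the three commutator relations.
relations = {
    (3, 0): (+1, 3, 0), (0, 1): (+1, 0, 1), (3, 1): (+1, 3, 1),
    (1, 1): (+1, 1, 0), (1, 0): (+1, 1, 1),
    (2, 1): (+1, 2, 0), (2, 0): (+1, 2, 1),
    (3, 3): (+1, 0, 3), (0, 3): (+1, 3, 3),
    (3, 2): (+1, 0, 2), (0, 2): (+1, 3, 2),
    (2, 2): (-1, 1, 3), (1, 3): (-1, 2, 2),
    (1, 2): (+1, 2, 3), (2, 3): (+1, 1, 2),
}

for (mu, nu), (sign, mup, nup) in relations.items():
    assert np.allclose(CNOT @ A(mu, nu), sign * A(mup, nup) @ CNOT), (mu, nu)
print("all quoted relations hold for the quantum CNOT analogue")

If any listed relation were transcribed with a wrong sign or index, the corresponding assertion would fail, which makes this a convenient transcription check.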