2020
DOI: 10.1007/s11265-020-01558-7

Verification and Design Methods for the BrainScaleS Neuromorphic Hardware System

Abstract: This paper presents verification and implementation methods that have been developed for the design of the BrainScaleS-2 65 nm ASICs. The 2nd generation BrainScaleS chips are mixed-signal devices with tight coupling between full-custom analog neuromorphic circuits and two general purpose microprocessors (PPU) with SIMD extension for on-chip learning and plasticity. Simulation methods for automated analysis and pre-tapeout calibration of the highly parameterizable analog neuron and synapse circuits and for hard…

Cited by 33 publications (31 citation statements). References 36 publications.
“…PyNN officially supports BRIAN, NEST and NEURON SNN simulation tools. It is also supported on SpiNNaker [51] and BrainScaleS-2 [52] neuromorphic hardware systems. There are several more simulation tools which work with PyNN.…”
Section: SNN Simulation Tools and Hardware Accelerators (mentioning)
confidence: 99%
“…The BrainScaleS-2 [52] is a mixed-signal accelerated neuromorphic system with analog neural core, digital connectivity along with embedded SIMD microprocessor. It is efficient for emulations of neurons, synapses, plasticity models etc.…”
Section: SNN Simulation Tools and Hardware Accelerators (mentioning)
confidence: 99%
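
The two statements above point out that a single PyNN network description can target software simulators (Brian, NEST, NEURON) as well as the SpiNNaker and BrainScaleS-2 hardware systems. As a rough, hedged illustration of what such a backend-agnostic script looks like, the sketch below uses the pyNN.nest software backend; the neuron parameters, population sizes and rates are assumptions for illustration, and hardware backends require their own installation and setup.

```python
# Minimal PyNN sketch (illustrative only): the same network description
# can, in principle, run on another supported backend by swapping the
# imported simulator module. Assumes the NEST backend is installed.
import pyNN.nest as sim  # e.g. pyNN.neuron, or a hardware backend module

sim.setup(timestep=0.1)  # ms

# Leaky integrate-and-fire neurons with conductance-based synapses
# (parameter values are placeholders, not taken from the cited papers).
pop = sim.Population(10, sim.IF_cond_exp(tau_m=20.0, v_thresh=-50.0))

# Poisson background input projected one-to-one with static synapses.
noise = sim.Population(10, sim.SpikeSourcePoisson(rate=20.0))
sim.Projection(noise, pop, sim.OneToOneConnector(),
               synapse_type=sim.StaticSynapse(weight=0.01))

pop.record("spikes")
sim.run(1000.0)  # ms
spikes = pop.get_data("spikes")
sim.end()
```

Retargeting the script to a different simulator or to neuromorphic hardware is, in principle, a matter of importing a different backend module, which is the portability property the citing papers refer to.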
“…In machine learning, algorithmic top-down analysis of the gradient descent demonstrated how local eligibility traces at synapses allow networks to reach performances comparable to error back-propagation algorithm on complex tasks [17]- [19]. Examples of neuromorphic platforms that implement these types of eligibility traces in spiking neural networks already exist [20]- [22]. However, learning in these platforms is only supported through the use of von-Neumann processors, either shared with the computation of network dynamics [21] or a dedicated core [20], [22].…”
Section: Introduction (mentioning)
confidence: 99%
“…Examples of neuromorphic platforms that implement these types of eligibility traces in spiking neural networks already exist [20]- [22]. However, learning in these platforms is only supported through the use of von-Neumann processors, either shared with the computation of network dynamics [21] or a dedicated core [20], [22]. Relying on numerical integration, these platforms do not leverage the physics of their computing substrate and are not free from the von-Neumann bottleneck problem [23], [24].…”
Section: Introduction (mentioning)
confidence: 99%
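
The statements above describe eligibility traces as they appear in gradient-based learning rules for spiking networks: a slowly decaying, per-synapse quantity built from local pre- and postsynaptic activity, later combined with a (possibly delayed) learning signal to produce the weight update. The NumPy toy below is only a sketch of that general idea under invented assumptions (random spike trains, a Gaussian learning signal, arbitrary constants); it is not the update rule of BrainScaleS-2 or of any platform cited in [20]-[22].

```python
# Toy eligibility-trace update (illustrative only).
import numpy as np

rng = np.random.default_rng(0)
n_pre, n_post = 8, 4
n_steps = 100
tau_e = 20.0          # eligibility trace decay time constant (in steps)
lr = 1e-3             # learning rate

w = rng.normal(0.0, 0.1, size=(n_post, n_pre))   # synaptic weights
elig = np.zeros_like(w)                          # per-synapse eligibility traces

for t in range(n_steps):
    pre = (rng.random(n_pre) < 0.1).astype(float)    # presynaptic spikes (toy)
    post = (rng.random(n_post) < 0.1).astype(float)  # postsynaptic spikes (toy)

    # Local term: pre/post coincidences accumulate into a slowly
    # decaying trace stored at each synapse.
    elig += -elig / tau_e + np.outer(post, pre)

    # A separate, possibly delayed learning signal (reward, error, ...)
    # gates the actual weight change.
    learning_signal = rng.normal(0.0, 1.0, size=n_post)
    w += lr * learning_signal[:, None] * elig
```

The point of the trace is that correlation information is stored locally at each synapse, so the weight change can be applied whenever the learning signal arrives, without replaying the network's past activity.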
“…In current technological implementations it is already noticed that the advantages of using central synchronization are lost, so recently the idea of asynchronous operation was proposed [30]. In our current technologies, the dispersion is not negligible even within processors, and especially not in computing systems, from the several centimeter long buses in our PCs [11] through the wafer-scale systems [31] to the hundred meter long cables in supercomputers [13].…”
(mentioning)
confidence: 99%