Proceedings of the 2010 IEEE International Symposium on Circuits and Systems (ISCAS 2010)
DOI: 10.1109/iscas.2010.5536970

A wafer-scale neuromorphic hardware system for large-scale neural modeling

Abstract: Modeling neural tissue is an important tool for investigating biological neural networks. Until recently, most of this modeling has been done using numerical methods. In the European research project "FACETS", this computational approach is complemented by different kinds of neuromorphic systems. Special emphasis is placed on the usability of these systems for neuroscience. To accomplish this goal, an integrated software/hardware framework has been developed, centered around a unified neural system description…
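
In the FACETS framework, the role of such a unified description is played by the simulator-independent PyNN API. Purely as an illustration (not code from the paper), the minimal sketch below shows what a backend-agnostic network description can look like in PyNN; the backend import, neuron model, and all parameter values are assumptions chosen for readability, and a hardware backend would be substituted for the software simulator.

# Minimal sketch, assuming the PyNN >= 0.9 API with a software backend;
# the neuromorphic hardware would be addressed through its own PyNN backend module.
import pyNN.nest as sim          # backend choice is an assumption for illustration

sim.setup(timestep=0.1)          # simulation time step in ms

# Conductance-based integrate-and-fire neurons (parameter values are illustrative)
exc = sim.Population(100, sim.IF_cond_exp(tau_m=20.0, v_thresh=-50.0))
noise = sim.Population(100, sim.SpikeSourcePoisson(rate=10.0))

# Sparse random input connectivity with static synapses
sim.Projection(noise, exc,
               sim.FixedProbabilityConnector(p_connect=0.1),
               synapse_type=sim.StaticSynapse(weight=0.002, delay=1.0))

exc.record('spikes')
sim.run(1000.0)                  # 1 s of biological time
spikes = exc.get_data('spikes')  # retrieve recorded spike trains
sim.end()

Because the description is backend-agnostic, the same script can in principle target either a numerical simulator or the neuromorphic hardware, which is the usability goal the abstract refers to.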

Cited by 578 publications (509 citation statements)
References 11 publications
“…Finally, only a small number of connections between the transition neurons and the state populations need to be specified or learned to achieve a desired functionality. This property can be useful for framing the design of learning algorithms and for reducing the number of on-chip plastic synapses required to implement autonomous learning of behaviors (7,49,50). The complexity of a learning problem is proportional to the dimensionality of the to-be-learned parameters (51).…”
Section: Discussion
confidence: 99%
“…So far, research emphasis has been on developing large-scale neural-like electronic systems (6)(7)(8). However, the concepts and methods for installing the dynamics necessary to express cognitive behaviors on these substrates are relatively poorly developed, mainly because the deep question of how biological brains install cognition on their neural networks remains open.…”
confidence: 99%
“…Traditionally, electronic techniques have been mainly used for neuromorphic systems, already yielding platforms such as Neurogrid at Stanford [2], TrueNorth at IBM [3], HICANN at the University of Heidelberg [4] and the University of Manchester's neuromorphic chip. Electronic techniques, however, face important challenges, e.g. limited bandwidth, large multicasting and communication issues, which ultimately impose performance limits [6].…”
confidence: 99%
“…Since Spikey also supports on-chip short-term plasticity (STP) and spike-timing dependent synaptic plasticity (STDP), it is predestined for high-performance spike-based computing. Spikey is the predecessor of a large, wafer-scale system currently being developed in Heidelberg [29], which is aimed at large-scale neuromorphic computing for accelerated biological simulation and data mining. Figure 1 shows a schematic overview of the Spikey system.…”
Section: Methodological Background: The Spikey Neuromorphic Hardware System
confidence: 99%
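
The citation context above notes Spikey's on-chip STP and STDP support. Purely as a hedged illustration (not taken from the cited papers), the sketch below shows how an STDP synapse can be configured in a PyNN-style description, assuming the PyNN >= 0.8 plasticity API; on hardware such as Spikey the available rules and parameter ranges are constrained by the chip, so all values here are placeholders.

import pyNN.nest as sim          # software backend assumed for illustration

sim.setup(timestep=0.1)

# One spike source driving one conductance-based neuron
pre  = sim.Population(1, sim.SpikeSourceArray(spike_times=[10.0, 30.0, 50.0]))
post = sim.Population(1, sim.IF_cond_exp())

# Pair-based STDP with additive weight dependence (illustrative parameters)
stdp = sim.STDPMechanism(
    timing_dependence=sim.SpikePairRule(tau_plus=20.0, tau_minus=20.0,
                                        A_plus=0.01, A_minus=0.012),
    weight_dependence=sim.AdditiveWeightDependence(w_min=0.0, w_max=0.01),
    weight=0.005, delay=1.0)

proj = sim.Projection(pre, post, sim.AllToAllConnector(), synapse_type=stdp)
sim.run(100.0)
print(proj.get('weight', format='list'))   # inspect the adapted weight
sim.end()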
“…Several platforms for neuromorphic computing have emerged, starting with the pioneering work of Carver Mead [23], up to rather recent developments like e.g. SpiNNaker [24], NeuroGrid [25], ROLLS [26], IBM's TrueNorth [27], or the systems developed at the University of Heidelberg [28], [29] (see [30] for a review). Aside from testing principles of neural computation [28], [31], [32], initial applications of neuromorphic hardware have been demonstrated for generic pattern recognition [27], [33].…”
Section: Introduction
confidence: 99%