2009
DOI: 10.1162/neco.2009.08-07-599

Classification of Correlated Patterns with a Configurable Analog VLSI Neural Network of Spiking Neurons and Self-Regulating Plastic Synapses

Abstract: We describe the implementation and illustrate the learning performance of an analog VLSI network of 32 integrate-and-fire neurons with spike-frequency adaptation and 2016 Hebbian bistable spike-driven stochastic synapses, endowed with a self-regulating plasticity mechanism, which avoids unnecessary synaptic changes. The synaptic matrix can be flexibly configured and provides both recurrent and external connectivity with address-event representation compliant devices. We demonstrate a marked improvement in the …
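As a rough illustration of the neuron model named in the abstract, here is a minimal sketch of a leaky integrate-and-fire neuron with spike-frequency adaptation. All parameter names and values are illustrative assumptions, not taken from the chip described in the paper.

```python
import numpy as np

# Minimal leaky integrate-and-fire neuron with spike-frequency adaptation.
# All constants below are illustrative assumptions, not chip parameters.
DT = 1e-4           # simulation step (s)
TAU_M = 20e-3       # membrane time constant (s)
V_THRESH = 1.0      # firing threshold (arbitrary units)
V_RESET = 0.0       # reset potential
TAU_ADAPT = 100e-3  # adaptation time constant (s)
G_ADAPT = 10.0      # adaptation increment per spike (same units as the input drive)

def simulate(input_current, dt=DT):
    """Return spike times; firing slows under constant drive as adaptation builds."""
    v, adapt, spikes = 0.0, 0.0, []
    for step, i_in in enumerate(input_current):
        adapt -= adapt / TAU_ADAPT * dt          # adaptation decays between spikes
        v += (-v / TAU_M + (i_in - adapt)) * dt  # leaky integration of the net drive
        if v >= V_THRESH:
            spikes.append(step * dt)
            v = V_RESET
            adapt += G_ADAPT                     # each spike strengthens adaptation
    return spikes

# Constant drive: inter-spike intervals lengthen over the run as adaptation builds up.
print(simulate(np.full(int(0.5 / DT), 80.0))[:5])
```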

Cited by 24 publications (18 citation statements)
References 22 publications (29 reference statements)
“…Finally, only a small number of connections between the transition neurons and the state populations need to be specified or learned to achieve a desired functionality. This property can be useful for framing the design of learning algorithms and for reducing the number of on-chip plastic synapses required to implement autonomous learning of behaviors (7,49,50). The complexity of a learning problem is proportional to the dimensionality of the to-be-learned parameters (51).…”
Section: Discussion (mentioning)
confidence: 99%
“…A modified version of the STDP rule is implemented in analog VLSI on the plastic synapses of neuromorphic chips (1, 2) used in this experiment (Giulioni et al, 2009; Mitra et al, 2009). The synaptic update rule (Brader et al, 2007) adjusts the synaptic weight, or efficacy, X upon arrival of a pre-synaptic spike, depending on the instantaneous membrane potential and the internal state of the post-synaptic neuron (Fusi, 2003; Brader et al, 2007).…”
Section: Methods (mentioning)
confidence: 99%
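The excerpt above describes the update rule only qualitatively: on each pre-synaptic spike, an internal synaptic variable X jumps up or down depending on the instantaneous post-synaptic membrane potential, learning is gated by the post-synaptic neuron's internal state, and between spikes X drifts toward one of two stable values. Below is a minimal sketch of such a rule; the names, thresholds, and jump sizes are illustrative assumptions, not the values used by Brader et al. (2007) or on the chip.

```python
# Hypothetical sketch of a spike-driven, bistable synapse of the kind described above.
X_MAX, X_THETA = 1.0, 0.5      # bounds of the internal variable and bistability threshold
A_UP, B_DOWN = 0.1, 0.1        # up/down jumps applied on a pre-synaptic spike
ALPHA, BETA = 3.5, 3.5         # drift rates toward the two stable states (1/s)
V_THETA = 0.8                  # threshold on the post-synaptic membrane potential
CA_LOW, CA_HIGH = 1.0, 10.0    # "stop-learning" window on a post-synaptic activity readout

def on_pre_spike(x, v_post, ca_post):
    """Update the internal variable x when a pre-synaptic spike arrives."""
    if CA_LOW <= ca_post <= CA_HIGH:        # self-regulation: change only when useful
        if v_post > V_THETA:
            x = min(X_MAX, x + A_UP)        # candidate potentiation
        else:
            x = max(0.0, x - B_DOWN)        # candidate depression
    return x

def drift(x, dt):
    """Between pre-synaptic spikes, x relaxes toward 0 or X_MAX (bistability)."""
    return min(X_MAX, x + ALPHA * dt) if x > X_THETA else max(0.0, x - BETA * dt)

def efficacy(x):
    """Binary synaptic weight seen by the post-synaptic neuron."""
    return 1.0 if x > X_THETA else 0.0
```

In this kind of scheme a candidate transition only becomes a long-term change if x crosses the bistability threshold before the drift pulls it back, which is what makes the stochastic, bistable synapse robust to spurious updates.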
“…The neuromorphic engineering community has been building physical models of sWTA networks [109], [110], [111], attractor networks [112], [113], and plasticity mechanisms [91] that cover the full range of temporal and spatial scales described in Section IV-A for many years. For example, several circuit solutions have been proposed to implement short-term plasticity dynamics, using different types of devices and following a wide range of design techniques [114], [115], [116], [117], [118], [119]; a large set of spike-based learning circuits have been proposed to model long-term plasticity [120], [121], [122], [123], [77], [124], [125], [126], [127], [128], [91]; multiple solutions have been proposed for implementing homeostatic plasticity mechanisms [129], [130]; impressive demonstrations have been made showing the properties of VLSI attractor networks [112], [113], [23], [4]; while structural plasticity has been implemented both at the single chip level, with morphology learning mechanisms for dendritic trees [131] and at the system level, in multi-chip systems that transmit spikes using the AER protocol, by reprogramming or "evolving" the network connectivity routing tables stored in the digital communication infrastructure memory banks [132], [133]. While some of these principles and circuits have been adopted in the deep network implementations of Section II and in the large-scale neural network implementations of Section III, many of them still remain to be exploited, at the system and application level, for endowing neuromorphic systems with additional powerful computational primitives.…”
Section: Neuromorphic Circuit Implementations (mentioning)
confidence: 99%