2017 International Joint Conference on Neural Networks (IJCNN)
DOI: 10.1109/ijcnn.2017.7966447
A software-equivalent SNN hardware using RRAM-array for asynchronous real-time learning

Abstract: Spiking Neural Networks (SNNs) naturally inspire hardware implementation, as they are based on biology. For learning, spike-timing-dependent plasticity (STDP) may be implemented using energy-efficient waveform superposition on memristor-based synapses. However, system-level implementation faces three challenges. First, a classic dilemma is that recognition requires current reading with short voltage spikes, which is disturbed by the large voltage waveforms simultaneously applied to the same memristor for r…
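For background on the learning rule the abstract refers to, a minimal software sketch of the standard pair-based STDP update is given below. The parameter values are illustrative assumptions; this models the abstract rule only, not the paper's analog waveform-superposition circuit on memristors:

```python
import numpy as np

def stdp_update(w, t_pre, t_post, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Pair-based STDP: potentiate when the presynaptic spike precedes
    the postsynaptic spike, depress otherwise. Times are in ms; all
    constants here are illustrative, not taken from the paper."""
    dt = t_post - t_pre
    if dt > 0:                               # pre before post -> potentiation
        dw = a_plus * np.exp(-dt / tau)
    else:                                    # post before (or with) pre -> depression
        dw = -a_minus * np.exp(dt / tau)
    return float(np.clip(w + dw, 0.0, 1.0)) # keep the conductance-like weight bounded

w = stdp_update(0.5, t_pre=10.0, t_post=15.0)  # causal pair: weight increases
```

The exponential window means spike pairs far apart in time barely change the weight, which is what makes waveform-superposition implementations attractive: the overlap of two decaying waveforms realizes the same timing dependence in analog hardware.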

Cited by 20 publications (7 citation statements) · References 19 publications
“…Analog waveform shaping involves charging/discharging large capacitors followed by signal buffering circuits as drivers [22], [27], [33]. For e.g.…”
Section: A. Benefits for Analog Implementation
confidence: 99%
“…The network demonstrates state-of-the-art learning performance with recognition accuracy of 96% (Supplementary Information 2). The hardware translation of this SNN is explored extensively in the literature [25][26][27] . The area and energy/spike of the neuron is benchmarked with literature in Table 1.…”
Section: Performance and Benchmarking
confidence: 99%
“…For pattern recognition problems, various SNN architectures are used, which differ in the number of layers and in the way neurons are connected [14,16]. One of the simplest SNN architectures is a two-layer network (Figure 6a), where image information is supplied to the input (first layer), and one of the neurons associated with a certain class of images is activated at the output (second layer) [42][43][44]. Each of the first layer neurons is connected to each neuron of the second layer through excitatory connections.…”
Section: SNN Architecture
confidence: 99%
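The two-layer, fully connected architecture described in the quoted passage can be sketched as a single matrix of excitatory weights with a winner-take-all readout. The dimensions (784 pixel inputs, 10 class neurons) and the random, untrained weights below are assumptions for illustration, not the cited papers' trained networks:

```python
import numpy as np

rng = np.random.default_rng(0)

n_in, n_out = 784, 10                        # assumed: pixel inputs, one neuron per class
W = rng.uniform(0.0, 1.0, (n_out, n_in))     # excitatory connections, every input to every output

def classify(input_spikes):
    """One feed-forward pass of a two-layer SNN: each second-layer
    neuron integrates its excitatory drive, and the most strongly
    driven neuron (winner-take-all) labels the image."""
    drive = W @ input_spikes                 # total excitatory input per output neuron
    return int(np.argmax(drive))             # index of the winning class neuron

x = (rng.random(n_in) < 0.1).astype(float)   # sparse binary spike pattern for one time step
label = classify(x)
```

In a real spiking simulation the drive would be integrated over time and the winner would be the first neuron to reach threshold; the argmax stands in for that race here.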
“…A large number of information coding methods for SNNs have been defined: rate coding, rank coding, time to first spike, latency coding, phase coding, population coding, and others [37]. Typically, two-layer neural networks used to classify images apply the rate coding method [43,45,46]. However, in the current study, we use the time-to-first-spike method [37].…”
Section: SNN Training
confidence: 99%
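A minimal sketch of the time-to-first-spike encoding named in the quoted passage, under the common convention that higher input intensity maps to an earlier spike. The linear mapping and the `t_max` window are assumptions for illustration, not details from the cited study:

```python
import numpy as np

def time_to_first_spike(intensity, t_max=100.0):
    """Encode intensities in [0, 1] as spike latencies: brighter
    inputs fire earlier; zero-intensity inputs never fire (np.inf).
    t_max is the assumed length of the encoding window in ms."""
    intensity = np.asarray(intensity, dtype=float)
    latency = np.full(intensity.shape, np.inf)   # default: no spike
    active = intensity > 0
    latency[active] = t_max * (1.0 - intensity[active])
    return latency

lat = time_to_first_spike([1.0, 0.5, 0.0])
# full-intensity input fires at t=0, half-intensity at t=50, dark input never fires
```

Unlike rate coding, each neuron emits at most one spike per stimulus, which is why time-to-first-spike schemes are often preferred for low-energy hardware.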