2022
DOI: 10.1002/aisy.202200034
Pattern Training, Inference, and Regeneration Demonstration Using On‐Chip Trainable Neuromorphic Chips for Spiking Restricted Boltzmann Machine

Abstract: A fully silicon‐integrated restricted Boltzmann machine (RBM) with an event‐driven contrastive divergence (eCD) training algorithm is implemented using novel stochastic leaky integrate‐and‐fire (LIF) neuron circuits and six‐transistor/2‐PCM‐resistor (6T2R) synaptic unit cells on 90 nm CMOS technology. To elaborate, a bidirectional, asynchronous, and parallel pulse‐signaling scheme is designed over an analog‐weighted phase‐change memory (PCM) synapse array to enable spike‐timing‐dependent plasticity (STDP) as a lo…
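The STDP mechanism the abstract refers to can be illustrated with a generic pair-based update rule; this is a minimal sketch under common textbook assumptions, not the paper's circuit-level implementation, and all parameter names and values (`a_plus`, `a_minus`, `tau`) are illustrative:

```python
import math

# Hedged sketch: a generic pair-based STDP weight update. This is NOT the
# paper's circuit-level rule; parameters are illustrative assumptions.
def stdp_delta_w(t_pre, t_post, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Return the weight change for one pre/post spike pair (times in ms)."""
    dt = t_post - t_pre
    if dt >= 0:
        # Pre fires before post: potentiation, decaying with the time gap.
        return a_plus * math.exp(-dt / tau)
    # Post fires before pre: depression.
    return -a_minus * math.exp(dt / tau)

# Causal pairing strengthens the synapse; anti-causal pairing weakens it.
print(stdp_delta_w(10.0, 15.0) > 0)   # True
print(stdp_delta_w(15.0, 10.0) < 0)   # True
```

The asymmetric window (potentiation for causal spike order, depression otherwise) is the property the 6T2R cell's bidirectional pulse scheme is designed to realize in analog PCM conductance.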

Cited by 4 publications (4 citation statements)
References 37 publications
“…supported the detailed experiment result of the previous study [ 221 ] and successfully performed the MNIST test and image reconstruction based on on‐chip training. [ 264 ]…”
Section: System‐level Realization
confidence: 99%
“…Shin et al supported the detailed experiment result of the previous study [221] and successfully performed the MNIST test and image reconstruction based on on-chip training. [264] Figure 15. a) Two passive synapse arrays and OPAMP neurons are integrated on PCB separately.…”
Section: Fully Hardware-integrated On-chip
confidence: 99%
“…An SNN closely mimics a biological neural network, and unlike a deep neural network (DNN), it contains temporal information. Thus, the SNN can imitate time-dependent biological neural reactions, such as leaky integrate-and-fire (LIF) or local learning rules including STDP. [40] Additionally, because SNN information is delivered by spikes and an output is generated only when the internal threshold value is overcome, the SNN exhibits a sparse network state, which has the advantage of low-power consumption in hardware implementation. [41] The RBM is a neural network that simplifies the learning computation of Boltzmann machines.…”
confidence: 99%
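The LIF behavior the citing work describes (leaky integration of incoming spikes, firing once the membrane potential overcomes an internal threshold, and the resulting sparse output) can be sketched in a few lines; all parameters here (`leak`, `threshold`, `v_reset`, `w`) are illustrative assumptions, not values from the paper:

```python
# Hedged sketch of a discrete-time leaky integrate-and-fire (LIF) neuron.
# Parameters (leak, threshold, reset) are illustrative, not from the paper.
def lif_run(input_spikes, leak=0.9, threshold=1.0, v_reset=0.0, w=0.5):
    """Integrate weighted input spikes; emit 1 whenever v crosses threshold."""
    v = 0.0
    out = []
    for s in input_spikes:
        v = leak * v + w * s          # leaky integration of the input spike
        if v >= threshold:            # fire when the threshold is overcome
            out.append(1)
            v = v_reset               # reset the membrane potential
        else:
            out.append(0)
    return out

# A dense input train yields a sparse output train.
print(lif_run([1, 1, 1, 0, 0, 1, 1, 1]))  # → [0, 0, 1, 0, 0, 0, 0, 1]
```

The sparsity is visible directly: eight input spikes produce only two output spikes, which is the low-activity property the citation credits for low-power hardware operation.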