2015 IEEE International Electron Devices Meeting (IEDM)
DOI: 10.1109/iedm.2015.7409716

NVM neuromorphic core with 64k-cell (256-by-256) phase change memory synaptic array with on-chip neuron circuits for continuous in-situ learning

Cited by 169 publications (104 citation statements)
References 7 publications
“…By using the dummy weight, we can make the absolute values of β± identical (β = 0), and we define Also, according to Eqs. (12) and (13), the absolute values of θ+ and θ− can be the same, and θ+ + θ− = 0. Therefore, Eq.…”
Section: Time-domain Weighted-sum Calculation With Different-signed Weights
confidence: 93%
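The dummy-weight technique quoted above rests on representing each signed weight as a pair of non-negative quantities, so that equal positive and negative components cancel. A minimal numerical sketch, assuming a conductance-pair encoding (G+, G−) of the kind commonly used with PCM synaptic arrays; the function name and all values here are illustrative, not taken from the cited paper:

```python
# Sketch: signed weights realized as non-negative conductance pairs.
# Each effective weight is w_i = g_pos[i] - g_neg[i], with both >= 0.

def signed_weighted_sum(x, g_pos, g_neg):
    """Weighted sum over inputs x with signed weights stored as
    (g_pos, g_neg) conductance pairs (device conductances >= 0)."""
    assert len(x) == len(g_pos) == len(g_neg)
    return sum(xi * (gp - gn) for xi, gp, gn in zip(x, g_pos, g_neg))

# A negative weight -0.3 is stored as (0.0, 0.3); an equal pair such as
# (0.1, 0.1) contributes nothing and plays the role of a dummy weight.
x = [1.0, 0.5, 2.0]
g_pos = [0.4, 0.0, 0.1]
g_neg = [0.0, 0.3, 0.1]
y = signed_weighted_sum(x, g_pos, g_neg)
```

Because both components of a pair see the same input, making their magnitudes identical cancels the constant offsets exactly, which is the effect the quoted passage attributes to the dummy weight.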
“…Therefore, in the PoC circuit, all MOSFETs have the same ON resistance with the same gate voltage. To realize different analog weights, it is necessary to use analog memory devices such as resistance-change memory [27,12], ferroelectric-gate FETs [17], and floating-gate flash memory [1].…”
Section: Circuits and Architectures For Tact-based Neural Network
confidence: 99%
“…In the rst approach, an articial neural network (ANN) is trained by supervised learning, e.g., the backpropagation (BP) algorithm, [14][15][16][17][18] to construct a hardware accelerator for DNNs. [19][20][21] The major issue for this approach is the non-linear weight update and the large variability of resistive switching devices. 20,22 On the other hand, brain-inspired spiking neural networks (SNNs) aim at replicating the brain structure and computation in hardware.…”
Section: -13mentioning
confidence: 99%
“…[19-21] The major issue for this approach is the non-linear weight update and the large variability of resistive switching devices [20,22]. On the other hand, brain-inspired spiking neural networks (SNNs) aim at replicating the brain structure and computation in hardware. Learning usually takes place via spike-timing dependent plasticity (STDP) [23-27], where synapses can update their weight according to the timing between spikes of the pre-synaptic neuron (PRE) and post-synaptic neuron (POST).…”
confidence: 99%
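The STDP rule described in the quote above can be sketched as a weight update driven by the timing difference between PRE and POST spikes. A minimal sketch assuming the standard pair-based exponential STDP window; the function name and the parameters `a_plus`, `a_minus`, and `tau` are illustrative assumptions, not values from the cited works:

```python
import math

# Pair-based STDP sketch: the weight change depends on the timing
# difference dt = t_post - t_pre for a single spike pair.

def stdp_dw(dt, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Return the weight update (arbitrary units, dt in ms)."""
    if dt >= 0:
        # PRE fires before POST: causal pair -> potentiation (LTP)
        return a_plus * math.exp(-dt / tau)
    else:
        # POST fires before PRE: anti-causal pair -> depression (LTD)
        return -a_minus * math.exp(dt / tau)

# A causal pair (pre 5 ms before post) strengthens the synapse,
# while the reversed ordering weakens it.
ltp = stdp_dw(5.0)
ltd = stdp_dw(-5.0)
```

The exponential decay with |dt| captures the property that closely timed spike pairs cause larger weight changes, which is what makes the rule usable as a local, unsupervised learning mechanism in synaptic-device arrays.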
“…Monolithic 3-dimensional integration of these nonvolatile memories with CMOS, demonstrated in [24,25], allows designers to hide the logic circuitry underneath multiple layers of synapses, reducing silicon cost and increasing synapse density [4]. Gradual resistance change of these devices has been utilized as a synapse for a variety of algorithms [26-33]. These devices can implement variations of biological learning rules [18] within a single device, further suggesting their use for neuromorphic hardware.…”
confidence: 99%
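The gradual resistance change mentioned above, and the non-linear weight update flagged as an issue in an earlier quote, are often modeled behaviorally as a saturating conductance step per programming pulse. A minimal sketch under that assumption; `update_conductance` and its parameters are hypothetical, not a model of any specific device:

```python
# Behavioral sketch of a gradual, non-linear analog-synapse update:
# each pulse moves the conductance a fraction of the remaining distance
# to the bound, so steps shrink as the device saturates.

def update_conductance(g, pulse, g_min=0.0, g_max=1.0, alpha=0.1):
    """Apply one programming pulse: pulse > 0 potentiates, else depresses.
    The step size is proportional to the distance from the bound,
    modeling the non-linear weight update of real devices."""
    if pulse > 0:
        return g + alpha * (g_max - g)   # smaller steps near g_max
    else:
        return g - alpha * (g - g_min)   # smaller steps near g_min

g = 0.5
g = update_conductance(g, +1)   # 0.5 + 0.1 * (1.0 - 0.5) = 0.55
g = update_conductance(g, -1)   # 0.55 - 0.1 * (0.55 - 0.0) = 0.495
```

Note the asymmetry this produces: a potentiating pulse followed by a depressing pulse does not return the conductance to its starting value, which is one reason non-linear updates complicate backpropagation-style training on resistive devices.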