2023
DOI: 10.1126/sciadv.adg9123

Reconfigurable neuromorphic computing block through integration of flash synapse arrays and super-steep neurons

Abstract: Neuromorphic computing (NC) architecture inspired by biological nervous systems has been actively studied to overcome the limitations of conventional von Neumann architectures. In this work, we propose a reconfigurable NC block using a flash-type synapse array, emerging positive feedback (PF) neuron devices, and CMOS peripheral circuits, and integrate them on the same substrate to experimentally demonstrate the operations of the proposed NC block. Conductance modulation in the flash memory enables the NC block…
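The abstract's combination of a conductance-programmed synapse array and steep-threshold neurons can be pictured numerically. The sketch below is a minimal software analogy, not the authors' circuit: the conductance window, read voltage, firing threshold, and the differential-pair weight encoding are all illustrative assumptions.

```python
# Minimal numerical sketch (NOT the authors' circuit): synaptic weights are
# encoded as differential pairs of flash-cell conductances, the array performs
# a vector-matrix multiplication as a current sum, and a "super-steep" neuron
# is idealized as a hard threshold on the summed current.
import numpy as np

rng = np.random.default_rng(0)

G_MIN, G_MAX = 1e-7, 1e-5   # assumed conductance window (siemens), illustrative only
V_READ = 0.1                # assumed read voltage on active input lines (volts)
I_TH = 2e-7                 # assumed neuron firing threshold (amperes)

def weights_to_conductances(W):
    """Map signed weights to a (G_plus, G_minus) differential conductance pair."""
    w_scaled = W / np.max(np.abs(W))                    # normalize to [-1, 1]
    G_plus = G_MIN + (G_MAX - G_MIN) * np.clip(w_scaled, 0, 1)
    G_minus = G_MIN + (G_MAX - G_MIN) * np.clip(-w_scaled, 0, 1)
    return G_plus, G_minus

def nc_block(x_binary, G_plus, G_minus):
    """One array read: column currents I = G^T · V, then a step-like neuron."""
    v_in = V_READ * x_binary                 # binary input spikes -> read voltages
    i_out = (G_plus - G_minus).T @ v_in      # Kirchhoff current summation per column
    return (i_out > I_TH).astype(int)        # super-steep neuron ~ hard threshold

# Example: 16 inputs, 4 output neurons
W = rng.normal(size=(16, 4))
Gp, Gm = weights_to_conductances(W)
print(nc_block(rng.integers(0, 2, size=16), Gp, Gm))
```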

Cited by 5 publications (2 citation statements)
References 47 publications (50 reference statements)
“…The integration of NVMs and CMOS electronics provides benefits such as nonvolatility, scalability, and direct mapping of synaptic weights, as well as facilitating functions such as data thresholding, conversion, and trimming required for each layer of a neuromorphic DNN 7,13–18. There are representative reports on the realization of DNN operations using different types of NVMs, such as resistive random-access memory (RRAM) 19,20, phase change memory (PCM) 21, ferroelectric RAM (FeRAM) 22, flash memory 23, and magnetic RAM (MRAM) 24. While these NVMs show promise for neural network applications, they also come with inherent challenges related to nonlinearity, energy efficiency, area overhead, and reliability 3.…”
Section: Introduction (mentioning)
confidence: 99%
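The "direct mapping of synaptic weights" noted in the excerpt is constrained in practice by the finite number of programmable conductance states per NVM cell. The sketch below illustrates that constraint with a simple nearest-level quantization; the 16-level count, linear level spacing, and conductance window are assumptions for illustration, not taken from the cited devices.

```python
# Hedged sketch: quantize trained weights onto a small set of NVM conductance
# levels. The 16-level assumption and linear level spacing are illustrative.
import numpy as np

N_LEVELS = 16                     # assumed number of programmable states per cell
G_MIN, G_MAX = 1e-7, 1e-5         # assumed conductance window (siemens)

def quantize_to_levels(W):
    """Map signed weights onto the nearest of N_LEVELS conductance levels (per polarity)."""
    levels = np.linspace(G_MIN, G_MAX, N_LEVELS)
    w_mag = np.abs(W) / np.max(np.abs(W))        # normalize magnitudes to [0, 1]
    target = G_MIN + w_mag * (G_MAX - G_MIN)     # ideal analog conductance
    idx = np.abs(target[..., None] - levels).argmin(axis=-1)
    return levels[idx], np.sign(W)               # quantized magnitude + sign (differential pair)

W = np.random.default_rng(1).normal(size=(8, 8))
G_q, sign = quantize_to_levels(W)
print(G_q.shape, np.unique(G_q).size)            # at most N_LEVELS distinct conductances
```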
“…[3–7] Ideally, delay-less and energy-efficient in-memory computing can avoid data shuttling between the separated processor and memory, [8–11] enabling high-efficiency, in-parallel matrix-vector multiplications (MVM), which are commonly performed in convolutional neural networks (CNN) for image-processing-related tasks. [12–16] Many emerging memories are available for computing, including ferroelectric, phase-change, resistive, and magnetoresistive memories, [17–21] among which ferroelectric memories are classified as 1T1C-structured ferroelectric random access memory (FeRAM), metal-ferroelectric-insulator-metal-structured ferroelectric field-effect transistors (FeFET), and ferroelectric domain wall memory (DWRAM) relying on the high conductivity of erasable ferroelectric domain walls. [22–25] In comparison, FeRAM depends on a destructive charge-integration readout process, resulting in a lower storage density, [26] and the FeFET suffers from poor polarization retention due to a chemical reaction at the interface between the ferroelectric layer and the underlying semiconductor channel, leading to significant leakage current and polarization fatigue.…” (DOI: 10.1002/adfm.202315954)
Section: Introduction (mentioning)
confidence: 99%
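The excerpt above points out that MVM is the workhorse operation of a CNN: a convolution layer is typically lowered to a matrix-vector product per input patch (im2col-style), which is what an in-memory array evaluates in one parallel read. A minimal sketch under that framing, with illustrative shapes and names:

```python
# Hedged sketch of why a crossbar MVM covers a CNN layer: the convolution is
# lowered to a matrix-vector product per patch, so one analog array read could
# evaluate every output channel of that patch in parallel.
import numpy as np

def conv2d_as_mvm(image, kernels):
    """image: (H, W), kernels: (C_out, k, k). Returns (C_out, H-k+1, W-k+1)."""
    c_out, k, _ = kernels.shape
    H, W = image.shape
    K = kernels.reshape(c_out, k * k)        # each kernel row = one column's weights
    out = np.zeros((c_out, H - k + 1, W - k + 1))
    for i in range(H - k + 1):
        for j in range(W - k + 1):
            patch = image[i:i + k, j:j + k].ravel()  # one unrolled input patch (the "vector")
            out[:, i, j] = K @ patch                 # single MVM = all output channels at once
    return out

rng = np.random.default_rng(2)
img = rng.random((6, 6))
ker = rng.normal(size=(4, 3, 3))
print(conv2d_as_mvm(img, ker).shape)         # (4, 4, 4)
```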