2022 IEEE 22nd International Conference on Nanotechnology (NANO)
DOI: 10.1109/nano54668.2022.9928590

Towards Efficient RRAM-based Quantized Neural Networks Hardware: State-of-the-art and Open Issues

Abstract: The increasing amount of data processed at the edge and the demand for reducing the energy consumption of large neural network architectures have initiated the transition from traditional von Neumann architectures towards in-memory computing paradigms. Quantization is one method of reducing the power and computation requirements of neural networks by limiting bit precision. Resistive Random Access Memory (RRAM) devices are strong candidates for Quantized Neural Network (QNN) implementations. As the number of possi…
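To make the quantization idea concrete, here is a minimal Python sketch of symmetric uniform weight quantization; the function name and the mid-tread scheme are illustrative assumptions, not the specific quantizer surveyed in the paper.

```python
import numpy as np

def quantize_uniform(w, n_bits):
    """Symmetric mid-tread quantizer with 2**n_bits - 1 levels (hypothetical helper)."""
    q_max = 2 ** (n_bits - 1) - 1             # top integer level, e.g. 3 for 3 bits
    scale = np.max(np.abs(w)) / q_max         # map the largest |weight| to the top level
    q = np.clip(np.round(w / scale), -q_max, q_max)
    return q * scale                          # dequantized weights used for inference

rng = np.random.default_rng(0)
w = rng.standard_normal(6).astype(np.float32)
print(quantize_uniform(w, n_bits=3))          # weights snapped to 7 distinct levels
```

Each quantized level could then be mapped onto a discrete RRAM conductance state, which is what limits the usable bit precision in practice.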

Cited by 5 publications (8 citation statements)
References 28 publications (56 reference statements)
“…These non-idealities include endurance limitations, conductance drift, stuck-at faults, and non-linearity of the switching curve. However, nonvolatile memories allow multi-level storage, scalability, and high computational density [8]. For example, whereas an SRAM cell occupies approximately 124F², where F is the technology feature size, an RRAM cell with a selector device (a 1T1R cell) occupies only 12F² [9].…”
Section: In-Memory Computing Background
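The roughly 10× density advantage follows directly from those cell-size figures; the short sketch below works through the arithmetic, with the 28 nm feature size chosen purely for illustration (it is not stated in the citing paper).

```python
# Area comparison from the cited figures: SRAM ~124 F^2 vs. 1T1R RRAM ~12 F^2.
# The 28 nm feature size is an illustrative assumption, not from the paper.
F_nm = 28
sram_area_nm2 = 124 * F_nm ** 2    # ~97,216 nm^2 per SRAM cell
rram_area_nm2 = 12 * F_nm ** 2     # ~9,408 nm^2 per 1T1R cell
print(f"SRAM cell: {sram_area_nm2} nm^2")
print(f"1T1R cell: {rram_area_nm2} nm^2")
print(f"density advantage: {sram_area_nm2 / rram_area_nm2:.1f}x")  # ~10.3x
```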
“…A. Binarized Neural Networks and QNNs with Ternary Weights: A binarized neural network (BNN) is a type of QNN that relies on 1-bit weights (+1 and −1 in software) and activations [8]. Fabricated binary SRAM-based IMC designs are presented in [31], [32].…”
Section: In-Memory Computing Hardware for QNN
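For context, the standard BNN trick is to binarize with the sign function and replace multiply-accumulate with XNOR and popcount; the sketch below is a generic Python illustration of that equivalence, not code from the cited designs.

```python
import numpy as np

def binarize(x):
    # Map real values to {-1, +1} via sign (0 treated as +1), the usual BNN rule.
    return np.where(x >= 0, 1, -1).astype(np.int8)

def bnn_dot(a_bits, b_bits):
    # XNOR-popcount dot product for {0,1}-encoded binary vectors
    # (bit 1 encodes +1, bit 0 encodes -1): dot = 2*popcount(xnor) - n.
    n = a_bits.size
    xnor = ~(a_bits ^ b_bits) & 1          # 1 where the encoded signs agree
    return 2 * int(xnor.sum()) - n

rng = np.random.default_rng(0)
w, x = rng.standard_normal(8), rng.standard_normal(8)
wb, xb = binarize(w), binarize(x)
# XNOR-popcount result matches the plain {-1, +1} inner product.
assert bnn_dot((wb > 0).astype(np.uint8), (xb > 0).astype(np.uint8)) == int(wb @ xb)
```

The identity holds because each position where the encoded signs agree contributes +1 to the inner product and each disagreement contributes −1, so the dot product equals (agreements − disagreements) = 2·popcount(xnor) − n.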