IEEE 48th European Solid State Circuits Conference (ESSCIRC 2022)
DOI: 10.1109/esscirc55480.2022.9911348

A 1-to-4b 16.8-POPS/W 473-TOPS/mm2 6T-based In-Memory Computing SRAM in 22nm FD-SOI with Multi-Bit Analog Batch-Normalization

Cited by 4 publications (5 citation statements). References 9 publications.
“…datasets (such as CIFAR100, CIFAR10, and MNIST) [13,21,22]. However, some CNN-based industrial applications demand a reasonable classification accuracy on the ImageNet dataset [23].…”
Section: (C) Most Previous Work Evaluate CNNs on Lightweight…
Mentioning confidence: 99%
“…Inference of widely-used classification CNNs requires signed (or unsigned) format for data and signed format for weights [26]; a finite impulse response (FIR) filter usually needs signed format for both data and weights [5]; unsigned format for both data and weights is appropriate for Gaussian image filtering, gaining higher accuracy than signed format at the same bit widths [27]. Some previous works have achieved bit-flexible convolution layers with signed/unsigned weights and unsigned inputs by assigning negative factors to the sign-bit CIM columns [13,21,22]. Saurabh Jain et al. attempted to implement convolution layers with signed inputs/weights in SRAM-CIM using duplicated word lines, which cannot achieve flexible bit-width of weights.…”
Section: (C) Most Previous Work Evaluate CNNs on Lightweight…
Mentioning confidence: 99%
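The sign-bit-column trick mentioned above can be illustrated with a minimal numerical sketch (an assumption for illustration, not code from any of the cited designs): each bit plane of a two's-complement weight is summed as its own column, and the most-significant (sign) bit column is assigned a negative factor, -2^(n-1), so signed weights fall out of purely unsigned per-column sums.

```python
# Hypothetical sketch: signed-weight MAC via per-bit column sums,
# with the sign-bit column given a negative binary weight, mimicking
# how a CIM array with two's-complement weights could accumulate.

def cim_mac_signed_weights(inputs, weights, n_bits=4):
    """Dot product of unsigned inputs with n-bit two's-complement
    weights, computed one weight-bit column at a time."""
    acc = 0
    for b in range(n_bits):
        # Each cell stores one weight bit; the column sums bit * input.
        col = sum(((w >> b) & 1) * x for w, x in zip(weights, inputs))
        # The MSB (sign-bit) column carries factor -2^(n-1).
        factor = -(1 << b) if b == n_bits - 1 else (1 << b)
        acc += factor * col
    return acc

# 4-bit two's complement: -3 -> 0b1101 (13), 2 -> 0b0010
ws = [13, 2]   # encodes weights [-3, 2]
xs = [5, 7]    # unsigned activations
print(cim_mac_signed_weights(xs, ws))  # -3*5 + 2*7 = -1
```

The same column hardware handles unsigned weights by simply using the positive factor for every column, which is the bit-flexibility the quoted passage describes.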
“…Another field in which they show an advantage is sparse lightweight networks [15], [16], a major category of neural networks. In contrast to digital SRAM-based CIMs, which trade more area and computing time for higher precision [17], [18], [19], [20], analog-mixed-signal SRAM-based CIMs perform small-kernel convolution computations in a single cycle, significantly reducing computation time at the cost of a minor recognition-rate loss [21], [22], [23]. The lightweight network's high parallelism and energy-efficiency requirements when using small convolution kernels, as well as its high tolerance for computational inaccuracy, perfectly match the characteristics of analog in-memory computing.…”
Section: Introduction
Mentioning confidence: 99%
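The single-cycle-versus-accuracy trade-off described above can be modeled with a toy sketch (an illustrative assumption, not a model from the cited works): an analog column accumulates all products at once, and the small accuracy loss comes from digitizing that sum with a low-resolution ADC.

```python
# Toy model of an analog CIM column: an exact dot product (one "cycle"
# of charge/current summation) followed by an ADC whose limited
# resolution introduces the quantization error.

def analog_mac(inputs, weights, adc_bits=5, full_scale=64):
    """Return (quantized, exact) MAC results; the gap between them
    models the minor accuracy loss of analog readout."""
    exact = sum(w * x for w, x in zip(weights, inputs))
    lsb = full_scale / (1 << adc_bits)  # ADC step size
    code = round(exact / lsb)           # single conversion per column
    return code * lsb, exact

# exact sum 1*3 + 2*2 = 7 is not a multiple of the 2.0 LSB,
# so the ADC reads it as a nearby code.
quantized, exact = analog_mac([1, 2], [3, 2])
print(quantized, exact)
```

A digital CIM would instead compute `exact` bit-serially over many cycles, which is the area/time-for-precision trade the quoted passage contrasts.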