2022
DOI: 10.1109/tcad.2021.3082107
MARS: Multimacro Architecture SRAM CIM-Based Accelerator With Co-Designed Compressed Neural Networks

Cited by 13 publications (5 citation statements)
References 33 publications

“…datasets (such as CIFAR100, CIFAR10, and MNIST) [13,21,22]. However, some CNN-based industrial applications demand a reasonable classification accuracy on the ImageNet dataset [23].…”
Section: (C) Most Previous Work Evaluate CNNs On Lightweight
mentioning
confidence: 99%
“…Inference of widely-used classification CNNs requires signed (or unsigned) format and signed format for data and weights, respectively [26]; signed format for both data and weights is usually needed for a finite impulse response (FIR) filter [5]; unsigned format for both data and weights is appropriate for Gaussian image filtering to gain higher accuracy compared with signed format with the same bit widths [27]. Some previous works have achieved bit-flexible convolution layers with signed/unsigned weights and unsigned inputs by assigning negative factors to the sign-bit CIM columns [13,21,22]. Saurabh Jain et al. attempted to implement convolution layers with signed inputs/weights in SRAM-CIM using duplicated word lines, which cannot achieve flexible bit-width of weights.…”
Section: (C) Most Previous Work Evaluate CNNs On Lightweight
mentioning
confidence: 99%
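The sign-bit handling described in this excerpt (giving the sign-bit CIM column a negative factor so signed two's-complement weights can be processed by otherwise unsigned bit columns) can be illustrated with a short numerical model. The sketch below is a hypothetical Python emulation, not code from the cited works; the function name, 4-bit width, and test vectors are illustrative assumptions.

```python
# Minimal sketch (assumption, not the cited design): a signed-weight MAC in which
# each weight bit position acts as a separate unsigned CIM column, and the MSB
# (sign-bit) column contributes with a negative factor of -2**(n_bits-1).
import numpy as np

def cim_signed_mac(x, w, n_bits=4):
    """Unsigned inputs x, signed two's-complement weights w of n_bits width."""
    x = np.asarray(x)
    w = np.asarray(w)
    # Two's-complement bit planes of the weights (LSB first).
    w_unsigned = w & ((1 << n_bits) - 1)
    acc = 0
    for k in range(n_bits):
        bit_plane = (w_unsigned >> k) & 1            # one CIM column of weight bits
        col_sum = int(np.dot(x, bit_plane))          # unsigned column-wise MAC
        factor = -(1 << k) if k == n_bits - 1 else (1 << k)  # sign column is negative
        acc += factor * col_sum
    return acc

x = np.array([3, 1, 2, 0])        # unsigned activations
w = np.array([-3, 2, -1, 4])      # signed 4-bit weights
assert cim_signed_mac(x, w) == int(np.dot(x, w))
print(cim_signed_mac(x, w))       # -9
```

The same loop structure also covers the unsigned-weight case by simply keeping all column factors positive, which is why the approach is described as bit-flexible.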
“…In simple application scenarios, it has nearly the same accuracy as the traditional CNN algorithm [1,2]. Chih et al. [75] proposed another solution that uses all-digital CIM to execute MAC operations and has high energy efficiency and throughput. In order to reduce computational costs, Sie et al. [76] proposed a software and hardware co-design approach to design MARS.…”
Section: Parameter
mentioning
confidence: 99%
“…The emerging computing-in-memory (CIM) techniques address these shortcomings by performing the MAC operations directly upon reading the synaptic weights from the memory [5], [6], [7], [8], [9], [10], [11], [12]. The CIM circuit designs usually convert the ANN's digital inputs into analog signals to control the read wordlines (RWLs) of the memory cells whose responses are added in an analog way on the read bitlines (RBLs) to produce the analog MAC results.…”
mentioning
confidence: 99%
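The analog MAC flow this excerpt describes (digital inputs converted to read-wordline voltages, weights stored as cell conductances, currents summed on a shared read bitline, and the result digitized) can be approximated with a rough behavioral model. The sketch below is a simplified Python illustration under assumed component values and linear DAC/ADC behavior; none of the names or numbers come from a specific CIM circuit.

```python
# Rough behavioral sketch (illustrative assumptions only) of an analog CIM MAC:
# inputs drive wordline voltages, weights set cell conductances, and the bitline
# current is the dot product, which an ADC then quantizes.
import numpy as np

def analog_cim_mac(d_in, weights, v_full=0.8, g_unit=1e-6, adc_bits=6):
    d_in = np.asarray(d_in, dtype=float)
    weights = np.asarray(weights, dtype=float)
    v_wl = v_full * d_in / d_in.max()          # DAC: scale digital inputs to [0, v_full] volts
    g_cell = g_unit * weights                  # weight stored as a cell conductance
    i_rbl = np.sum(g_cell * v_wl)              # Kirchhoff current summation on the read bitline
    i_max = g_unit * weights.max() * v_full * len(weights)   # full-scale bitline current
    code = int(round(i_rbl / i_max * (2**adc_bits - 1)))     # ADC quantization of the MAC result
    return i_rbl, code

current, code = analog_cim_mac([4, 7, 1, 3], [2, 5, 1, 3])
print(current, code)   # analog bitline current in amperes and its digitized code
```

The ADC resolution (adc_bits here) is what bounds the precision of the analog MAC result in such designs, which is one of the shortcomings the digital-CIM and co-design approaches cited above aim to avoid.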