Proceedings of the 26th Asia and South Pacific Design Automation Conference 2021
DOI: 10.1145/3394885.3431635
Uncertainty Modeling of Emerging Device based Computing-in-Memory Neural Accelerators with Application to Neural Architecture Search

Abstract: Emerging device-based Computing-in-memory (CiM) has proved to be a promising candidate for high-energy-efficiency deep neural network (DNN) computation. However, most emerging devices suffer from uncertainty issues, resulting in a difference between the data actually stored and the weight value the device was designed to hold. This leads to an accuracy drop from the trained model to the actually deployed platform. In this work, we offer a thorough analysis of the effect of such uncertainty-induced changes on DNN models. To reduce…
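As a minimal sketch of the kind of analysis the abstract describes, the snippet below models device uncertainty as i.i.d. Gaussian noise added to every stored weight and estimates the resulting accuracy drop by Monte-Carlo sampling. The helper names, the noise level `sigma`, and the PyTorch setup are illustrative assumptions, not the paper's implementation.

```python
# Minimal sketch: measuring the accuracy drop caused by device-induced
# weight perturbations, modeled here as i.i.d. Gaussian noise on each weight.
# The model, data loader, and noise level (sigma) are illustrative assumptions.
import copy
import torch


def evaluate(model, loader, device="cpu"):
    """Top-1 accuracy of `model` over `loader`."""
    model.eval()
    correct, total = 0, 0
    with torch.no_grad():
        for x, y in loader:
            pred = model(x.to(device)).argmax(dim=1)
            correct += (pred == y.to(device)).sum().item()
            total += y.numel()
    return correct / total


def perturb_weights(model, sigma):
    """Return a copy of `model` with N(0, sigma^2) noise added to every weight,
    mimicking the deviation between the programmed and the intended value."""
    noisy = copy.deepcopy(model)
    with torch.no_grad():
        for p in noisy.parameters():
            p.add_(torch.randn_like(p) * sigma)
    return noisy


def accuracy_under_variation(model, loader, sigma=0.05, n_samples=20):
    """Monte-Carlo estimate of the deployed accuracy under device variation."""
    accs = [evaluate(perturb_weights(model, sigma), loader) for _ in range(n_samples)]
    return sum(accs) / len(accs)
```

Comparing `evaluate(model, loader)` against `accuracy_under_variation(model, loader)` gives the accuracy gap between the trained model and the (simulated) deployed platform.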

Cited by 18 publications (18 citation statements)
References 29 publications
“…Thus this observation generalizes across different DNN models targeting classification tasks. With this conclusion, the authors of [49] claim that, with any independent and identically distributed Gaussian noise on weight, the output vector of the same input image follows a multi-dimensional Gaussian distribution over different samples of noise.…”
Section: Impact of Device Variation on DNN Outputs (mentioning)
confidence: 96%
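The claim quoted above can be illustrated with a small Monte-Carlo experiment: for a fixed input image, repeatedly perturb the weights with i.i.d. Gaussian noise and collect the output vectors; their empirical mean and covariance then characterize the approximately multi-dimensional Gaussian distribution over noise samples. The model, input, noise level, and function name below are illustrative assumptions, not the setup of [49].

```python
# Minimal sketch of the quoted claim: for a fixed input, sampling i.i.d.
# Gaussian noise on the weights many times yields output vectors whose
# empirical distribution is close to a multivariate Gaussian.
import copy
import torch


def sample_outputs(model, x, sigma=0.05, n_samples=1000):
    """Collect the output vector of `model` on input `x` under `n_samples`
    independent Gaussian weight perturbations."""
    outs = []
    with torch.no_grad():
        for _ in range(n_samples):
            noisy = copy.deepcopy(model)
            for p in noisy.parameters():
                p.add_(torch.randn_like(p) * sigma)
            outs.append(noisy(x).squeeze(0))
    return torch.stack(outs)  # shape: (n_samples, num_classes)


# Usage (hypothetical model/input): the fitted Gaussian is described by
#   outputs = sample_outputs(model, x)
#   mu = outputs.mean(dim=0)
#   cov = torch.cov(outputs.T)
```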
“…After finishing modeling the device variations, we can then investigate the impact of device variations on nvCiM DNN accelerators. A typical study is to evaluate such impact on an accelerator targeting image classification tasks [49]. In this section, we introduce the findings of the authors of [49].…”
Section: Impact of Device Variation on DNN Outputs (mentioning)
confidence: 99%
“…Different strategies have been proposed to tackle these issues. Noise-aware training [4] and uncertainty-aware neural architecture search [10][11][12] aim at fortifying DNNs so that their performance remains mostly unaffected even in the presence of device variations. However, these methods are not economical because they require re-training DNNs from scratch and cannot make use of existing pre-trained models.…”
Section: Introduction (mentioning)
confidence: 99%
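A hedged sketch of the noise-aware training idea mentioned in the quote above: Gaussian weight noise is injected for the forward and backward pass, and the resulting gradient is applied to the clean weights, so the trained model tolerates device variation at deployment. The noise level, optimizer handling, and function name are assumptions for illustration, not the cited papers' exact procedure.

```python
# Minimal sketch of noise-aware training: transient Gaussian weight noise is
# added before the forward pass and removed before the parameter update, so
# gradients computed on noisy weights update the clean weights.
import torch


def noise_aware_step(model, batch, criterion, optimizer, sigma=0.05):
    """One training step with transient weight noise (illustrative)."""
    x, y = batch
    noises = []
    with torch.no_grad():
        for p in model.parameters():
            n = torch.randn_like(p) * sigma
            p.add_(n)                      # perturb weights for this step
            noises.append(n)
    optimizer.zero_grad()
    loss = criterion(model(x), y)          # forward/backward on noisy weights
    loss.backward()
    with torch.no_grad():                  # restore clean weights before the update
        for p, n in zip(model.parameters(), noises):
            p.sub_(n)
    optimizer.step()
    return loss.item()
```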