2019
DOI: 10.1109/tcad.2018.2852752
HEIF: Highly Efficient Stochastic Computing-Based Inference Framework for Deep Neural Networks

Cited by 66 publications (31 citation statements)
References 40 publications
“…Two best-performing examples were extracted for SC-DCNN [61], and HEIF is reported in Ref. [62]. The results for our work are presented at 1 V. The black curve indicates the energy/accuracy tradeoff in our system when the maximum isolator length δ (and, consequently, the warm-up time for the isolator buffers to fill) is varied, holding constant the amount of data collected on the network output.…”
Section: Evaluation and Results (mentioning, confidence: 99%)
“…The origins of stochastic computing lie in the observation that time-series data of stochastic spike trains in the brain could be modeled by stochastic jumps from ground to Vdd in a logic circuit [58–60]. It is no surprise, then, that neural network structures have been implemented successfully and energy-efficiently in recent stochastic computing work [61–64]. Rather than carrying out high-level arithmetic and logic operations to “theoretically predict” a neural network’s output, stochastic computing implements neuromorphic models of the network in CMOS circuitry.…”
Section: Application to Neural Network (mentioning, confidence: 99%)
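
The stochastic-computing idea in this excerpt can be made concrete with a minimal software sketch: in the common unipolar encoding, a value p ∈ [0, 1] is carried by a random bitstream whose probability of a 1 equals p, and multiplication reduces to a single AND gate per bit pair. The NumPy model below is illustrative only; the stream length n and the function names are assumptions, not the hardware designs of [61–64].

```python
import numpy as np

rng = np.random.default_rng(0)

def encode(p: float, n: int) -> np.ndarray:
    """Unipolar SC encoding: a length-n bitstream with P(bit = 1) = p."""
    return (rng.random(n) < p).astype(np.uint8)

def decode(stream: np.ndarray) -> float:
    """Recover the encoded value as the fraction of 1s in the stream."""
    return float(stream.mean())

# Multiplication in unipolar SC is one AND gate per bit:
# P(a AND b) = P(a) * P(b) for independent streams.
n = 100_000
a, b = encode(0.8, n), encode(0.5, n)
product = a & b
print(decode(product))  # ~0.4, with O(1/sqrt(n)) sampling error
```

Longer streams trade latency for accuracy, which is the energy/accuracy knob these SC neural-network implementations exploit.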
“…The ReLU function is popular due to its fast computation, and it mitigates diminishing gradients in backward-propagation learning during the CNN training stage. However, no SC-equivalent circuit existed for that particular function; thus, Li et al. (2018a) proposed a novel SC-based ReLU function block, as depicted in Fig. 11A.…”
Section: SC ReLU and Sigmoid Activation Layer (mentioning, confidence: 99%)
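
For orientation, a purely functional model of an SC ReLU is sketched below under the common bipolar encoding, where x ∈ [−1, 1] is represented by P(1) = (x + 1)/2. This is not the gate-level block of Li et al. (2018a), which operates on the bitstream directly (e.g., with counter logic); the decode/clip/re-encode structure and all names here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def bipolar_encode(x: float, n: int) -> np.ndarray:
    """Bipolar SC encoding: x in [-1, 1] maps to P(1) = (x + 1) / 2."""
    return (rng.random(n) < (x + 1) / 2).astype(np.uint8)

def bipolar_decode(stream: np.ndarray) -> float:
    """Invert the encoding: x = 2 * P(1) - 1."""
    return 2 * float(stream.mean()) - 1

def sc_relu_model(stream: np.ndarray) -> np.ndarray:
    """Functional (not gate-level) model of an SC ReLU:
    decode, clip negatives to zero, re-encode. A hardware block
    would instead track the running 1s/0s balance of the stream."""
    y = max(0.0, bipolar_decode(stream))
    return bipolar_encode(y, stream.size)

n = 100_000
for x in (-0.6, 0.3):
    out = sc_relu_model(bipolar_encode(x, n))
    print(x, "->", round(bipolar_decode(out), 3))  # -0.6 -> ~0.0, 0.3 -> ~0.3
```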