2017 International Joint Conference on Neural Networks (IJCNN)
DOI: 10.1109/ijcnn.2017.7965993
Hardware-driven nonlinear activation for stochastic computing based deep convolutional neural networks

Abstract: Recently, Deep Convolutional Neural Networks (DCNNs) have made unprecedented progress, achieving accuracy close to, or even better than, human-level perception in various tasks. There is a timely need to map the latest software DCNNs to application-specific hardware in order to achieve orders-of-magnitude improvements in performance, energy efficiency, and compactness. Stochastic Computing (SC), as a low-cost alternative to the conventional binary computing paradigm, has the potential to enable mass…
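As context for the abstract's cost claim, here is a minimal Python sketch of the core SC idea in unipolar encoding (the function names and stream length are illustrative, not from the paper): a value is carried as the fraction of 1s in a random bitstream, so multiplication needs only a single AND gate per bit instead of a full binary multiplier.

```python
import random

def unipolar_stream(p, length, rng):
    """Encode p in [0, 1] as a bitstream whose fraction of 1s is p."""
    return [1 if rng.random() < p else 0 for _ in range(length)]

def decode(bits):
    """Recover the encoded value as the observed fraction of 1s."""
    return sum(bits) / len(bits)

# Unipolar SC multiplication: one AND gate per bit pair of two
# independent streams, versus a full multiplier in binary arithmetic.
rng = random.Random(0)
a = unipolar_stream(0.6, 8192, rng)
b = unipolar_stream(0.5, 8192, rng)
product = [x & y for x, y in zip(a, b)]
print(decode(product))  # ≈ 0.6 * 0.5 = 0.3, up to stochastic noise
```

The accuracy of the decoded product improves with stream length, which is the usual SC trade-off between latency and precision.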

Cited by 43 publications (31 citation statements)
References 27 publications (58 reference statements)
“…The survey of logistic, ReLU, and tanh networks is reported in Ref. [65]. Two best-performing examples were extracted for SC-DCNN [61], and HEIF is reported in Ref.…”
Section: Evaluation and Results (mentioning)
confidence: 99%
“…The first stochastic computing implementation of a deep convolutional neural network, a particularly important neural network topology with broad applications to image processing, was proposed by Ren et al. [61] and further optimized in Refs. [62,65]. These works draw heavily on new ideas in the stochastic computing literature, including massively parallel generation of pseudorandom bitstreams [66], state-machine based nonlinear activation functions [63,67], and aggressive use of correlation insensitivity [68].…”
Section: Application To Neural Network (mentioning)
confidence: 99%
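The "state-machine based nonlinear activation functions" mentioned in this statement refer to FSM designs such as the classic stochastic tanh built from a saturating counter. A minimal sketch, assuming bipolar encoding and an 8-state counter (the state count and stream length are illustrative choices, not parameters from the cited works):

```python
import math
import random

def to_bipolar_stream(x, length, rng):
    """Encode x in [-1, 1] as a bipolar stream: P(bit = 1) = (1 + x) / 2."""
    return [1 if rng.random() < (1 + x) / 2 else 0 for _ in range(length)]

def from_bipolar_stream(bits):
    """Decode a bipolar stream back to a value in [-1, 1]."""
    return 2 * sum(bits) / len(bits) - 1

def stanh(bits, n_states=8):
    """FSM-based stochastic tanh: a saturating counter that moves up on a
    1 bit and down on a 0 bit; the output stream approximates
    tanh(n_states/2 * x) in bipolar encoding."""
    state = n_states // 2  # start in the middle of the state space
    out = []
    for b in bits:
        state = min(state + 1, n_states - 1) if b else max(state - 1, 0)
        out.append(1 if state >= n_states // 2 else 0)
    return out

if __name__ == "__main__":
    rng = random.Random(42)
    for x in (-0.5, -0.2, 0.0, 0.2, 0.5):
        stream = to_bipolar_stream(x, 4096, rng)
        y = from_bipolar_stream(stanh(stream, n_states=8))
        print(f"x={x:+.1f}  Stanh≈{y:+.3f}  tanh(4x)={math.tanh(4 * x):+.3f}")
```

In hardware this is just a small counter and a comparator, which is why FSM-based activations are so much cheaper than lookup-table implementations of tanh.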
“…Besides multiplications and additions, SC-based activation functions are also developed [19], [20]. As a result, SC has become an interesting and promising approach to implement large-scale neural networks [11], [12], [21], [22] with high performance/energy efficiency and minor accuracy degradation.…”
Section: Stochastic Computing and SCNNs (mentioning)
confidence: 99%
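A brief sketch of the SC multiplications and additions this statement refers to, in bipolar encoding (a sketch under stated assumptions, not the implementation of any cited work): multiplication reduces to an XNOR gate per bit pair, and addition to a 2-to-1 multiplexer that produces a scaled sum.

```python
import random

def bipolar_stream(x, length, rng):
    """P(bit = 1) = (1 + x) / 2, encoding x in [-1, 1]."""
    return [1 if rng.random() < (1 + x) / 2 else 0 for _ in range(length)]

def decode(bits):
    return 2 * sum(bits) / len(bits) - 1

def sc_multiply(a_bits, b_bits):
    """Bipolar SC multiplication: a single XNOR gate per bit pair."""
    return [1 - (a ^ b) for a, b in zip(a_bits, b_bits)]

def sc_scaled_add(a_bits, b_bits, sel_bits):
    """Bipolar SC addition with a 2-to-1 MUX: with a select stream of
    probability 1/2, the output encodes (a + b) / 2."""
    return [a if s else b for a, b, s in zip(a_bits, b_bits, sel_bits)]

if __name__ == "__main__":
    rng = random.Random(1)
    N = 8192
    a = bipolar_stream(0.5, N, rng)
    b = bipolar_stream(-0.4, N, rng)
    s = bipolar_stream(0.0, N, rng)       # select stream, P(1) = 1/2
    print(decode(sc_multiply(a, b)))      # ≈ 0.5 * -0.4 = -0.2
    print(decode(sc_scaled_add(a, b, s))) # ≈ (0.5 - 0.4) / 2 = 0.05
```

The MUX-based adder's implicit division by 2 is one reason SC network designs must track scaling factors layer by layer.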
“…Having hundreds of these modules in a design would require multiple megabits of storage. Indeed, in [11], the authors compare 8-bit neurons with ReLU and tanh/sigmoid activation functions. They show that replacing the ReLU with tanh increases the neuron area by 20% and neuron energy by 36%.…”
Section: Introduction (mentioning)
confidence: 99%
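A back-of-the-envelope check of the storage claim; the LUT geometry and module count below are assumptions for illustration, not figures from [11]:

```python
# Illustrative LUT-storage estimate (assumed parameters, not from [11]):
# a tanh lookup table with a 12-bit input index and 8-bit output entries.
entries = 2 ** 12            # 4096 addressable input values
bits_per_entry = 8
modules = 400                # "hundreds" of activation modules per design

total_bits = entries * bits_per_entry * modules
print(f"{total_bits / 1e6:.1f} Mb of LUT storage")  # ≈ 13.1 Mb
```

Under these assumed parameters the total indeed lands in the multi-megabit range, which is the motivation for FSM-based SC activations that avoid lookup tables altogether.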