2021
DOI: 10.1002/aisy.202000103
Ex Situ Transfer of Bayesian Neural Networks to Resistive Memory‐Based Inference Hardware

Abstract: In recent years, neural network models[1] have demonstrated human-level competency in multiple tasks, such as pattern recognition,[2] game playing,[3] and strategy development.[4] This progress has led to the promise that a new generation of intelligent computing systems could be applied to such high-complexity tasks at the edge.[5] However, the current generation of edge computing hardware cannot support the energetic demands nor the data volume required to train and adapt such neural network models loca…

Cited by 17 publications (14 citation statements) · References 30 publications
“…In contrast, here we use cycle-to-cycle variation in the programmability of our MoS₂ memtransistor to generate GRNs. While cycle-to-cycle variation is undesirable for traditional computing, it can be exploited to reduce the design complexity of a BNN accelerator [21, 22, 43]. To demonstrate the effect of programming on variation, we use dynamic programming on 40 MoS₂ memtransistors, where we measure the transfer characteristics with different sweep ranges.…”
Section: Results (mentioning; confidence: 99%)
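The snippet above uses cycle-to-cycle programming variation as a source of Gaussian random numbers (GRNs) for Bayesian-neural-network weights. The following minimal Python sketch illustrates that idea only in simulation: the `program_and_read` helper, the noise level `sigma_c2c`, and the target conductance are hypothetical stand-ins, not device parameters or routines from the cited work.

```python
# Minimal sketch: cycle-to-cycle programming variation as a Gaussian
# random-number source for one Bayesian weight. All numbers are assumptions.
import numpy as np

rng = np.random.default_rng(0)

def program_and_read(g_target, sigma_c2c=0.05, n_cycles=1000):
    """Emulate repeated program/read cycles of one memtransistor: each cycle
    lands near g_target, and cycle-to-cycle variation makes the read-out
    approximately Gaussian."""
    return rng.normal(loc=g_target, scale=sigma_c2c * g_target, size=n_cycles)

# Device-generated samples around an (assumed) 10 uS target conductance.
g_samples = program_and_read(g_target=10e-6)

# Standardise to zero-mean, unit-variance GRNs ...
grn = (g_samples - g_samples.mean()) / g_samples.std()

# ... then scale/shift them into one weight posterior, here N(0.3, 0.1^2).
w_samples = 0.3 + 0.1 * grn
print(f"weight mean ~ {w_samples.mean():.3f}, std ~ {w_samples.std():.3f}")
```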
“…A key limitation of RRAM devices for hardware ANN implementation is the limited precision due to the program/read variations [68, 70]. Such variations can be turned into a precious feature in stochastic computing circuits, e.g., true random number generators [109, 110], Bayesian neural networks [108] and Monte Carlo Markov chains [111]. In a Bayesian neural network (figure 11(a)), synaptic parameters usually consist of random variables, which well match the random nature of RRAM conductance obtained without program-verify algorithms [108]. Figure 11(a) shows the methodology for describing a given probability distribution of weights with stochastic RRAMs: the distribution can be approximated by a combination of a number of Gaussian distributions, each obtained by programming RRAM devices with a fixed pulsed amplitude and time, without verify.…”
Section: Acceleration of Machine Learning Algorithms (mentioning; confidence: 99%)
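The methodology quoted above, approximating a weight distribution by a mixture of Gaussians where each component corresponds to one fixed programming condition, can be sketched as a small fitting problem. In the illustrative Python below, the toy target posterior, the component means and spreads, and the use of non-negative least squares to choose how much weight each programming condition contributes are all assumptions for illustration, not the procedure of the cited papers.

```python
# Minimal sketch: approximate a target weight distribution with a mixture of
# Gaussians, each standing for one RRAM programming condition (fixed pulse
# amplitude/width, no program-verify). Component parameters are assumed.
import numpy as np
from scipy.stats import norm
from scipy.optimize import nnls

x = np.linspace(-3, 3, 400)
# Toy target posterior over one synaptic weight.
target_pdf = 0.7 * norm.pdf(x, -0.5, 0.4) + 0.3 * norm.pdf(x, 1.0, 0.6)

# Each programming condition yields a Gaussian conductance spread (mu_k, sigma_k).
conditions = [(-1.0, 0.3), (-0.5, 0.3), (0.0, 0.3), (0.5, 0.3), (1.0, 0.3)]
basis = np.stack([norm.pdf(x, mu, sig) for mu, sig in conditions], axis=1)

# Non-negative mixture coefficients: roughly, how many devices to program
# under each condition to reproduce the target distribution.
coeffs, _ = nnls(basis, target_pdf)
approx_pdf = basis @ coeffs
print("mixture coefficients:", np.round(coeffs, 3))
print("L2 approximation error:", np.linalg.norm(approx_pdf - target_pdf))
```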
“…Renesas [102] Dynamically Reconfigurable Processor (DRP) technology is special-purpose hardware that accelerates image processing algorithms by as much as 10× or more. Recently, resistive memory-based Bayesian neural networks have been found applicable in edge inference hardware. In [103], Dalgaty et al. propose to use trained Bayesian models with probabilistic programming methods to overcome the inherently random process. Hence, the intrinsic cycle-to-cycle and device-to-device conductance variations are limited.…”
Section: Vendor (mentioning; confidence: 99%)
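As a rough illustration of the ex situ transfer idea referenced above, the sketch below maps posterior weight samples onto differential conductance pairs of a crossbar and averages predictions over device realisations. The conductance window, the differential mapping, and the fake posterior samples are assumptions made for this example; the actual training and programming strategy is the one described in the cited paper [103].

```python
# Minimal sketch: map posterior weight samples to differential conductance
# pairs and average crossbar predictions. Ranges and shapes are assumptions.
import numpy as np

rng = np.random.default_rng(1)
G_MIN, G_MAX = 1e-6, 100e-6  # assumed programmable conductance window (S)

def weights_to_conductances(w, w_max):
    """Differential mapping w ∝ G_pos - G_neg, a common crossbar convention."""
    scale = (G_MAX - G_MIN) / w_max
    g_pos = G_MIN + np.clip(w, 0, None) * scale
    g_neg = G_MIN + np.clip(-w, 0, None) * scale
    return g_pos, g_neg

# Stand-in for posterior samples from probabilistic training (e.g. MCMC):
# 20 samples of a 4x3 weight matrix.
posterior_samples = rng.normal(0.0, 0.5, size=(20, 4, 3))
w_max = np.abs(posterior_samples).max()

x = rng.normal(size=3)  # one input vector
preds = []
for w in posterior_samples:
    g_pos, g_neg = weights_to_conductances(w, w_max)
    # Analogue multiply-accumulate: output is proportional to w @ x.
    preds.append((g_pos - g_neg) @ x)
preds = np.array(preds)
print("predictive mean:", preds.mean(axis=0))
print("predictive std: ", preds.std(axis=0))
```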