2019
DOI: 10.1038/s41598-019-51814-5
Impact of Synaptic Device Variations on Classification Accuracy in a Binarized Neural Network

Abstract: Brain-inspired neuromorphic systems (hardware neural networks) are expected to be an energy-efficient computing architecture for solving cognitive tasks, which critically depend on the development of reliable synaptic weight storage (i.e., synaptic device). Although various nanoelectronic devices have successfully reproduced the learning rules of biological synapses through their internal analog conductance states, the sustainability of such devices is still in doubt due to the variability common to all nanoel…

Cited by 23 publications (10 citation statements)
References 33 publications
“…On the contrary, an off-chip (ex situ) training strategy is more sensitive to device variations since the training is performed using a software-based neural network where trained synaptic weights need to be transferred to the network [15]. In this case, the off-chip training accuracy can be largely affected if there are nonworking devices or a drift in device conductance programming [16]. Some of these device variations can be readily attributed to imperfections in the fabrication process [17] [18].…”
Section: Results
confidence: 99%
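The excerpt above describes why off-chip (ex situ) training is sensitive to device variations: weights trained in software must be transferred to physical devices, where nonworking devices and conductance-programming drift distort them. The effect can be illustrated with a minimal simulation sketch; all parameters (conductance range, stuck-device fraction, drift magnitude) are illustrative assumptions, not values from the cited papers.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical trained weights from a software network (off-chip / ex situ training).
w_trained = rng.uniform(-1.0, 1.0, size=1000)

# Map weights linearly onto device conductances in [g_min, g_max]
# (differential-pair mappings are omitted for brevity).
g_min, g_max = 1e-6, 1e-4
g_ideal = g_min + (w_trained + 1.0) / 2.0 * (g_max - g_min)

# Device variations introduced at transfer time (assumed magnitudes):
stuck_fraction = 0.02   # nonworking devices stuck at g_min
drift_sigma = 0.05      # relative conductance-programming drift
g_real = g_ideal * (1.0 + drift_sigma * rng.standard_normal(g_ideal.shape))
stuck = rng.random(g_ideal.shape) < stuck_fraction
g_real[stuck] = g_min

# Read back the effective weights and measure the transfer error.
w_real = 2.0 * (g_real - g_min) / (g_max - g_min) - 1.0
rmse = np.sqrt(np.mean((w_real - w_trained) ** 2))
print(f"weight-transfer RMSE: {rmse:.4f}")
```

Even a small fraction of stuck devices dominates the error here, since each one pulls its weight to the mapping's minimum regardless of the trained value, which is consistent with the excerpt's point that nonworking devices can largely affect off-chip training accuracy.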
“…This implies that the weight matrix is not saturated and cannot be optimized with the nonlinear weight modulation because of the abrupt weight transition by the occasional extraordinary training sample. These learning behaviors can be further eased by adopting the binarized neural network or off-chip learning method.…”
Section: Results and Discussion
confidence: 99%
“…These learning behaviors can be further eased by adopting the binarized neural network or off-chip learning method. 7,53 In addition, the effect of the weight window of the synaptic devices on the recognition rate is analyzed by introducing the V T variation to the weight map obtained at epoch = 50 for both devices as shown in Figure 6d with 20 trials. Since the synaptic current has a nonlinear relation with gate bias unlike two-terminal memristive devices, the current difference according to the V T variation is determined by the parabolic transfer curves in Figure 2b.…”
Section: Results and Discussion
confidence: 99%
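The excerpt above analyzes recognition-rate sensitivity by injecting V_T variation into a trained weight map over 20 trials, noting that for a three-terminal synaptic device the read current depends on gate bias through a parabolic transfer curve. A minimal sketch of that sensitivity analysis is below; the device constants, read bias, and variation magnitude are assumptions for illustration, and the quadratic I–V form is a generic above-threshold approximation rather than the cited paper's fitted curve.

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed device parameters for a gate-controlled synapse with a
# parabolic transfer curve I = k * (V_G - V_T)^2 above threshold.
k = 1e-4           # prefactor (A/V^2), assumed
v_g = 1.5          # read gate bias (V), assumed
v_t_nominal = 0.5  # nominal threshold voltage (V), assumed
sigma_vt = 0.05    # V_T variation (V), assumed

def read_current(v_t):
    # Clamp at zero for devices pushed below cutoff by variation.
    return k * np.maximum(v_g - v_t, 0.0) ** 2

# 20 trials of Gaussian V_T variation applied to 1000 synapses,
# mirroring the trial-based analysis described in the excerpt.
trials = 20
i_nominal = read_current(v_t_nominal)
spreads = []
for _ in range(trials):
    v_t = v_t_nominal + sigma_vt * rng.standard_normal(1000)
    spreads.append(np.std(read_current(v_t)) / i_nominal)

# Because I is quadratic in the overdrive (V_G - V_T), a fixed V_T spread
# maps to a relative current spread of roughly 2*sigma_vt/(V_G - V_T).
print(f"mean relative current spread over {trials} trials: {np.mean(spreads):.3f}")
```

This illustrates the excerpt's point that, unlike two-terminal memristive devices, the current difference induced by V_T variation is set by the shape of the transfer curve at the chosen read bias.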
“…2b. Here, it is preferable to obtain an RRAM with a large on/off ratio, taking into account the case where each state can be overlapped by the stochastic nature of the ion migration 33,34 . Assuming that the number of achievable states is the same in the RRAM, the large on/off ratio ensures a reasonable margin between the states despite the inherent variability.…”
Section: Results
confidence: 99%
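The margin argument in the excerpt above is simple arithmetic: for a fixed number of states, a larger on/off ratio widens the conductance window and therefore the spacing between adjacent states. A short sketch, assuming linearly spaced conductance levels (the state placement and level count are illustrative assumptions):

```python
def state_margin(on_off_ratio, n_states, g_on=1.0):
    """Spacing between adjacent conductance states, assuming n_states
    levels spaced linearly between g_off = g_on/on_off_ratio and g_on."""
    g_off = g_on / on_off_ratio
    return (g_on - g_off) / (n_states - 1)

# Same number of states, different on/off ratios: the larger ratio leaves a
# wider margin to absorb stochastic state overlap from ion migration.
for ratio in (10, 100):
    print(f"on/off = {ratio:4d}: margin = {state_margin(ratio, 8):.4f} (normalized to g_on)")
```

With 8 states, raising the on/off ratio from 10 to 100 grows the normalized window from 0.9 to 0.99, so each inter-state margin widens proportionally, which is the excerpt's rationale for preferring a large on/off ratio under inherent variability.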