2022
DOI: 10.1007/978-3-031-04580-6_27
Mitigating the Effects of RRAM Process Variation on the Accuracy of Artificial Neural Networks

Cited by 5 publications (2 citation statements)
References 24 publications
“…Therefore, unlike the quantization constraint, we should closely control the write variations in any future design for an acceptable basecaller. Fortunately, some previous works [22,37,100] propose mitigation techniques that, when combined, can provide us with a reasonable (e.g., ≤ 10%) write variation. From now on, we consider only up to 10% write variation (as defined in Section 2.3) in our evaluations.…”
Section: Accuracy for Write Variations
confidence: 99%
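The excerpt above bounds the write variation injected into the network during evaluation. As a minimal sketch of how such an evaluation is commonly set up, the snippet below perturbs a weight matrix with multiplicative Gaussian noise whose relative standard deviation matches the ≤ 10% bound; the function name and the choice of a Gaussian noise model are illustrative assumptions, not taken from the cited paper.

```python
import numpy as np

def inject_write_variation(weights, variation=0.10, rng=None):
    """Perturb each weight with multiplicative Gaussian noise.

    `variation` is the relative standard deviation of the programmed
    value (0.10 models the <= 10% write variation assumed in the
    excerpt). Name and noise model are illustrative assumptions.
    """
    rng = np.random.default_rng() if rng is None else rng
    noise = rng.normal(loc=1.0, scale=variation, size=weights.shape)
    return weights * noise

# Example: compare a clean and a perturbed linear-layer output.
rng = np.random.default_rng(0)
w = rng.standard_normal((4, 8))   # nominal (ideal) weights
x = rng.standard_normal(8)        # one input vector
w_noisy = inject_write_variation(w, variation=0.10, rng=rng)
print("clean:", w @ x)
print("noisy:", w_noisy @ x)
```

In such a setup, accuracy under variation is typically reported as an average over many noise draws, since a single perturbed copy of the weights can be unrepresentative.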
“…Mehmood et al. [39], [40] presented an energy-efficient fault-tolerant scheme to improve the accuracy of wireless body area networks (WBANs). Fritscher et al. [41] showed how fault-aware training affects a network's response to variability. In the present article, we study the properties of fault-tolerant resolving sets and fault-tolerant metric dimensions of three r-level wheel-related networks: the r-level anti-web wheel, gear, and helm networks.…”
Section: Introduction
confidence: 99%