2017 International Conference on ReConFigurable Computing and FPGAs (ReConFig)
DOI: 10.1109/reconfig.2017.8279793

Fault tolerance in neural networks: Neural design and hardware implementation

Cited by 8 publications (10 citation statements) | References: 46 publications
“…In a similar manner to software neural networks, neuromorphic architectures are densely interconnected with large fan-out to other neurons. Consequently, the architecture possesses increased redundancy compared to traditional computing architectures and can be fault tolerant if specifically designed that way, but this comes at a computational cost [2][3][4][5]. The way neuromorphic computers "learn" is through the use of self-tuning weights.…”
Section: Background: Neuromorphic Computing
Mentioning confidence: 99%
“…As a result, neuromorphic architectures have increased redundancy and a reduction in the memory bandwidth bottleneck compared to traditional computing architectures [2]. While research is limited on how single events affect the relatively new architectures, studies have previously been conducted on realizations in COTS (Commercial Off-The-Shelf) parts, FPGAs (Field Programmable Gate Arrays), GPUs (Graphics Processing Units), and software frameworks [3,[5][6][7]. Several works in the literature analyze errors in neural networks through fault injection simulations [3][4][6][7][8][9], radiation tests [3,[6][7][8], and pulsed-laser tests [10].…”
Section: Background: Neuromorphic Computing
Mentioning confidence: 99%
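The fault-injection simulations referenced in the excerpt above typically flip bits in the stored weights of a trained network and then measure the resulting accuracy loss. The sketch below is a minimal illustration of that idea, not code from the cited works; the helper names (`flip_bit`, `inject_faults`) and the assumption of NumPy float32 weight storage are mine.

```python
import numpy as np

def flip_bit(value: np.float32, bit: int) -> np.float32:
    """Flip one bit in the IEEE-754 representation of a float32 weight."""
    as_int = np.float32(value).view(np.uint32)
    flipped = as_int ^ np.uint32(1 << bit)
    return flipped.view(np.float32)

def inject_faults(weights: np.ndarray, n_faults: int, rng=None) -> np.ndarray:
    """Return a copy of `weights` with `n_faults` random single-bit upsets."""
    rng = np.random.default_rng() if rng is None else rng
    faulty = weights.astype(np.float32)          # work on a float32 copy
    flat = faulty.ravel()                        # view into the same buffer
    for idx in rng.choice(flat.size, size=n_faults, replace=False):
        bit = int(rng.integers(0, 32))           # pick a bit position 0..31
        flat[idx] = flip_bit(flat[idx], bit)
    return faulty
```

A typical experiment would run inference with the returned faulty weights and compare the accuracy against the fault-free baseline, sweeping `n_faults` to estimate sensitivity.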
“…These techniques can potentially be customized to mitigate faults in NNs as well, although with timing, area, or power costs. Techniques adapted for NNs are also surveyed in [40], such as explicit redundancy, retraining, and modifying the learning/inference phases. In this paper, instead of costly fault-tolerance operations, we present an application-aware fault mitigation for NNs that does not require exploiting any redundant data bits or incurring considerable additional overhead.…”
Section: Fault Mitigation on NNs
Mentioning confidence: 99%
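Explicit redundancy, one of the techniques the excerpt says is surveyed in [40], is often realized as triple modular redundancy (TMR) on the weight memory. The following is a minimal sketch under that assumption, not the cited paper's method; the names `tmr_store` and `tmr_read` are hypothetical. It keeps three copies of the weights and recovers each stored bit by majority vote, which masks any single-bit upset in one copy.

```python
import numpy as np

def tmr_store(weights: np.ndarray) -> list:
    """Keep three independent copies of the weight memory (explicit redundancy)."""
    w = weights.astype(np.float32)
    return [w.copy(), w.copy(), w.copy()]

def tmr_read(copies: list) -> np.ndarray:
    """Bitwise majority vote across the three copies; corrects single-bit upsets."""
    a, b, c = (w.view(np.uint32) for w in copies)
    voted = (a & b) | (a & c) | (b & c)   # per-bit 2-out-of-3 majority
    return voted.view(np.float32)
```

The cost is roughly 3x the weight storage plus the voting step on every read, which is exactly the kind of overhead the quoted paper states its application-aware mitigation avoids.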