2017 IEEE International Parallel and Distributed Processing Symposium (IPDPS)
DOI: 10.1109/ipdps.2017.66
When Neurons Fail

Abstract: We view a neural network as a distributed system whose neurons can fail independently, and we evaluate its robustness in the absence of any (recovery) learning phase. We give tight bounds on the number of neurons that can fail without harming the result of a computation. To determine our bounds, we leverage the fact that neural activation functions are Lipschitz-continuous. Our bound is on a quantity we call the Forward Error Propagation, capturing how much error is propagated by a neural network …
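The failure model the abstract describes (neurons failing independently, with Lipschitz-continuous activations limiting how far the error can propagate) can be illustrated with a small simulation. The following is a minimal sketch, not the paper's construction: the network, the `forward` function, and the choice of crash failures (a failed neuron outputs 0) are assumptions made here for illustration. ReLU is used because its Lipschitz constant is 1.

```python
import random

def relu(x):
    return max(0.0, x)

def forward(x, weights, failed=frozenset()):
    """Evaluate the network layer by layer; neurons listed in `failed`
    (as (layer_index, neuron_index) pairs) emit 0, modeling a crash."""
    acts = x
    for layer_idx, layer in enumerate(weights):
        nxt = []
        for neuron_idx, w in enumerate(layer):
            if (layer_idx, neuron_idx) in failed:
                nxt.append(0.0)  # crashed neuron contributes nothing
            else:
                nxt.append(relu(sum(wi * ai for wi, ai in zip(w, acts))))
        acts = nxt
    return acts

random.seed(0)
# A tiny feed-forward net: input dim 3, two layers of 4 neurons each,
# with weights drawn uniformly from [-1, 1].
dims = [3, 4, 4]
weights = [[[random.uniform(-1, 1) for _ in range(dims[i])]
            for _ in range(dims[i + 1])] for i in range(len(dims) - 1)]

x = [0.5, -0.2, 0.8]
healthy = forward(x, weights)
faulty = forward(x, weights, failed={(0, 1)})  # one neuron in layer 0 fails

# How far the output drifts when a single neuron crashes; the paper's
# Forward Error Propagation quantity bounds this kind of deviation.
drift = max(abs(h - f) for h, f in zip(healthy, faulty))
print(f"max output deviation with one failed neuron: {drift:.4f}")
```

Because ReLU is 1-Lipschitz, the perturbation a failed neuron injects at one layer grows at most proportionally to the weight magnitudes of later layers, which is the intuition behind bounding error propagation by Lipschitz constants.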


Cited by 28 publications (26 citation statements). References 20 publications.
“…We use the same model as in [34]. We view a neural network as a distributed system comprised of computing nodes (neurons) and communicating channels (synapses):…”
Section: A. Main Components (mentioning; confidence: 99%)
“…The failure of a synapse is also independent of that of other synapses and neurons. Based on [34], synapse failures can be abstracted as mathematically equivalent to failures of related neurons, so our experimental work focuses on neuron failures.…”
Section: Channels (mentioning; confidence: 99%)