2021
DOI: 10.48550/arxiv.2110.07682
Preprint

Sound and Complete Neural Network Repair with Minimality and Locality Guarantees

Abstract: We present a novel methodology for repairing neural networks that use ReLU activation functions. Unlike existing methods that rely on modifying the weights of a neural network which can induce a global change in the function space, our approach applies only a localized change in the function space while still guaranteeing the removal of the buggy behavior. By leveraging the piecewise linear nature of ReLU networks, our approach can efficiently construct a patch network tailored to the linear region where the b…
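To make the abstract's key idea concrete, here is a minimal Python sketch (not the paper's implementation) of how a ReLU network's activation pattern identifies the linear region containing a buggy input, and how a correction can be restricted to that region so behavior elsewhere is untouched. The names `relu_forward`, `same_linear_region`, `patched_forward`, and the `patch` callable are hypothetical; the paper's actual method constructs a patch network with continuity, minimality, and locality guarantees, which this sketch does not attempt.

```python
# Hypothetical sketch, assuming a plain ReLU MLP given as weight/bias lists.
# Not the paper's algorithm; it only illustrates "localized to a linear region".
import numpy as np

def relu_forward(weights, biases, x):
    """Run a ReLU MLP and record its activation pattern (on/off per neuron)."""
    pattern, h = [], x
    for W, b in zip(weights[:-1], biases[:-1]):
        pre = W @ h + b
        pattern.append(pre > 0)          # which ReLUs fire at this input
        h = np.maximum(pre, 0)
    out = weights[-1] @ h + biases[-1]   # final layer is affine
    return out, pattern

def same_linear_region(p1, p2):
    """Inputs share a linear region iff their activation patterns agree."""
    return all(np.array_equal(a, b) for a, b in zip(p1, p2))

def patched_forward(weights, biases, buggy_pattern, patch, x):
    """Add a correction only inside the buggy input's linear region,
    leaving the network's output unchanged on every other region."""
    out, pattern = relu_forward(weights, biases, x)
    if same_linear_region(pattern, buggy_pattern):
        out = out + patch(x)             # localized change in function space
    return out
```

Because a ReLU network is affine on each activation region, gating the correction on the activation pattern confines the edit to that region, which is the locality property the abstract describes.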

Cited by 1 publication (1 citation statement)
References 16 publications
“…NNrepair [161] uses fault localisation to identify suspicious weights, and then uses constraint solving to modify them marginally. There are provable model repairs like PRDNN [162], REASSURE [163], and Minimal Modifications of DNNs (MMDNN) [164]. However, these approaches are not scalable to large DNNs, only support ReLU activation, and often do not support polytope repair or multilayer repairs.…”
Section: Model Repairs
confidence: 99%