2021
DOI: 10.48550/arxiv.2112.04036
Preprint

DeepDiagnosis: Automatically Diagnosing Faults and Recommending Actionable Fixes in Deep Learning Programs

Abstract: Deep Neural Networks (DNNs) are used in a wide variety of applications. However, as in any software application, DNN-based apps are afflicted with bugs. Previous work observed that DNN bug-fix patterns differ from traditional bug-fix patterns. Furthermore, these buggy models are non-trivial to diagnose and fix because their errors are inexplicit and admit several candidate fixes. To support developers in locating and fixing bugs, we propose DeepDiagnosis, a novel debugging approach that localizes the faults, report…

Cited by 1 publication (1 citation statement)
References 14 publications
“…Keeper [30] is a tool for testing software using cognitive ML APIs that utilizes pseudo-inverse functions, symbolic execution, and automatic generation of test inputs to fulfill branch coverage and identify root causes of bugs. DeepDiagnosis [32] is an approach for debugging DNN programs by identifying and localizing faults, such as an exploding tensor or loss not decreasing during training, as well as suggesting fixes for the root causes of the faults. Another work [28] proposes enabling ML libraries to search for hyperparameter configurations that encourage learning a fair model, given the dataset.…”
Section: Machine Learning Libraries (mentioning, confidence: 99%)
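To make the failure symptoms named in that citation statement concrete, the following is a minimal, hypothetical sketch (not DeepDiagnosis's actual implementation) of a Keras callback that watches a training run for an exploding or NaN loss and for a loss that stops decreasing. The class name, the `patience` threshold, and the suggested remedies are assumptions chosen for illustration.

```python
# Illustrative sketch only: flags two training-time fault symptoms discussed
# above (exploding/NaN loss, loss not decreasing). Not the DeepDiagnosis tool.
import math
import tensorflow as tf


class TrainingFaultMonitor(tf.keras.callbacks.Callback):
    def __init__(self, patience=5):
        super().__init__()
        self.patience = patience      # epochs to tolerate without improvement
        self.best_loss = math.inf
        self.stalled_epochs = 0

    def on_epoch_end(self, epoch, logs=None):
        loss = (logs or {}).get("loss")
        if loss is None:
            return
        # Symptom 1: numerically exploding or invalid loss (NaN / inf).
        if not math.isfinite(loss):
            print(f"[fault] epoch {epoch}: loss is {loss}; consider lowering "
                  "the learning rate or normalizing the input data.")
            self.model.stop_training = True
            return
        # Symptom 2: loss has not decreased for `patience` consecutive epochs.
        if loss < self.best_loss:
            self.best_loss = loss
            self.stalled_epochs = 0
        else:
            self.stalled_epochs += 1
            if self.stalled_epochs >= self.patience:
                print(f"[fault] epoch {epoch}: loss has not improved for "
                      f"{self.patience} epochs; check the loss function, "
                      "learning rate, or last-layer activation.")
                self.model.stop_training = True


# Usage: model.fit(x, y, epochs=50, callbacks=[TrainingFaultMonitor()])
```

A dedicated debugger such as DeepDiagnosis goes further by localizing the fault to a layer or configuration choice and recommending a fix, whereas this sketch only detects and reports the symptom.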