2021
DOI: 10.48550/arxiv.2106.09177
Preprint

Insights into Data through Model Behaviour: An Explainability-driven Strategy for Data Auditing for Responsible Computer Vision Applications

Alexander Wong,
Adam Dorfman,
Paul McInnis
et al.

Abstract: In this study, we take a departure and explore an explainability-driven strategy for data auditing, where actionable insights into the data at hand are discovered through the eyes of quantitative explainability on the behaviour of a dummy model prototype when exposed to data. We demonstrate this strategy by auditing two popular medical benchmark datasets, and discover hidden data quality issues that lead deep learning models to make predictions for the wrong reasons. The actionable insights gained from this exp…
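The abstract's core idea, probing how a dummy model prototype behaves on the data and using a quantitative explanation of that behaviour to surface hidden data issues, can be illustrated with a minimal occlusion-based sketch. Everything below (the mean-intensity dummy model, the patch size, and the border-concentration heuristic) is an illustrative assumption, not the authors' actual method:

```python
import numpy as np

def dummy_model(img: np.ndarray) -> float:
    """Stand-in prototype model: mean intensity as a toy 'prediction'."""
    return float(img.mean())

def occlusion_map(img: np.ndarray, patch: int = 4) -> np.ndarray:
    """Importance of each patch = change in output when it is occluded."""
    h, w = img.shape
    base = dummy_model(img)
    imp = np.zeros((h // patch, w // patch))
    for i in range(0, h, patch):
        for j in range(0, w, patch):
            occluded = img.copy()
            occluded[i:i + patch, j:j + patch] = 0.0
            imp[i // patch, j // patch] = abs(base - dummy_model(occluded))
    return imp

def audit(img: np.ndarray, patch: int = 4, border_threshold: float = 0.5) -> bool:
    """Flag a sample if most explanation mass sits on the image border,
    hinting the model keys on irrelevant artefacts (e.g. embedded markers)
    rather than the centrally located anatomy of interest."""
    imp = occlusion_map(img, patch)
    total = imp.sum()
    if total == 0:
        return False
    border_frac = 1.0 - imp[1:-1, 1:-1].sum() / total
    return bool(border_frac > border_threshold)

# A clean image (signal in the centre) vs. one whose only signal is a
# bright corner artefact mimicking an embedded text marker.
clean = np.zeros((16, 16)); clean[6:10, 6:10] = 1.0
artefact = np.zeros((16, 16)); artefact[0:4, 0:4] = 10.0
print(audit(clean), audit(artefact))  # → False True
```

Samples flagged this way would then be inspected by hand, which is the "actionable insight" step the abstract describes: the explanation does not fix the data, it points at which samples (and which image regions) deserve scrutiny.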

Cited by 1 publication (1 citation statement)
References: 12 publications
“…The proposed TinyDefectNet was audited using an explainability-driven performance validation strategy to gain deeper insights into its decision-making behaviour when conducting visual quality inspection and ensure that its decisions are driven by relevant visual indicators associated with surface defects. In particular, we leverage the quantitative explainability strategy proposed in [18], which has been shown to provide quantitative explanations that better reflect decision-making processes than other approaches in the literature, and has been shown to be effective not only at model auditing [14] but also at identifying hidden data issues [19]. An example of an input surface image and the corresponding quantitative explanation are shown in Figure 2.…”
Section: Explainability-driven Performance Validation