2020
DOI: 10.48550/arxiv.2004.07511
Preprint

Explainable Image Classification with Evidence Counterfactual

Cited by 8 publications (10 citation statements)
References 0 publications
“…Also note that both SEDC and CFproto are not limited to tabular data. For other applications such as explaining images, these algorithms can also be useful [40,22].…”
Section: Explanation Requirements (mentioning)
confidence: 99%
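SEDC, referenced in the statement above, searches for an "evidence counterfactual": a small set of features (for images, segments) whose removal changes the predicted class. The sketch below illustrates that removal-based idea against a generic black-box classifier; the predict_fn interface, the superpixel segmentation, the graying-out replacement, and the max_removed cap are assumptions made here for illustration, not the authors' exact procedure (and CFproto, a prototype-guided method, is not covered by it).

# Minimal illustrative sketch of removal-based counterfactual search for images:
# segments are "removed" (replaced by a neutral value) until the prediction flips.
import numpy as np

def evidence_counterfactual(predict_fn, image, segments,
                            fill_value=0.0, max_removed=10):
    """Greedy search for a set of segments whose removal changes the prediction.

    predict_fn : callable mapping an image array to a 1-D class-probability vector
    image      : H x W x C array
    segments   : H x W integer array of segment ids (e.g. from a superpixel method)
    Returns the ids of the removed segments, or None if no class change is found.
    """
    original_class = int(np.argmax(predict_fn(image)))
    removed, work = set(), image.copy()

    for _ in range(max_removed):
        best_id, best_prob = None, np.inf
        for seg_id in np.unique(segments):
            if seg_id in removed:
                continue
            candidate = work.copy()
            candidate[segments == seg_id] = fill_value    # gray out one segment
            prob = predict_fn(candidate)[original_class]  # probability of the original class
            if prob < best_prob:                          # keep the most class-reducing segment
                best_id, best_prob = seg_id, prob
        if best_id is None:
            return None
        removed.add(best_id)
        work[segments == best_id] = fill_value
        if int(np.argmax(predict_fn(work))) != original_class:
            return removed                                # evidence counterfactual found
    return None

Each iteration greedily removes the segment that most reduces the probability of the originally predicted class, which is also why such model-agnostic searches can require many model queries (a point raised in the Introduction citation below).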
“…Counterfactuals provide explanations for chosen decisions by describing what changes on the input would lead to an alternative prediction while minimizing the magnitude of the changes to preserve the fidelity, which is identical to the process of generating adversarial examples [6]. Unfortunately, owing to the multidimensional geometric information that is unacceptable to the human brain, existing image-oriented approaches addressed the counterfactual explanations only at the semantic level [11,32].…”
Section: Towards Explainable PC Models (mentioning)
confidence: 99%
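The counterfactual objective described in this statement, and its connection to adversarial examples, can be written compactly. The formulation below is a generic one chosen here for illustration (f for the classifier, d for a distance, \ell for a loss), not a formula taken from the cited works.

% A counterfactual x' is the closest point to x, under a distance d,
% that receives a different prediction:
x' = \arg\min_{z} \; d(x, z) \quad \text{subject to} \quad f(z) \neq f(x)

% Relaxed, penalized form, structurally the same objective used to
% generate adversarial examples toward a target class t \neq f(x):
x' = \arg\min_{z} \; \ell\big(f(z), t\big) + \lambda \, d(x, z)

The distance term keeps the counterfactual close to the original input (preserving fidelity, in the wording above), while the loss term pushes the prediction toward an alternative class; \lambda sets how much change is tolerated.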
“…For example, lips, eyes, and maybe cheeks would change if makeup was applied to a face but not hair color or the background. Recently, many methods for generating counterfactual examples have been proposed [1,6,7,10,11,13,23,36,37,40]. A common drawback for all the methods is that they need to query the model under consideration many times.…”
Section: Introduction (mentioning)
confidence: 99%
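The "many queries" drawback noted in this statement follows directly from how model-agnostic searches such as the sketch above operate: every candidate perturbation costs one call to the model. A tiny wrapper like the following (hypothetical, not from the cited works) makes that cost visible.

# Illustrative helper (hypothetical): wrap a black-box prediction function and
# count how often the explanation method queries it.
class QueryCounter:
    def __init__(self, predict_fn):
        self._predict_fn = predict_fn
        self.calls = 0

    def __call__(self, image):
        self.calls += 1
        return self._predict_fn(image)

# Usage with the evidence_counterfactual sketch above:
#   counting_fn = QueryCounter(model_predict)   # model_predict is assumed
#   evidence_counterfactual(counting_fn, image, segments)
#   print(counting_fn.calls)                    # number of model queries made

With the greedy search sketched earlier, each removal step queries the model roughly once per remaining segment, so the total query count grows quickly with the number of segments.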