2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW)
DOI: 10.1109/cvprw53098.2021.00199
Towards Domain-Specific Explainable AI: Model Interpretation of a Skin Image Classifier using a Human Approach

Abstract: Machine Learning models have started to outperform medical experts in some classification tasks. Meanwhile, the question of how these classifiers produce certain results is attracting increasing research attention. Current interpretation methods provide a good starting point for investigating such questions, but they still largely lack a connection to the problem domain. In this work, we present how explanations of an AI system for skin image analysis can be made more domain-specific. We apply the synthesis o…

Cited by 19 publications (9 citation statements)
References 23 publications
“…A DNN was used as a skin image classifier, with LIME and the ABCD-rule for model explainability [212]. The ABCD-rule of dermoscopy is known to outperform other methods [212].…”
Section: Optical Imaging 4.5.1 Dermatology
confidence: 99%
“…For explainability, the LIME model was fused with a human medical algorithm previously introduced by the authors [212].…”
Section: Optical Imaging 4.5.1 Dermatology
confidence: 99%
“…Based on the work of Stieler et al. (2021), it can be presumed that changes such as rotation and shift of the skin lesion should not change the model's prediction. On the other hand, changing the color and boundaries may change the features of the lesion.…”
Section: Relevance of Image Features to Correctness of Prediction
confidence: 99%
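The robustness claim above can be probed directly: apply label-preserving transformations and check that the classifier's output is unchanged. The sketch below is a minimal illustration in which the `predict` function is a hypothetical stand-in for a trained skin-lesion model (it scores an image by mean intensity, which is invariant to rotation and cyclic shift by construction); a real check would call the actual DNN on dermoscopy images.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical stand-in for a trained skin-lesion classifier: scores an
# image by its mean intensity. This statistic is unchanged by rotation
# and cyclic shift, so those invariance checks pass by construction.
def predict(img):
    return float(img.mean())

lesion = rng.random((8, 8))          # toy 8x8 grayscale "lesion" image
base = predict(lesion)

rotated = np.rot90(lesion)               # rotation: should not change output
shifted = np.roll(lesion, 2, axis=1)     # shift: should not change output
recolored = np.clip(lesion * 0.5, 0, 1)  # color change: may change output

assert abs(predict(rotated) - base) < 1e-9
assert abs(predict(shifted) - base) < 1e-9
```

In practice the interesting case is the last one: a color change alters the statistic the model relies on, so `predict(recolored)` differs from `base`, mirroring the citing paper's distinction between transformations that should and should not affect the prediction.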
“…The authors of the paper [33] proposed a toolbox called "Neuroscope" that addresses this issue by providing state-of-the-art visualization algorithms as well as freshly modified methods for semantic segmentation with CNNs. One of the approaches [34] proposed a domain-specific explanation system for skin image analysis. The authors applied Local Interpretable Model-Agnostic Explanations (LIME) and presented results using a Deep Neural Network (DNN) based image classifier, but LIME remains unused in the field of deepfakes.…”
Section: Introduction
confidence: 99%
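As a rough illustration of how LIME produces such explanations, the sketch below implements its core idea from scratch: switch interpretable regions (superpixels) of an image on and off, query a black-box classifier on each perturbation, and fit a linear surrogate whose weights rank the regions by importance. All names here (`black_box`, the 8x8 toy image, the quadrant segmentation) are hypothetical stand-ins, not the authors' pipeline or the `lime` package API.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical black-box "classifier": scores an image by the mean
# intensity of its top-left quadrant (a stand-in for a trained DNN).
def black_box(img):
    return float(img[:4, :4].mean())

# Toy 8x8 grayscale image split into 4 superpixels (the quadrants).
image = rng.random((8, 8))
segments = np.zeros((8, 8), dtype=int)
segments[:4, 4:] = 1
segments[4:, :4] = 2
segments[4:, 4:] = 3
n_segments = 4

# LIME core loop: randomly switch superpixels on/off and record the
# black-box output for each perturbed image.
masks = rng.integers(0, 2, size=(200, n_segments))
preds = np.array([black_box(image * m[segments]) for m in masks])

# Fit a linear surrogate (least squares with an intercept); its
# weights rank the superpixels by importance for this prediction.
design = np.column_stack([np.ones(len(masks)), masks])
coef, *_ = np.linalg.lstsq(design, preds, rcond=None)
importance = coef[1:]  # one weight per superpixel
```

Since the toy classifier only reads the top-left quadrant, `importance.argmax()` identifies superpixel 0 as dominant; the full method additionally weights samples by proximity to the original image and segments with an algorithm such as quickshift.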