2021 IEEE 21st International Conference on Bioinformatics and Bioengineering (BIBE)
DOI: 10.1109/bibe52308.2021.9635541
A Novel Local Ablation Approach for Explaining Multimodal Classifiers

Cited by 13 publications (13 citation statements, all classified as mentioning; citing works published 2021–2024). References 9 publications.
“…Some methods have been developed that could be used for estimating the degree of confidence in an explanation. These methods chiefly involve repeatedly perturbing data samples and examining the effect of the perturbations on model performance or classification probabilities [1], [2]. Unfortunately, the utility of perturbation approaches can be reduced by high-dimensional data spaces [3], and perturbation approaches can produce out-of-distribution samples that make explanations unreliable [4].…”
Section: Introduction (mentioning; confidence: 99%)
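To make the repeated-perturbation idea in this excerpt concrete, here is a minimal sketch of one way such a confidence estimate could be computed. It assumes a scikit-learn-style classifier exposing predict_proba; the function name, the Gaussian noise model, and the repeat count are illustrative assumptions rather than details taken from the cited methods [1], [2].

```python
import numpy as np

def perturbation_importance(clf, x, n_repeats=30, noise_scale=0.1, seed=None):
    """Per-feature importance for one sample via repeated perturbation.

    Each feature is perturbed n_repeats times with Gaussian noise, and the
    drop in the predicted probability of the originally predicted class is
    recorded. The mean drop acts as an importance score; the standard
    deviation across repeats gives a rough confidence estimate for it.
    """
    rng = np.random.default_rng(seed)
    base = clf.predict_proba(x[None, :])[0]
    target = int(np.argmax(base))               # originally predicted class
    mean_drop = np.zeros(x.shape[0])
    std_drop = np.zeros(x.shape[0])
    for j in range(x.shape[0]):
        drops = np.empty(n_repeats)
        for r in range(n_repeats):
            x_pert = x.copy()
            x_pert[j] += rng.normal(0.0, noise_scale)   # perturb feature j only
            drops[r] = base[target] - clf.predict_proba(x_pert[None, :])[0][target]
        mean_drop[j] = drops.mean()
        std_drop[j] = drops.std()
    return mean_drop, std_drop
```

Note that a large noise_scale risks producing exactly the out-of-distribution samples the excerpt warns about [4], so in practice the noise model should be matched to the data distribution.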
“…As a result, most studies have not used explainability (Zhang et al., 2011; Kwon et al., 2018; Niroshana et al., 2019; Phan et al., 2019; Wang et al., 2020; Li et al., 2021), which is concerning because transparency is increasingly required to assist with model development and physician decision making (Sullivan and Schweikart, 2019). As such, more multimodal explainability methods need to be developed (Lin et al., 2019; Mellem et al., 2020; Ellis et al., 2021a, b, c, d). In this study, we use automated sleep stage classification as a testbed for the development of multimodal explainability methods.…”
Section: Introduction (mentioning; confidence: 99%)
“…Because of this, they can also be analyzed on a subject-specific level, which paves the way for the identification of personalized biomarkers. Furthermore, local explanations can be used to examine the degree to which demographic and clinical variables affect the patterns learned by a classifier for specific classes and features (Ellis et al., 2021c), a capacity that has not previously been exploited in multimodal classification. Local methods have been applied in a couple of multimodal classification studies.…”
Section: Introduction (mentioning; confidence: 99%)
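As a sketch of how such a demographic analysis could look (a hypothetical illustration, not the procedure from Ellis et al., 2021c), one can take per-sample local importance values for a feature or modality and test whether they differ between two demographic or clinical groups:

```python
import numpy as np
from scipy import stats

def importance_by_group(local_importance, group_labels):
    """Compare per-sample (local) importance values between two groups.

    local_importance: (n_samples,) importance of one feature or modality,
                      one value per sample or subject.
    group_labels:     (n_samples,) binary demographic or clinical label,
                      e.g. 0 = control, 1 = patient.
    """
    local_importance = np.asarray(local_importance)
    group_labels = np.asarray(group_labels)
    g0 = local_importance[group_labels == 0]
    g1 = local_importance[group_labels == 1]
    # Welch's t-test as a simple screen for group differences in the
    # patterns the classifier relies on.
    t_stat, p_val = stats.ttest_ind(g0, g1, equal_var=False)
    return {"mean_group0": float(g0.mean()),
            "mean_group1": float(g1.mean()),
            "t": float(t_stat), "p": float(p_val)}
```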
“…While this application can improve model performance, applying traditional explainability methods to raw time-series samples makes it very difficult to know what time or frequency features are extracted by classifiers and to draw global conclusions about the importance of extracted features (Sturm et al., 2016). It should be noted that this difficulty does not apply to identifying spatial importance (Sturm et al., 2016) or modality importance (Ellis et al., 2021a, b, f, 2022) in multichannel or multimodal classification, respectively. However, this difficulty is applicable when trying to understand the temporal and spectral features extracted by classifiers.…”
Section: Introduction (mentioning; confidence: 99%)
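For the modality-importance case the excerpt contrasts with temporal and spectral features, a minimal zero-out ablation sketch is shown below. It assumes a Keras-style model.predict over arrays shaped (samples, time, channels); the modality-to-channel mapping and the use of zeros as the ablation value are illustrative assumptions, not the exact procedure of the cited papers.

```python
import numpy as np

def modality_ablation_importance(model, X, y, modality_channels):
    """Global modality importance via zero-out ablation.

    Each modality's channels are replaced with zeros and the resulting
    drop in accuracy is recorded; larger drops indicate modalities the
    classifier relies on more heavily.
    """
    def accuracy(data):
        preds = np.argmax(model.predict(data), axis=-1)
        return float(np.mean(preds == y))

    base = accuracy(X)
    importance = {}
    for name, channels in modality_channels.items():
        X_abl = X.copy()
        X_abl[:, :, channels] = 0.0          # ablate this modality only
        importance[name] = base - accuracy(X_abl)
    return importance

# Example mapping for (samples, time, channels) sleep data:
# modality_channels = {"EEG": slice(0, 1), "EOG": slice(1, 2), "EMG": slice(2, 3)}
```

The same loop, applied to a single sample rather than the whole test set, yields the local (per-sample) modality importance that the paper's title refers to.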