2021
DOI: 10.1101/2021.06.10.447986
Preprint

A Novel Local Ablation Approach for Explaining Multimodal Classifiers

Abstract: With the growing use of multimodal data for deep learning classification in healthcare research, more studies have begun to present explainability methods for insight into multimodal classifiers. Among these studies, few have utilized local explainability methods, which could provide (1) insight into the classification of each sample and (2) an opportunity to better understand the effects of latent variables within datasets (e.g., medication of subjects in electrophysiology data). To the best of our knowledge, …

Cited by 6 publications (4 citation statements)
References 13 publications
“…We introduce a global ablation approach that is uniquely adapted for the electrophysiology domain (Ellis et al., 2021b). We then present a local ablation approach (Ellis et al., 2021c) and show how GBFA methods can be used for local insight into multimodal classifiers (Ellis et al., 2021a). With our local methods, we identify subject-level differences in modality importance that support the viability of the methods for personalized biomarker identification.…”
Section: Limitations Of Global Explanations and Proposal Of Novel Loc...
confidence: 95%
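As a rough illustration of what a sample-level (local) ablation of modality importance could look like, the sketch below zeroes out one modality at a time for a single sample and records the change in the predicted probability of a target class. The `predict_proba` interface, the zero-replacement scheme, and all names here are assumptions made for illustration; the cited papers may use a different ablation or replacement strategy.

```python
import numpy as np

def local_modality_ablation(predict_proba, modalities, target_class):
    """Per-sample modality importance via ablation.

    predict_proba : callable taking a list of modality arrays for one sample
                    and returning class probabilities (assumed interface,
                    not the cited implementation).
    modalities    : list of NumPy arrays, one array per modality.
    target_class  : index of the class whose probability is tracked.
    """
    baseline = predict_proba(modalities)[target_class]
    importance = []
    for m in range(len(modalities)):
        ablated = [x.copy() for x in modalities]
        ablated[m] = np.zeros_like(ablated[m])   # zero-ablate one modality
        importance.append(baseline - predict_proba(ablated)[target_class])
    return np.array(importance)                  # larger = modality mattered more

# Toy usage with a stand-in "classifier" that scores the sum of modality means.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    sample = [rng.normal(size=100), rng.normal(loc=2.0, size=50)]

    def toy_predict_proba(mods):
        score = sum(m.mean() for m in mods)
        p1 = 1.0 / (1.0 + np.exp(-score))        # logistic squashing
        return np.array([1.0 - p1, p1])

    print(local_modality_ablation(toy_predict_proba, sample, target_class=1))
```

Because the score is computed per sample, stacking these vectors across samples is what enables the subject-level comparisons of modality importance described in the quoted passage.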
“…Because of this, they can also be analyzed on a subject-specific level, which paves the way for the identification of personalized biomarkers. Furthermore, local explanations can be used to examine the degree to which demographic and clinical variables affect the patterns learned by a classifier for specific classes and features (Ellis et al., 2021c), a capability that has not previously been exploited in multimodal classification. Local methods have been applied in a couple of multimodal classification studies.…”
Section: Limitations Of Global Explanations and Proposal Of Novel Loc...
confidence: 99%
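To make the point about demographic and clinical variables concrete, a minimal sketch (assuming per-sample importance scores such as those produced above, and a hypothetical binary variable like medication status) could simply compare mean local importance between the two groups; a formal statistical test would normally follow.

```python
import numpy as np

def compare_importance_by_group(importance, group_mask):
    """Split per-sample importance scores by a binary clinical/demographic
    variable and return the mean importance per modality in each group.

    importance : array of shape (n_samples, n_modalities), e.g. stacked
                 outputs of a local ablation method.
    group_mask : boolean array of shape (n_samples,), e.g. medicated vs.
                 unmedicated subjects (hypothetical variable).
    """
    importance = np.asarray(importance)
    group_mask = np.asarray(group_mask, dtype=bool)
    return importance[group_mask].mean(axis=0), importance[~group_mask].mean(axis=0)
```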
“…While this application can improve model performance, applying traditional explainability methods to raw time-series samples makes it very difficult to know what time or frequency features are extracted by classifiers and to draw global conclusions about the importance of extracted features (9). It should be noted that this difficulty is not applicable to identifying spatial importance (10) or modality importance (11-14) in multichannel or multimodal classification, respectively. However, this difficulty is applicable when trying to understand the temporal and spectral features extracted by classifiers.…”
Section: Introduction
confidence: 99%
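One generic way to probe the spectral (rather than spatial or modality-level) importance mentioned here is to ablate one frequency band at a time from a single time-series sample and watch the prediction change. The band definitions, sampling rate, and `predict_proba` callable below are illustrative assumptions, not the procedure used in the cited references.

```python
import numpy as np

def band_ablation_importance(predict_proba, x, target_class, fs, bands):
    """Spectral importance for one 1-D time-series sample.

    Zeroes one frequency band at a time in the Fourier domain and records
    the drop in the target-class probability (illustrative sketch only).

    fs    : sampling rate in Hz (assumed known).
    bands : list of (low_hz, high_hz) tuples, e.g. canonical EEG bands.
    """
    baseline = predict_proba(x)[target_class]
    freqs = np.fft.rfftfreq(x.shape[-1], d=1.0 / fs)
    importance = []
    for lo, hi in bands:
        spec = np.fft.rfft(x)
        spec[(freqs >= lo) & (freqs < hi)] = 0.0        # remove one band
        x_ablated = np.fft.irfft(spec, n=x.shape[-1])   # back to time domain
        importance.append(baseline - predict_proba(x_ablated)[target_class])
    return np.array(importance)
```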