2020
DOI: 10.1007/s00234-020-02465-1
Implementation of model explainability for a basic brain tumor detection using convolutional neural networks on MRI slices

Cited by 49 publications (17 citation statements)
References 9 publications
“…The use of Grad-CAM in this study provided important information and led us to evaluate the performance of the model on non-contrast-enhanced slices. Explainable AI has seen increased use in machine learning in general as well as machine learning in medicine in particular and has been shown to benefit the development of models in various ways [30]. Furthermore, understanding the decisions of a model is important to enable the adoption of a model by clinicians as they are less likely to use something that is deemed a "black box" [31].…”
Section: Discussion
confidence: 99%
“…The interpretability was evaluated with Grad-CAM and Guided Backpropagation for validating the trustworthiness [150]. A pre-trained ResNet50 model was used on IXI dataset of MRI slices for a multiclass classification into a glioblastoma, vestibular schwannoma, or no tumor [152]. Grad-CAM was used for visualisations and helped to identify the tumor location.…”
Section: Magnetic Resonance Imaging (MRI)
confidence: 99%
“…Several XAI methods have been previously proposed for natural image tasks, while little attention has been paid to explain brain imaging applications [ 18 ]. For brain cancer classification, Windisch et al [ 28 ] applied 2D Grad-CAM to generate heatmaps indicating which areas of the input MRI made the classifier decide on the category of the existence of a brain tumor. Similarly, 2D Grad-CAM was used in [ 29 ] to evaluate the performance of three DL models in brain tumor classification.…”
Section: Related Work
confidence: 99%
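The citing works above all describe the same core computation: Grad-CAM produces a heatmap showing which regions of the input MRI slice drove the classifier's decision. As a minimal sketch of that computation (not the cited papers' actual implementations), Grad-CAM global-average-pools the gradients of the target class score with respect to a convolutional layer's activation maps to obtain per-channel weights, then takes a ReLU of the weighted sum of those maps. The function name and NumPy-only formulation below are illustrative assumptions; in practice the activations and gradients come from a deep-learning framework's backward pass.

```python
import numpy as np

def grad_cam(activations: np.ndarray, gradients: np.ndarray) -> np.ndarray:
    """Sketch of the Grad-CAM heatmap computation.

    activations: conv-layer feature maps, shape (channels, H, W).
    gradients:   d(class score)/d(activations), same shape.
    Returns an (H, W) heatmap normalised to [0, 1].
    """
    # Per-channel weights: global-average-pool the gradients.
    weights = gradients.mean(axis=(1, 2))                     # (channels,)
    # Weighted sum of activation maps, then ReLU to keep only
    # regions that positively influence the target class.
    cam = np.maximum((weights[:, None, None] * activations).sum(axis=0), 0.0)
    # Normalise so the map can be overlaid on the MRI slice.
    if cam.max() > 0:
        cam = cam / cam.max()
    return cam
```

The resulting low-resolution map is typically upsampled to the input size and overlaid on the slice, which is how the cited studies visually checked whether the model attended to the tumour region.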