2020
DOI: 10.1038/s41598-020-62724-2
Resolving challenges in deep learning-based analyses of histopathological images using explanation methods

Abstract: Deep learning has recently gained popularity in digital pathology due to its high prediction quality. However, the medical domain requires explanation and insight for a better understanding beyond standard quantitative performance evaluation. Recently, explanation methods have emerged, which are so far still rarely used in medicine. This work shows their application to generate heatmaps that help resolve common challenges encountered in deep learning-based digital histopathology analyses. These challenges …


Cited by 135 publications (134 citation statements)
References 44 publications (76 reference statements)
“…In fact, it can be applied to improve the understanding of any machine learning algorithm that learns from ambiguous ground truth data. For example, T-REX could be used in the application of uncovering biases of ML prediction models in digital histopathology not only with respect to data set biases but also with respect to varying opinions of experts labeling the histopathology images 80 . In applications, where supervised ML decision models are trained to detect diseases such as Covid-19 (ref.…”
Section: Discussion
confidence: 99%
“…This, in turn, could affect the demand: if sufficient trust is not provided, customers may opt not to use these models. In general, this attitude of the demand side of the market incentivizes organizations to optimally fund research to develop interpretation techniques with the aim of identifying and removing biases and generating trust [62]. Consequently, the incentives of all the involved stakeholders are expected to be aligned and lead to the maximization of aggregate social welfare.…”
Section: Discussion and Proposed Solutions
confidence: 99%
“…In addition to purely focusing on prediction accuracy, the authors applied LRP in order to analyze the nonlinear properties of the learning machine by mapping the results of a prediction onto a heatmap that reveals the morphological particularities of the studied pathological properties. Hägele et al (2019) analyzed histopathological images and applied LRP for visual and quantitative verification of features used for prediction as well as for detection of various latent but crucial biases using heatmapping. Out of such explanations and visualizations, experts might get valuable interpretations, but to even improve interpretability especially for lay humans it could be helpful to include other explanation modalities.…”
Section: Explanation Generation and Visual Analytics
confidence: 99%
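The last citation statement describes applying LRP (layer-wise relevance propagation) to map a model's prediction back onto the input as a heatmap. As a minimal sketch of the underlying idea, the epsilon-rule for a single dense layer redistributes output relevance to inputs in proportion to their contributions; the function name and toy values below are illustrative, not from the cited works.

```python
import numpy as np

def lrp_epsilon_dense(a, W, b, R_out, eps=1e-6):
    """Hypothetical sketch: propagate relevance R_out back through one
    dense layer z = a @ W + b using the LRP epsilon-rule.

    Each input j receives relevance proportional to its contribution
    z_jk = a_j * W_jk to every output k; eps stabilizes small z."""
    z = a @ W + b                        # forward pre-activations, shape (k,)
    s = R_out / (z + eps * np.sign(z))   # stabilized relevance-to-activation ratio
    return a * (W @ s)                   # input relevance, shape (j,)

# Toy example: 3 input features, 2 output classes, all relevance on class 0.
a = np.array([1.0, 2.0, 0.5])
W = np.array([[0.5, -0.2],
              [0.1,  0.3],
              [-0.4, 0.8]])
b = np.zeros(2)
R_out = np.array([1.0, 0.0])
R_in = lrp_epsilon_dense(a, W, b, R_out)
# Epsilon-LRP approximately conserves relevance: sum(R_in) ~ sum(R_out).
print(R_in, R_in.sum())
```

In a histopathology setting, applying such a rule layer by layer down to the pixels yields the per-pixel relevance heatmaps the quoted passages use to verify features and detect latent biases.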