2022
DOI: 10.1093/ehjdh/ztac038
Improving explainability of deep neural network-based electrocardiogram interpretation using variational auto-encoders

Abstract: Aims Deep neural networks (DNNs) perform excellently in interpreting electrocardiograms (ECGs), both for conventional ECG interpretation and for novel applications such as detection of reduced ejection fraction (EF). Despite these promising developments, implementation is hampered by the lack of trustworthy techniques to explain the algorithms to clinicians. In particular, currently employed heatmap-based methods have been shown to be inaccurate. Meth…

Cited by 25 publications (20 citation statements)
References 30 publications (51 reference statements)
“…FactorECG improves upon heatmap-based attempts to make deep learning explainable, as such approaches merely highlight ‘where’ on the ECG significant features are detected but provide no information on which morphological change explains the prediction. 16 Rather, FactorECG allows for ‘quantifiable’ identification of specific ECG features, rendering physicians able to evaluate and confirm the clinical rationale of said features. This is reflected by our results that confirm the known importance of LBBB morphology and QRS AREA for the prediction of echocardiographic response.…”
Section: Discussion
confidence: 99%
“…Many clinicians regard deep learning as a ‘black box’, which limits trust in such algorithms. 16 However, our approach to make the model inherently explainable may abate this concern and increase willingness to facilitate clinical adoption of the FactorECG. Although an overall c-statistic of 0.69 leaves room for improvement, our approach is unique in its clinical practicality, with better risk stratification than QRS AREA .…”
Section: Discussion
confidence: 99%