2021
DOI: 10.3389/fphys.2021.657304
An Interpretable Hand-Crafted Feature-Based Model for Atrial Fibrillation Detection

Abstract: Atrial Fibrillation (AF) is the most common type of cardiac arrhythmia. Early diagnosis of AF helps to improve therapy and prognosis. Machine Learning (ML) has been successfully applied to improve the effectiveness of Computer-Aided Diagnosis (CADx) systems for AF detection. From the cardiologists' point of view, presenting an explanation for the decision made by an ML model is valuable: it reduces the perceived complexity of the model and can provide tangible information for their diagnosis. In this paper, a …
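The abstract above describes an interpretable, hand-crafted feature-based classifier for AF detection. As an illustration only (the paper's actual feature set and classifier configuration are not reproduced here), the following sketch computes a few common RR-interval statistics and feeds them to a random forest whose feature importances can be inspected:

```python
# Hypothetical sketch (not the paper's actual pipeline): hand-crafted
# RR-interval features fed to an interpretable tree-based classifier.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def rr_features(rr_intervals: np.ndarray) -> np.ndarray:
    """Simple hand-crafted statistics of an RR-interval series (in seconds)."""
    diffs = np.diff(rr_intervals)
    return np.array([
        rr_intervals.mean(),            # mean RR interval
        rr_intervals.std(),             # SDNN
        np.sqrt(np.mean(diffs ** 2)),   # RMSSD
        np.mean(np.abs(diffs) > 0.05),  # pNN50-style irregularity ratio
    ])

# Placeholder data: one feature vector per ECG segment; y = 1 for AF-like rhythm.
rng = np.random.default_rng(0)
X = np.vstack([rr_features(rng.normal(0.8, s, 60)) for s in rng.uniform(0.02, 0.2, 200)])
y = (X[:, 1] > 0.1).astype(int)  # toy label: high RR variability stands in for AF

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
print(dict(zip(["meanRR", "SDNN", "RMSSD", "pNN50"], clf.feature_importances_.round(3))))
```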

Cited by 19 publications (11 citation statements)
References 54 publications
“…Many studies involving machine learning algorithms to identify atrial fibrillation from single-lead or ambulatory ECGs have been performed. 58–76 As an extensive database of single-lead ECGs is easily accessible, many machine learning models have been created for them, compared with 12-lead ECGs.…”
Section: Artificial Intelligence in Cardiovascular Prevention (mentioning)
confidence: 99%
“…Many studies involving machine learning algorithms to identify atrial fibrillation from single-lead or ambulatory ECGs have been performed. 58–76 As an extensive database of single-lead ECGs is easily accessible, many machine learning models have been created for them, compared with 12-lead ECGs. Overall, accuracy, sensitivity and specificity tended to be above 0.85.…”
Section: Atrial Fibrillation (mentioning)
confidence: 99%
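The figures reported in that statement (accuracy, sensitivity and specificity above 0.85) are the standard binary-classification metrics. A minimal sketch of how they are computed from a confusion matrix, with illustrative labels only, is:

```python
# Minimal sketch of the reported metrics computed from a binary confusion
# matrix; the labels below are illustrative, not from any cited study.
from sklearn.metrics import confusion_matrix

y_true = [1, 1, 1, 0, 0, 0, 1, 0, 1, 0]   # 1 = AF, 0 = non-AF
y_pred = [1, 1, 0, 0, 0, 1, 1, 0, 1, 0]

tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
accuracy    = (tp + tn) / (tp + tn + fp + fn)
sensitivity = tp / (tp + fn)               # recall on the AF class
specificity = tn / (tn + fp)               # recall on the non-AF class
print(f"acc={accuracy:.2f} sens={sensitivity:.2f} spec={specificity:.2f}")
```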
“…Random forest algorithms are a popular approach in cardiac EP and have been used in studies ranging from predicting left atrial appendage flow velocity to predicting 30-day mortality post ST-elevation myocardial infarction. 47–55 Random forest models were useful in these studies because a decision path to the correct prediction could be found in the majority of distinct trees. Random forest models have been shown to be particularly useful in providing interpretability due to each split being a defined threshold that offers insight.…”
Section: Random Forest (mentioning)
confidence: 99%
“…55 Rouhi et al. showed that, by extracting custom features from ECGs, a random forest model was able to detect AF. 48 Pičulin et al. were able to predict the progression of hypertrophic cardiomyopathy 10 years out using a random forest model. 52 51 Random forest was able to predict well on this data set because it can split the multimodal data in many ways to inform its decision and incorporate the correlation between the different data types.…”
Section: Random Forest (mentioning)
confidence: 99%
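The interpretability argument in the statements above rests on each split being an explicit threshold on a named feature. A minimal sketch of reading such rules off one tree of a fitted scikit-learn random forest follows; the feature names are hypothetical, not those of the cited studies:

```python
# Illustrative sketch of why tree ensembles are considered interpretable here:
# each split is an explicit threshold on a named feature that can be read off.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.tree import export_text

X, y = make_classification(n_samples=300, n_features=4, n_informative=3,
                           n_redundant=1, random_state=0)
feature_names = ["meanRR", "SDNN", "RMSSD", "pNN50"]  # hypothetical names

forest = RandomForestClassifier(n_estimators=10, max_depth=3,
                                random_state=0).fit(X, y)

# Print the human-readable split rules of one tree in the ensemble.
print(export_text(forest.estimators_[0], feature_names=feature_names))
```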
“…Furthermore, attempts to reveal the importance of the input feature map have been presented with an explanatory focus on the decision mechanisms in different machine learning classifiers. For example, the feature importance is estimated by the Gini index in a random forest classifier [31]; the ‘weight’ method for the Extreme Gradient Boosting classifier [31]; the bagged decision tree ensemble [29]; the Gain index for decision trees [37]; minimum redundancy, maximum relevance, and minimum redundancy maximum relevance justified with a decision tree ensemble [32]; minimum redundancy maximum relevance, decision tree and SVM ranking methods [33]; and logistic regression, permutation testing, random forest and SHapley Additive exPlanations (SHAP) [50]. The latter has shown that each feature importance technique results in different feature rankings, depending on their characteristics and assumptions.…”
Section: Introduction (mentioning)
confidence: 99%
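As a concrete illustration of two of the importance techniques listed in that statement, the sketch below fits a random forest on synthetic data and compares impurity-based (Gini) importance with permutation importance; SHAP values (via the shap package) would be a third option. The data and model settings are placeholders, not those of the cited works.

```python
# Sketch comparing two feature-importance techniques on the same random
# forest: impurity-based (Gini) importance vs. permutation importance.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance

X, y = make_classification(n_samples=500, n_features=6, n_informative=3,
                           random_state=0)
forest = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

gini = forest.feature_importances_
perm = permutation_importance(forest, X, y, n_repeats=10,
                              random_state=0).importances_mean

for i, (g, p) in enumerate(zip(gini, perm)):
    print(f"feature {i}: gini={g:.3f}  permutation={p:.3f}")
# The two rankings generally agree on strong features but can differ for
# correlated or weak ones, which is the point made by the cited comparison.
```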