2023 International Conference on Intelligent Data Communication Technologies and Internet of Things (IDCIoT)
DOI: 10.1109/idciot56793.2023.10053458
Predicting Heart Failure using SMOTE-ENN-XGBoost

Cited by 5 publications (2 citation statements)
References 25 publications
“…It is an ensemble learning technique that builds a powerful predictive model by combining several weak predictive models, often decision trees, commonly used to predict heart diseases [40, 41]. The XGBoost approach is a development of gradient boosting, adding decision trees to the model iteratively, where each tree tries to correct the errors of the preceding one.…”
Section: Methods
Confidence: 99%
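To make the boosting mechanism described in this citation statement concrete, here is a minimal sketch using the xgboost Python package. The synthetic dataset, feature count, and hyperparameters are illustrative assumptions, not taken from the cited paper.

```python
# Minimal sketch: XGBoost adds decision trees iteratively, each new tree
# fitting the residual errors of the ensemble built so far.
# Dataset and hyperparameters are illustrative, not from the paper.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score
from xgboost import XGBClassifier

# Synthetic stand-in for a heart-failure dataset
X, y = make_classification(n_samples=1000, n_features=12, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

model = XGBClassifier(
    n_estimators=100,   # number of boosting rounds (trees added one by one)
    max_depth=3,        # depth of each weak learner
    learning_rate=0.1,  # shrinks each tree's contribution to the ensemble
    eval_metric="logloss",
)
model.fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, model.predict(X_test)))
```

Shallow trees (low max_depth) keep each learner weak, so the model improves gradually across boosting rounds rather than overfitting in a single step.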
“…However, similar to SMOTE, adaptive synthetic sampling solutions cannot efficiently target the within-class imbalance issue. Data cleaning solutions such as SMOTE-Tomek link (STL, Swana et al. (2022)) and SMOTE with Wilson's edited nearest neighbor rule (SENN, Parthasarathy et al. (2023)) improve the quality of augmented data with a post-processing mechanism that removes noisy, ambiguous, or wrongly located samples. It is important to note that STL and SENN are extensions of the Tomek link (TL, Tomek (1976)) and Wilson's edited nearest neighbor rule (ENN, Wilson (1972)) undersampling approaches.…”
Section: Synthetic Sampling
Confidence: 99%
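A minimal sketch of the SMOTE-plus-ENN post-processing combination described in this statement, using imbalanced-learn's SMOTEENN. The imbalance ratio and synthetic data are illustrative assumptions; the cited papers' exact settings are not reproduced here.

```python
# Minimal sketch: SMOTE oversamples the minority class, then Wilson's
# edited nearest neighbor rule (ENN) removes noisy or ambiguous samples.
# The imbalance ratio and data are illustrative, not from the cited papers.
from collections import Counter
from sklearn.datasets import make_classification
from imblearn.combine import SMOTEENN

# Imbalanced synthetic data: roughly 90% majority, 10% minority
X, y = make_classification(
    n_samples=1000, n_features=12, weights=[0.9, 0.1], random_state=42
)
print("before:", Counter(y))

# SMOTE generates synthetic minority samples; ENN then edits the result,
# deleting points whose nearest-neighbor majority disagrees with their label.
resampler = SMOTEENN(random_state=42)
X_res, y_res = resampler.fit_resample(X, y)
print("after: ", Counter(y_res))
```

Because ENN can remove samples from both classes, the cleaned set is typically smaller than the raw SMOTE output, trading sample count for less label noise near the class boundary.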