2020
DOI: 10.1016/j.sciaf.2020.e00456
Development of an ensemble approach to chronic kidney disease diagnosis

Cited by 33 publications (18 citation statements)
References 4 publications
“…In our work, the Relief-F feature selection method achieved the best performance on both the testing and cross-validation results using the DT and GBT classifiers, compared with other existing works [23, 24, 26, 27, 30]. Our work also differs from other existing works [22, 25] in that it reports results for both the training set and the testing set, and it achieved the best performance.…”
Section: Experiments and Results
confidence: 85%
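The Relief-F method referenced in this excerpt can be illustrated with a minimal sketch of the basic binary-class Relief weight update (Relief-F generalizes this to multiple classes and k neighbours). The toy data, distance function, and parameter choices below are illustrative assumptions, not the cited paper's implementation.

```python
import random

def relief_weights(X, y, n_iterations=100, seed=0):
    """Basic binary-class Relief: reward features that separate classes
    (distance to nearest miss) and penalize features that vary within a
    class (distance to nearest hit)."""
    rng = random.Random(seed)
    n_features = len(X[0])
    weights = [0.0] * n_features

    def dist(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))

    for _ in range(n_iterations):
        i = rng.randrange(len(X))
        xi, yi = X[i], y[i]
        # nearest same-class neighbour (hit) and other-class neighbour (miss)
        hit = min((X[j] for j in range(len(X)) if j != i and y[j] == yi),
                  key=lambda z: dist(xi, z))
        miss = min((X[j] for j in range(len(X)) if y[j] != yi),
                   key=lambda z: dist(xi, z))
        for f in range(n_features):
            weights[f] += abs(xi[f] - miss[f]) - abs(xi[f] - hit[f])
    return weights

# Toy data: feature 0 separates the classes, feature 1 is noise.
X = [[0.0, 0.5], [0.1, 0.9], [1.0, 0.4], [0.9, 0.8]]
y = [0, 0, 1, 1]
w = relief_weights(X, y)
print(w[0] > w[1])  # the discriminative feature scores higher
```

Features are then ranked by weight and the top-scoring subset is passed to the downstream classifier.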
“…The proposed CFS with AdaBoost achieved the best performance, at 98.1% accuracy. In [25], the authors used two ensemble techniques, Bagging and Random Subspace, with three base learners (KNN, NB, and DT) to predict CKD. Random Subspace achieved better performance than Bagging with the KNN classifier.…”
Section: Related Work
confidence: 99%
“…The results obtained established the accuracies of the decision tree and ensemble classifiers as 95% and 98.8% respectively. This is an improvement on [9], which reported accuracies of 89.2% and 95% for the decision tree and ensemble classifiers respectively. The result obtained in this research work is also better than that of [4], where the decision tree and ensemble classifier accuracies were 92% and 94.25% respectively.…”
Section: Results
confidence: 94%
“…A pair of ensemble models, the Bagging and Random Subspace methods, applied to three base learners (k-Nearest Neighbours, Naïve Bayes, and Decision Tree), was presented in [24] to improve classifier outcomes. The presented model involves data preprocessing to handle missing values and data scaling to normalize the range of the independent variables.…”
Section: Related Work
confidence: 99%
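The Random Subspace scheme described in these excerpts can be sketched compactly: each base learner sees only a random subset of the features, and the ensemble prediction is a majority vote. The 1-NN base learner and toy data below are illustrative assumptions, not the cited paper's configuration.

```python
import random
from collections import Counter

def nn_predict(train_X, train_y, x, feats):
    """1-nearest-neighbour prediction restricted to the given features."""
    def d(a):
        return sum((a[f] - x[f]) ** 2 for f in feats)
    j = min(range(len(train_X)), key=lambda i: d(train_X[i]))
    return train_y[j]

def random_subspace_predict(train_X, train_y, x, n_learners=9,
                            subspace_size=2, seed=0):
    """Majority vote over base learners, each using a random feature subset."""
    rng = random.Random(seed)
    n_features = len(train_X[0])
    votes = []
    for _ in range(n_learners):
        feats = rng.sample(range(n_features), subspace_size)
        votes.append(nn_predict(train_X, train_y, x, feats))
    return Counter(votes).most_common(1)[0][0]

# Toy data: the first two features carry the class signal.
X = [[0, 0, 5, 1], [0, 1, 2, 9], [1, 0, 7, 3],
     [5, 5, 1, 8], [5, 6, 6, 2], [6, 5, 3, 7]]
y = [0, 0, 0, 1, 1, 1]
print(random_subspace_predict(X, y, [0, 0, 0, 0]))
print(random_subspace_predict(X, y, [5, 5, 0, 0]))
```

Bagging differs only in what is resampled: it trains each base learner on a bootstrap sample of the training instances rather than a subset of the features.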
“…The FS issue for CKD is treated as a combinatorial optimization problem. The number of viable feature combinations from a feature set of 24 features is 2^24 − 1, which means it is not feasible to evaluate every possible feature combination.…”
Section: Optimal Feature Subset Selection Problem
confidence: 99%
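The count quoted above is simply the number of non-empty subsets of a 24-feature set; a one-line check (illustrative, not taken from the cited paper):

```python
n_features = 24
# Each feature is either in or out of a subset; exclude the empty subset.
n_subsets = 2 ** n_features - 1
print(n_subsets)  # 16777215
```

Exhaustively scoring roughly 16.8 million subsets is impractical, which is why heuristic or metaheuristic search is used for feature selection instead.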