2020 International Conference on Computer Communication and Informatics (ICCCI)
DOI: 10.1109/iccci48352.2020.9104137
An Ensemble Feature Selection Method for Prediction of CKD

Cited by 14 publications (9 citation statements)
References 13 publications
“…Figure 4 illustrates the feature ranking algorithm of the RFFS methodology based on feature importance and feature value. Table 2 summarizes the ranking of the features using the proposed RFFS algorithm, density‐based feature selection (DFS) method (Manonmani & Balakrishnan, n.d.), and mixed data feature selection (MDFS) model.…”
Section: Proposed RFFS Methodology (mentioning)
confidence: 99%
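The excerpt above does not spell out the RFFS ranking rule itself. Purely as an illustrative sketch of importance-based feature ranking on a CKD-style table, assuming a random-forest importance score and placeholder file path and label column (none of which come from the cited work), one could write:

```python
# Illustrative sketch only: ranks features by impurity-based random-forest
# importance. The file path and the "class" label column are placeholder
# assumptions, and the table is assumed to be numeric and already imputed.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier

df = pd.read_csv("ckd.csv")                      # placeholder path
X, y = df.drop(columns=["class"]), df["class"]   # assumed label column

forest = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

# Sort features from most to least important.
ranking = sorted(zip(X.columns, forest.feature_importances_),
                 key=lambda kv: kv[1], reverse=True)
for name, score in ranking:
    print(f"{name}: {score:.4f}")
```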
“…The LR classification strategy was shown to be the most precise in this function, achieving a precision of roughly 97% in this analysis. Manonmani and Balakrishnan (2020) presented an ensemble feature selection method for diagnosing chronic illnesses that integrates the output of filter and wrapper techniques to provide the most discriminating characteristics. The CKD dataset is subjected to this ensemble feature selection technique in that article.…”
Section: Related Work (mentioning)
confidence: 99%
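The excerpt describes the cited method only as integrating filter and wrapper outputs; the exact combination rule is not given there. A minimal, hypothetical sketch that keeps the features chosen by both a chi-square filter and an RFE wrapper (the choice of chi2, logistic regression, and k = 8 are assumptions made here for illustration, not the authors' stated design) could look like:

```python
# Hypothetical ensemble of one filter and one wrapper selector: a feature is
# kept only if both methods select it. The combination rule, the scorers, and
# k=8 are illustrative assumptions, not the cited paper's exact procedure.
import numpy as np
from sklearn.feature_selection import RFE, SelectKBest, chi2
from sklearn.linear_model import LogisticRegression

def ensemble_select(X, y, k=8):
    """Return indices of features chosen by BOTH the filter and the wrapper."""
    # Filter step: chi-square ranking (requires non-negative feature values).
    filter_mask = SelectKBest(chi2, k=k).fit(X, y).get_support()
    # Wrapper step: recursive feature elimination around logistic regression.
    wrapper_mask = RFE(LogisticRegression(max_iter=1000),
                       n_features_to_select=k).fit(X, y).get_support()
    return np.where(filter_mask & wrapper_mask)[0]
```

Taking the intersection is the stricter choice; a union or rank-averaging rule would retain more features.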
“…SVM achieved a high classification accuracy of 93%, GB achieved a high classification accuracy of 97%, and CNN achieved a high classification accuracy of 97.75% for the derived optimal feature set. The proposed work achieved a feature reduction of 62.5% for the eight features selected using the SVM and CNN classification algorithms and 66.6% for the nine features selected using the GB classification algorithm [5].…”
Section: Related Work (mentioning)
confidence: 99%
“…Common wrapper methods include forward selection, backward elimination, and recursive feature elimination (RFE).…”
Section: Introduction (mentioning)
confidence: 99%
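For reference, the three wrapper methods named in this excerpt are available in scikit-learn. The sketch below is a generic illustration on a stand-in dataset; the breast-cancer data, the standard scaling, and n_features_to_select = 10 are assumptions made here and are not part of the cited works:

```python
# Generic illustration of the three wrapper methods named above.
from sklearn.datasets import load_breast_cancer          # stand-in dataset, not CKD
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.feature_selection import RFE, SequentialFeatureSelector

X, y = load_breast_cancer(return_X_y=True)
X = StandardScaler().fit_transform(X)                    # scale so the solver converges
est = LogisticRegression(max_iter=1000)

rfe = RFE(est, n_features_to_select=10).fit(X, y)        # recursive feature elimination
fwd = SequentialFeatureSelector(est, n_features_to_select=10,
                                direction="forward").fit(X, y)   # forward selection
bwd = SequentialFeatureSelector(est, n_features_to_select=10,
                                direction="backward").fit(X, y)  # backward elimination

print("RFE kept:     ", rfe.get_support().nonzero()[0])
print("Forward kept: ", fwd.get_support().nonzero()[0])
print("Backward kept:", bwd.get_support().nonzero()[0])
```

Because each candidate subset refits the underlying estimator, wrapper methods are typically slower but more model-aware than filter scores such as chi-square.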