2022
DOI: 10.1016/j.jksuci.2020.12.012

Evolutionary computing for clinical dataset classification using a novel feature selection algorithm

Cited by 12 publications (8 citation statements: 0 supporting, 8 mentioning, 0 contrasting)
References 17 publications
“…Kiziloz et al (2018) considered feature selection as a multiobjective optimization problem and proposed novel variants of multiobjective Teaching‐Learning‐Based Optimization for the feature selection task. Sheth et al (2020) proposed a multiobjective Jaya Optimization Algorithm for obtaining the optimal subset of features. The proposed model was applied to many clinical datasets.…”
Section: Related Work
confidence: 99%
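
As a rough illustration of the idea in that statement, the sketch below applies a Jaya-style, parameter-free update to a continuous encoding of candidate feature subsets and thresholds it into a binary mask. The array layout, the 0.5 threshold, and the single-population framing are assumptions for illustration; they do not reproduce the multiobjective algorithm of Sheth et al. (2020).

```python
import numpy as np

def jaya_step(positions, best, worst, rng):
    """One Jaya-style move: every candidate drifts toward the current best
    solution and away from the current worst (a parameter-free update).
    positions has shape (pop_size, n_features) with values in [0, 1]."""
    r1 = rng.random(positions.shape)
    r2 = rng.random(positions.shape)
    moved = positions + r1 * (best - np.abs(positions)) - r2 * (worst - np.abs(positions))
    return np.clip(moved, 0.0, 1.0)

def to_feature_mask(positions, threshold=0.5):
    """Binarize continuous positions in [0, 1] into a 0/1 feature-selection mask."""
    return positions > threshold
```

The mask returned by to_feature_mask selects the column subset that a wrapper would then feed to the classifier being optimized.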
“…Single objective wrapper techniques generally serve the purpose by restricting the length of the feature set, or by enhancing the classification efficiency, or by aggregating these targets [41]. For a better understanding of the single-objective evolutionary methods for solving FS task the interested reader can refer to DA [21], SSA [26], [42], HS [43], TLBO [28], grasshopper optimization [44], Jaya algorithm [22], [45], HHO [18], atom search [46], SMO [24], SHO [25], CS [47], ALO [48], ABC [30], FOA [49], FPA [50], and WOA [51], [52] etc.…”
Section: Related Work
confidence: 99%
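
A common way such single-objective wrappers aggregate the two targets named in that statement (classification efficiency and feature-set length) is a weighted sum. The sketch below assumes scikit-learn cross-validation and an illustrative trade-off weight alpha; it is not the exact objective used in any of the cited works.

```python
from sklearn.model_selection import cross_val_score

def aggregated_fitness(mask, X, y, clf, alpha=0.9):
    """Weighted-sum wrapper objective: alpha * cross-validated error plus
    (1 - alpha) * relative subset size; lower values are better."""
    if not mask.any():                        # an empty subset gets the worst score
        return 1.0
    error = 1.0 - cross_val_score(clf, X[:, mask], y, cv=5).mean()
    return alpha * error + (1.0 - alpha) * mask.sum() / mask.size
```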
“…Figure 2 shows the possible numbers of selected features that have around half the number of available features with greater chance being selected than others. See algorithm 3 (Lines 15–22).…”
Section: B. New Initialization Mechanism
confidence: 99%
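
One plausible reading of that initialization is that each feature enters a candidate mask independently with probability 0.5, so subset sizes follow a Binomial(n, 0.5) distribution and sizes near half the available features are the most likely. The sketch below encodes that assumption; it is not a transcription of the cited algorithm's lines 15-22.

```python
import numpy as np

def init_population(pop_size, n_features, seed=None):
    """Biased binary initialization: each feature is included with probability
    0.5, so the number of selected features per candidate concentrates around
    n_features / 2 (a Binomial(n_features, 0.5) distribution)."""
    rng = np.random.default_rng(seed)
    return rng.random((pop_size, n_features)) < 0.5
```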
“…FS is an optimization problem [16]. One of the most efficient popular methods used to generate candidate solutions for optimization problems is Nature Inspired Algorithms (NIAs) algorithms [17]- [19]. NIAs, developed using characteristics of biological systems, are a population meta-heuristic class of algorithms that improves concurrency of computing multiple candidate solutions [20]- [22].…”
Section: Introduction
confidence: 99%
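
Tying these statements together, a generic population-based NIA wrapper scores many candidate masks per iteration, which is where the concurrency mentioned above comes from. The loop below is a schematic sketch that reuses the hypothetical jaya_step and aggregated_fitness helpers from the earlier sketches; it is not the algorithm of the indexed paper or of any cited reference.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def nia_feature_selection(X, y, iters=50, pop_size=20, seed=0):
    """Schematic population loop: initialize candidate positions, score each
    thresholded mask with a wrapper fitness, track the best mask seen, and
    move the population with a Jaya-style update (helpers sketched above)."""
    rng = np.random.default_rng(seed)
    clf = KNeighborsClassifier()
    positions = rng.random((pop_size, X.shape[1]))   # continuous encoding in [0, 1]
    best_mask, best_score = None, np.inf
    for _ in range(iters):
        masks = positions > 0.5
        scores = np.array([aggregated_fitness(m, X, y, clf) for m in masks])
        if scores.min() < best_score:                # remember the best mask so far
            best_score, best_mask = scores.min(), masks[scores.argmin()]
        best, worst = positions[scores.argmin()], positions[scores.argmax()]
        positions = jaya_step(positions, best, worst, rng)
    return best_mask
```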