2022
DOI: 10.1016/j.compbiomed.2021.105137
Evolving kernel extreme learning machine for medical diagnosis via a disperse foraging sine cosine algorithm

Cited by 71 publications (29 citation statements)
References 120 publications
“…Table 3 records the combined results of IHGS at 10D, 30D, 50D, and 100D based on the Friedman test and the Wilcoxon signed-rank test with a significance level of 0.05. In terms of the "+/−/=" symbols, among the 30 examples in the lower dimension (10D), the IHGS algorithm has 20, 8, 17, 20, 7, 29, 22, 1, 21, 9, 18, 4, 17, 29 examples outperforming OBSCA, EB_LSHADE, SCADE, CBA, EAGDE, CESCA, AMFOA, EBOW, RCBA, JSO, HGWO, LSHADESPACMA, BMWOA, and MSFOA, respectively; there are 7, 21, 7, 5, 21, 0, 8, 28, 5, 18, 10, 19, 9, 1 examples weaker than them, while there are 3, 1, 6, 5, 2, 1, 0, 1, 4, 3, 2, 7, 4 examples with the same performance as them. As the problem's dimensionality increases, IHGS becomes more advantageous in handling high-dimensional problems.…”
Section: Detailed Results for the IEEE CEC 2017 Function Set (mentioning)
confidence: 99%
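The "+/−/=" tallies quoted above come from pairwise Wilcoxon signed-rank tests at a 0.05 significance level. A minimal sketch of how one such pairwise verdict could be computed (the `compare` helper, the benchmark data, and the "lower error is better" convention are all assumptions for illustration, not the paper's actual code):

```python
import numpy as np
from scipy.stats import wilcoxon

def compare(results_a, results_b, alpha=0.05):
    """Return '+', '-' or '=' for algorithm A versus B over paired
    per-function results, assuming lower error is better."""
    a = np.asarray(results_a, dtype=float)
    b = np.asarray(results_b, dtype=float)
    if np.allclose(a, b):
        return "="  # identical on every function
    _, p = wilcoxon(a, b)
    if p >= alpha:
        return "="  # no statistically significant difference
    return "+" if a.mean() < b.mean() else "-"

# Hypothetical errors of two algorithms on 10 benchmark functions
rng = np.random.default_rng(0)
rival = rng.uniform(1.0, 2.0, size=10)
ihgs = rival * 0.5  # clearly better on every function
verdict = compare(ihgs, rival)  # '+' for this constructed case
```

Running such a comparison against each rival on each benchmark function, and counting the '+', '−', and '=' outcomes, yields tallies of the kind reported in Table 3.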
“…According to the study, the performance of the classifier is greatly affected by its inner parameters and by the features in the data. Metaheuristics are highly effective on this type of problem, as shown in many works [28][29][30][31][32], such as object tracking [33,34], the traveling salesman problem [35], gate resource allocation [36,37], multi-attribute decision-making [38,39], power electronic circuit design [40,41], fractional-order controllers [42], medical diagnosis [43,44], big data optimization problems [45], green supplier selection [46], economic emission dispatch problems [47], scheduling problems [48,49], and combinatorial optimization problems [50]. This study proposes an enhanced crow search algorithm (CSA) [51] to simultaneously optimize the hyperparameters of the kernel extreme learning machine (KELM) and the feature space for predicting the entrepreneurial intention of college students.…”
Section: Literature Review (mentioning)
confidence: 99%
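Jointly optimizing KELM hyperparameters and the feature space, as the cited study describes, is typically done by concatenating both into one continuous solution vector that the metaheuristic searches. A sketch of one plausible encoding (the `decode` helper, the log-scale parameter ranges, and the 0.5 mask threshold are assumptions, not the cited study's actual scheme):

```python
import numpy as np

def decode(x, n_features, c_range=(2**-5, 2**5), g_range=(2**-5, 2**5)):
    """Decode a solution vector in [0, 1]^d into KELM settings:
    the first two genes map (log-scale) to the penalty C and the
    kernel width gamma; the remaining genes threshold at 0.5 into
    a binary feature-selection mask."""
    lo_c, hi_c = np.log2(c_range[0]), np.log2(c_range[1])
    lo_g, hi_g = np.log2(g_range[0]), np.log2(g_range[1])
    C = 2.0 ** (lo_c + x[0] * (hi_c - lo_c))
    gamma = 2.0 ** (lo_g + x[1] * (hi_g - lo_g))
    genes = x[2:2 + n_features]
    mask = genes > 0.5
    if not mask.any():
        mask[np.argmax(genes)] = True  # keep at least one feature
    return C, gamma, mask

# Hypothetical candidate: mid-range C, maximal gamma, 2 of 3 features
x = np.array([0.5, 1.0, 0.9, 0.1, 0.7])
C, gamma, mask = decode(x, n_features=3)
```

The optimizer (here an enhanced CSA) then evaluates each candidate by training KELM with the decoded `C`, `gamma`, and selected features, using the classification quality as fitness.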
“…Specifically, the average result (Avg) and standard deviation (Std) denote the mean prediction result and its standard deviation for each model over ten experiments. To find the best model, we evaluated each model using 10-fold cross-validation, as adopted in many studies [43,44,113]. Moreover, the prediction results were analyzed using common metrics [114][115][116]: accuracy (ACC), sensitivity (Sens), specificity (Spec), and the Matthews correlation coefficient (MCC).…”
Section: Condition Configuration (mentioning)
confidence: 99%
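The four metrics named in this excerpt are all standard functions of the binary confusion matrix. A minimal sketch of their definitions (the example counts are invented for illustration):

```python
import math

def metrics(tp, tn, fp, fn):
    """Accuracy, sensitivity, specificity, and Matthews correlation
    coefficient from binary confusion-matrix counts."""
    acc = (tp + tn) / (tp + tn + fp + fn)
    sens = tp / (tp + fn)   # true-positive rate
    spec = tn / (tn + fp)   # true-negative rate
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    mcc = (tp * tn - fp * fn) / denom if denom else 0.0
    return acc, sens, spec, mcc

# Hypothetical confusion matrix from one cross-validation fold
acc, sens, spec, mcc = metrics(tp=40, tn=45, fp=5, fn=10)
```

Under 10-fold cross-validation, these are computed per fold and then averaged, which is where the reported Avg and Std values come from.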
“…As a result, more and more researchers have introduced swarm intelligence optimization algorithms (SIOAs) into traditional multilevel threshold image segmentation (MTIS) to improve segmentation efficiency, replacing the traditional exhaustive method. These SIOAs have offered greater efficiency in optimization tasks such as expensive optimization problems [32,33], medical diagnosis [34-37], PID optimization control [38-40], plant disease recognition [41], feature selection [42-45], object tracking [46,47], the economic emission dispatch problem [48], engineering design [49-51], parameter tuning for machine learning models [52-54], constrained optimization problems [55,56], combinatorial optimization problems [57], the traveling salesman problem [58], multi-objective or many-objective optimization problems [59-61], and scheduling problems [62-64].…”
Section: Introduction (mentioning)
confidence: 99%
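In MTIS, the quantity a swarm optimizer searches over is typically an Otsu-style objective: candidate thresholds partition the grey-level histogram into classes, and the fitness to maximize is the between-class variance. A sketch of that objective under those assumptions (the flat test histogram is invented for illustration):

```python
import numpy as np

def between_class_variance(hist, thresholds):
    """Otsu-style fitness for multilevel thresholding: the given
    thresholds split the normalized histogram into classes, and
    the returned between-class variance is what a swarm optimizer
    would maximize instead of exhaustively enumerating thresholds."""
    p = hist / hist.sum()
    levels = np.arange(len(p))
    bounds = [0] + sorted(int(t) for t in thresholds) + [len(p)]
    mu_total = (p * levels).sum()
    var = 0.0
    for lo, hi in zip(bounds[:-1], bounds[1:]):
        w = p[lo:hi].sum()          # class probability mass
        if w > 0:
            mu = (p[lo:hi] * levels[lo:hi]).sum() / w
            var += w * (mu - mu_total) ** 2
    return var

# Hypothetical flat 8-bit histogram split at its midpoint
hist = np.ones(256)
v = between_class_variance(hist, [128])
```

With k thresholds the exhaustive search grows combinatorially in the number of grey levels, which is why SIOAs are used to locate near-optimal threshold vectors far more cheaply.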