2022
DOI: 10.1007/s10489-022-04201-z
Opposition-based sine cosine optimizer utilizing refraction learning and variable neighborhood search for feature selection

Abstract: This paper proposes new improved binary versions of the Sine Cosine Algorithm (SCA) for the Feature Selection (FS) problem. FS is an essential machine learning and data mining task: choosing a subset of highly discriminating features from a noisy, irrelevant, high-dimensional, and redundant feature set to best represent a dataset. SCA is a recent metaheuristic algorithm whose search model is built on the sine and cosine trigonometric functions. It was initially proposed to tackle problems in the continuous domain…
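The sine/cosine search model the abstract refers to is the position-update rule from Mirjalili's original SCA (2016). Below is a minimal sketch of that update, combined with a sigmoid transfer function, one common way of binarizing continuous positions into 0/1 feature masks for FS. The agent counts, the toy fitness, and the function names are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(42)

def sca_update(X, best, t, T, a=2.0):
    """One SCA iteration (Mirjalili, 2016): each agent oscillates
    toward/around the best solution found so far."""
    r1 = a - t * (a / T)                      # amplitude, decays linearly to 0
    r2 = rng.uniform(0, 2 * np.pi, X.shape)   # movement direction
    r3 = rng.uniform(0, 2, X.shape)           # random weight on the best solution
    r4 = rng.uniform(0, 1, X.shape)           # sine/cosine switch
    dist = np.abs(r3 * best - X)
    return X + np.where(r4 < 0.5,
                        r1 * np.sin(r2) * dist,
                        r1 * np.cos(r2) * dist)

def binarize(X):
    """S-shaped (sigmoid) transfer function: continuous positions -> 0/1 masks."""
    prob = 1.0 / (1.0 + np.exp(-X))
    return (rng.uniform(size=X.shape) < prob).astype(int)

# Toy run: 10 agents, 8 candidate features; the placeholder fitness just
# prefers smaller subsets (a real FS fitness would also reward accuracy).
T, X = 30, rng.uniform(-3, 3, size=(10, 8))
best, best_score = X[0].copy(), np.inf
for t in range(T):
    X = sca_update(X, best, t, T)
    scores = binarize(X).sum(axis=1)
    if scores.min() < best_score:
        best_score, best = scores.min(), X[scores.argmin()].copy()
```

The paper's variants additionally layer opposition-based learning, refraction learning, and variable neighborhood search on top of this base update; those components are not shown here.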

Cited by 26 publications (13 citation statements)
References 123 publications
“…To validate the superiority of the proposed method, some state-of-the-art wrapper algorithms, including CMIM [29], JMI [30], DISR [31], mRMR [32], Relax-mRMR [33], IBSCA3 [34], and EOSSA [35], and some traditional filter algorithms, including Gini index, Fisher score, FS-OLS [36], ReliefF [37], and Kruskal-Wallis [38], are used for comparison. The LS-SVM algorithm is used to test the dataset after feature selection.…”
Section: Comparison Between the Proposed Methods and Some Typical Fea…
confidence: 99%
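A hedged sketch of the wrapper-style evaluation described in the statement above: a candidate feature subset is scored by the cross-validated accuracy of a classifier trained on just those features. scikit-learn's standard SVC stands in for LS-SVM here (LS-SVM itself is not part of scikit-learn), and the dataset and random mask are illustrative assumptions.

```python
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

def wrapper_fitness(mask, X, y):
    """Score a binary feature mask by the mean 5-fold CV accuracy of an
    SVM trained only on the selected columns; an empty mask scores zero."""
    if not mask.any():
        return 0.0
    return cross_val_score(SVC(), X[:, mask.astype(bool)], y, cv=5).mean()

X, y = load_breast_cancer(return_X_y=True)
rng = np.random.default_rng(0)
mask = (rng.uniform(size=X.shape[1]) < 0.5).astype(int)   # random ~50% subset
print(f"{mask.sum()} features selected, CV accuracy = {wrapper_fitness(mask, X, y):.3f}")
```

Filter methods such as Gini index, Fisher score, and ReliefF instead rank features by a statistic computed without training a classifier, which is why the quoted comparison treats the two families separately.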
“…
Reference | Optimization algorithm used | Application
Özbay [4] | Modified seahorse optimization algorithm | Engineering design problems
Gharehchopogh and Ibrikci [6] | Improved African vultures optimization algorithm | Multi-level thresholding image segmentation
Krishna et al. [11] | K-means and PSO algorithm | Clustering
Eluri and Devarakonda [12] | Chaotic binary pelican optimization algorithm | Feature selection
Eluri and Devarakonda [14] | Binary flamingo search algorithm and genetic algorithm | Feature selection
Abed-Alguni et al. [15] | Opposition-based sine cosine optimizer | Feature selection
Abed-Alguni et al. [16] | Improved binary djaya algorithm | Feature selection
Gharehchopogh et al. [17] | Dynamic Harris hawk optimization algorithm | Botnet detection in IoT
Cheng and Prayogo [26] | Symbiotic organisms search optimization algorithm | Engineering design problems
Kaur et al. [27] | Sandpaper optimization algorithm | Engineering design problems
Mohammadi-Balani et al. [29] | Golden eagle optimizer | Engineering design problems
Saremi et al. [30] | Grasshopper optimization algorithm | Structural design problems
Trojovská et al. …”
Section: Basics of Nature-Inspired Algorithms
confidence: 99%
“…Alternatively termed lifelong or incremental learning, continual learning stands at the core of crafting AI systems that are adept at navigating the complexities of dynamic and evolving operational landscapes [3]. These landscapes are marked by variable data distributions, the advent of novel tasks, and the gradual obsolescence of older tasks, compelling models to learn and adapt continuously without requiring reinitialization at every juncture of change [4][5][6]. The exposition highlights the criticality of continual learning in surmounting the constraints of static learning frameworks, particularly the phenomenon of catastrophic forgetting, where the assimilation of new information can inadvertently obliterate previously acquired knowledge [7].…”
Section: Introduction
confidence: 99%