2022
DOI: 10.53730/ijhs.v6ns1.6667
Hybrid slime mould - Grey wolf optimization algorithm for efficient feature selection

Abstract: Feature selection is an effective method for reducing the number of data features in order to improve machine learning classification performance. It is a high-level procedure for choosing a subset of relevant attributes from a high-dimensional dataset to boost the classifier's performance. In this work, we outline a typical feature selection problem in order to minimize the number of selected features while improving accuracy. Different classification datasets from the Mac…
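The wrapper-style objective the abstract alludes to, minimizing the number of selected features while improving accuracy, is commonly expressed as a weighted sum of classification error and feature ratio. Below is a minimal, hedged sketch of such a fitness function using a toy 1-NN classifier; the names (`one_nn_accuracy`, `fitness`, `alpha`) and the weighting are illustrative assumptions, not taken from the paper itself.

```python
# Illustrative sketch of a wrapper feature-selection fitness function,
# as typically minimized by metaheuristics such as SMA or GWO.
# All names and the alpha weighting are assumptions, not the paper's code.
import math

def one_nn_accuracy(train, test, mask):
    """1-NN accuracy computed using only features where mask[i] == 1."""
    idx = [i for i, m in enumerate(mask) if m]
    if not idx:
        return 0.0  # no features selected: nothing to classify with
    correct = 0
    for x, y in test:
        best_label, best_d = None, math.inf
        for tx, ty in train:
            d = sum((x[i] - tx[i]) ** 2 for i in idx)
            if d < best_d:
                best_label, best_d = ty, d
        correct += best_label == y
    return correct / len(test)

def fitness(train, test, mask, alpha=0.99):
    """Common FS objective: weighted error plus selected-feature ratio."""
    err = 1.0 - one_nn_accuracy(train, test, mask)
    ratio = sum(mask) / len(mask)
    return alpha * err + (1.0 - alpha) * ratio
```

Under this objective, a mask that drops an uninformative feature scores strictly lower (better) than the full feature set at equal accuracy, which is what drives the search toward compact subsets.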

Cited by 2 publications
(3 citation statements)
References 0 publications
“…4.2.9. Others Moreover, researchers have hybridized an SMA with a sine cosine algorithm [83], marine predators algorithm [85], particle swarm optimization [97], evolutionary algorithm [98], firefly algorithm [99], gray wolf optimization algorithm [100], gradient-based optimizer [101], quadratic approximation [102], tournament selection [103], artificial neural network [104], moth-flame optimization algorithm [105], pattern search algorithm [106], and support vector regression [107]. These hybrid SMA variants indicated their benefits, such as the good balance between exploration and exploitation capabilities, good convergence speed, ability to avoid premature convergence, and reduced computation time.…”
Section: Hybridization with the Artificial Bee Colony (ABC)
confidence: 99%
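The GWO component named in the statement above is the part these hybrids borrow for its exploration/exploitation balance: each candidate moves toward the average of three leader-guided estimates, with a coefficient `a` that shrinks over iterations to shift from exploration to exploitation. A hedged sketch of the canonical single-step update (per Mirjalili et al.'s GWO, not the hybrid in this paper) follows; all names are illustrative.

```python
# Illustrative one-step Grey Wolf Optimizer position update.
# wolves: list of positions; alpha/beta/delta: the three best positions;
# a: coefficient decreasing from 2 to 0 over the run (exploration -> exploitation).
import random

def gwo_step(wolves, alpha, beta, delta, a):
    """Move each wolf toward the mean of three leader-guided estimates."""
    new_positions = []
    for x in wolves:
        pos = []
        for d in range(len(x)):
            estimates = []
            for leader in (alpha, beta, delta):
                r1, r2 = random.random(), random.random()
                A = 2 * a * r1 - a      # |A| > 1 encourages exploration
                C = 2 * r2              # stochastic emphasis on the leader
                D = abs(C * leader[d] - x[d])
                estimates.append(leader[d] - A * D)
            pos.append(sum(estimates) / 3.0)
        new_positions.append(pos)
    return new_positions
```

With `a = 0` the update collapses each wolf onto the mean of the three leaders, which is the pure-exploitation limit; hybrids such as SMA-GWO interleave updates like this with SMA's own weight-based moves.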
“…Anji reddy Vaka et al [91] proposed a hybrid WOA-SMA and applied it to the BreakHis and IDC datasets to evaluate breast cancer classification. Khan AA et al [100] presented a hybrid of an SMA with GWO for feature selection purposes and evaluated it on UCI repository datasets by comparing it to other algorithms. Ewees AA et al [101] also evaluated the performance of the proposed GBOSMA on several benchmark datasets to solve feature selection problems, showing that the GBOSMA superseded the other models.…”
Section: Feature Selection (FS)
confidence: 99%
“…Sayed GI et al [154] introduced a pistachio species classification method on the basis of SMA. Khan AA et al [156] presented a hybrid of SMA with GWO for feature selection and evaluated it on UCI repository datasets by comparing with other algorithms. Wei X et al [157] proposed an SMA-VMD-WTD model to identify and eliminate transient electromagnetic signal noise.…”
Section: Machine Learning
confidence: 99%