2023
DOI: 10.1016/j.asoc.2023.110583
BE-GWO: Binary extremum-based grey wolf optimizer for discrete optimization problems

Cited by 14 publications (5 citation statements)
References 65 publications
“…Transfer functions are among the most widely used techniques because of their simplicity as an operator in the binary transformation [48]. Various transfer methods have been introduced to map a continuous search space to a binary one, such as the v-shaped [49], s-shaped [50], o-shaped [51], z-shaped [52], x-shaped [53], cosine [54], and hyperbolic tangent [55] transfer functions. In this study, the hyperbolic tangent transfer function is used to convert the continuous search space to a binary one, since it has advantages such as few control parameters, an effective convergence rate, and simplicity of implementation.…”
Section: Binary Transformation of the Original FBI
Mentioning, confidence: 99%
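To illustrate the hyperbolic tangent transfer function described in the quote above, the sketch below shows one common generic binarization rule (an assumption for illustration, not the cited paper's exact update): |tanh(x)| gives each continuous component a probability of becoming 1.

```python
import numpy as np

def tanh_transfer(x):
    """Hyperbolic tangent transfer function: |tanh(x)| maps any
    continuous value to a probability in [0, 1)."""
    return np.abs(np.tanh(x))

def binarize(positions, rng=None):
    """Turn a continuous position vector into a binary one: a bit is
    set to 1 when the transfer probability exceeds a uniform random
    draw. This is one common update rule; some variants flip the
    current bit instead of setting it directly."""
    rng = np.random.default_rng() if rng is None else rng
    return (rng.random(positions.shape) < tanh_transfer(positions)).astype(int)
```

Components far from zero are almost surely mapped to 1, while components near zero are almost surely mapped to 0, which is the characteristic behaviour this family of transfer functions shares.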
“…Wrapper methods, on the other hand, use a classification model: they generate candidate feature subsets, fit the model to each, and score each subset using the model's performance measure. These methods can use optimization approaches such as metaheuristic algorithms [53, 54]. Embedded methods combine the advantages of both, performing feature selection as part of the model-fitting step [37, 40].…”
Section: Related Work
Mentioning, confidence: 99%
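A minimal sketch of the wrapper idea described above, using a toy nearest-centroid classifier as the scoring model (both the classifier and the exhaustive search are illustrative assumptions; exhaustive enumeration is only feasible for a handful of features, which is exactly why metaheuristics such as binary GWO are used instead):

```python
import itertools
import numpy as np

def nearest_centroid_accuracy(X, y):
    """Toy scoring model for the wrapper: classify each sample by the
    nearest class centroid and return training accuracy."""
    classes = np.unique(y)
    centroids = np.stack([X[y == c].mean(axis=0) for c in classes])
    dist = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    pred = classes[dist.argmin(axis=1)]
    return float((pred == y).mean())

def wrapper_select(X, y, k):
    """Exhaustive wrapper search: evaluate every feature subset of
    size k with the classification model and keep the best one."""
    best_score, best_subset = -1.0, None
    for subset in itertools.combinations(range(X.shape[1]), k):
        score = nearest_centroid_accuracy(X[:, subset], y)
        if score > best_score:
            best_score, best_subset = score, subset
    return best_subset, best_score
```

Replacing the exhaustive `itertools.combinations` loop with a binary metaheuristic turns this into the wrapper-plus-optimizer setup the quoted passage refers to.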
“…Additionally, the distances between the distributions of each class within the overlapping areas were small, making accurate classification challenging. The accuracy achieved using the low-ranking feature combinations ranged from a minimum of 47.08 to a maximum of 54 (Table 7: classification results of the low-ranking six FPDn feature combinations).…”
Section: Optimal Sensor Selection Based on DNF
Mentioning, confidence: 99%
“…GWO has shown excellent results in solving combinatorial optimisation problems such as vehicle scheduling, path planning, and job shop scheduling [23-25]. Still, GWO tends to fall quickly into local optima [26]. Tawhid et al. proposed a hybrid grey wolf optimisation genetic algorithm (HGWOGA), which balances exploration and exploitation with the grey wolf optimisation algorithm; after dividing the population, a crossover operator is applied to the sub-populations to increase search diversity, and finally the mutation operator of the genetic algorithm is applied to the whole population to avoid premature convergence [27].…”
Section: Introduction
Mentioning, confidence: 99%
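The three-stage loop described in the quote above (GWO move, crossover on divided sub-populations, mutation on the whole population) can be sketched as follows. This is an assumed structure in the spirit of HGWOGA, not Tawhid et al.'s exact implementation; the objective, population sizes, and mutation rate are placeholders.

```python
import numpy as np

def sphere(x):
    """Benchmark objective: global minimum 0 at the origin."""
    return float(np.sum(x ** 2))

def hgwoga(obj, dim=5, n=20, iters=100, seed=0):
    """Hybrid GWO/GA sketch: (1) GWO update toward the three best
    wolves, (2) arithmetic crossover inside two sub-populations,
    (3) Gaussian mutation over the whole population."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(-5.0, 5.0, (n, dim))
    for t in range(iters):
        fit = np.array([obj(x) for x in X])
        alpha, beta, delta = X[np.argsort(fit)[:3]]
        a = 2.0 * (1.0 - t / iters)  # exploration coefficient decays 2 -> 0
        # (1) GWO update: average of the pulls toward alpha, beta, delta
        acc = np.zeros_like(X)
        for leader in (alpha, beta, delta):
            A = a * (2.0 * rng.random((n, dim)) - 1.0)
            C = 2.0 * rng.random((n, dim))
            acc += leader - A * np.abs(C * leader - X)
        X = acc / 3.0
        # (2) crossover: arithmetic blend of random pairs within each half
        half = n // 2
        for sub in (slice(0, half), slice(half, n)):
            part = X[sub]
            lam = rng.random((len(part), 1))
            X[sub] = lam * part + (1.0 - lam) * part[rng.permutation(len(part))]
        # (3) mutation on the whole population to avoid premature convergence
        mask = rng.random(X.shape) < 0.1
        X = X + mask * rng.normal(0.0, 0.1, X.shape)
    fit = np.array([obj(x) for x in X])
    return X[np.argmin(fit)], float(fit.min())
```

The crossover contracts each sub-population toward convex combinations of its members, while the unconditional mutation step re-injects diversity, matching the exploration/exploitation balance the quoted passage attributes to HGWOGA.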