2022
DOI: 10.1007/s42235-022-00207-y
A Hybrid Moth Flame Optimization Algorithm for Global Optimization

Cited by 38 publications (14 citation statements)
References 75 publications
“…In Sahoo and Saha [60], the authors proposed a new hybrid metaheuristic algorithm called h-MFOBOA by integrating MFO and BOA. To enhance the performance and solution quality of the MFO algorithm, they added the global and local phases of BOA after the position-update phase of MFO.…”
Section: Variants of the MFO Algorithm
Confidence: 99%
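The hybridization described above — running BOA's global and local search phases after MFO's spiral position update in each iteration — can be sketched as follows. This is a minimal illustrative sketch, not the authors' reference implementation: the function `h_mfoboa`, the sphere objective, and all parameter values (`p`, `c`, `a`, population size, iteration count) are assumptions chosen for a runnable example.

```python
import numpy as np

rng = np.random.default_rng(0)

def sphere(x):
    # Simple benchmark objective; global minimum 0 at the origin.
    return float(np.sum(x ** 2))

def h_mfoboa(obj, dim=5, n=20, iters=100, lb=-10.0, ub=10.0,
             p=0.8, c=0.01, a=0.1):
    """Hypothetical sketch of the MFO + BOA hybrid loop."""
    moths = rng.uniform(lb, ub, (n, dim))
    fitness = np.array([obj(m) for m in moths])
    order = np.argsort(fitness)
    flames, flame_fit = moths[order].copy(), fitness[order].copy()
    best, best_fit = flames[0].copy(), flame_fit[0]

    for t in range(iters):
        # --- MFO phase: logarithmic-spiral position update toward flames ---
        n_flames = max(1, round(n - t * (n - 1) / iters))  # flames shrink over time
        a_lin = -1.0 - t / iters                           # decreases linearly -1 -> -2
        for i in range(n):
            f = flames[min(i, n_flames - 1)]
            for j in range(dim):
                d = abs(f[j] - moths[i, j])
                r = (a_lin - 1) * rng.random() + 1
                moths[i, j] = d * np.exp(r) * np.cos(2 * np.pi * r) + f[j]
        moths = np.clip(moths, lb, ub)

        # --- BOA phases appended after the MFO update, per the description ---
        frag = c * (np.array([obj(m) for m in moths]) ** a)  # fragrance
        for i in range(n):
            r = rng.random()
            if rng.random() < p:      # global phase: move toward the best solution
                cand = moths[i] + (r ** 2 * best - moths[i]) * frag[i]
            else:                     # local phase: move relative to random peers
                j, k = rng.integers(0, n, 2)
                cand = moths[i] + (r ** 2 * moths[j] - moths[k]) * frag[i]
            cand = np.clip(cand, lb, ub)
            if obj(cand) < obj(moths[i]):  # greedy acceptance
                moths[i] = cand

        # Update flames: best n of the combined old-flame / new-moth pool
        fitness = np.array([obj(m) for m in moths])
        pool = np.vstack([flames, moths])
        pool_fit = np.concatenate([flame_fit, fitness])
        order = np.argsort(pool_fit)[:n]
        flames, flame_fit = pool[order].copy(), pool_fit[order].copy()
        if flame_fit[0] < best_fit:
            best, best_fit = flames[0].copy(), flame_fit[0]
    return best, best_fit

best, val = h_mfoboa(sphere)
```

The intent of the hybrid, as the excerpt states, is that BOA's global phase pulls moths toward the best-known solution (improving exploitation) while its local phase preserves diversity, compensating for MFO's tendency to stagnate once flames converge.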
“…As real-world optimization problems have grown in complexity and difficulty over the last several decades, the need for effective optimization methods has become more apparent. Meta-heuristic algorithms, a family of stochastic search approaches, perform well on multi-modal, non-continuous, and non-differentiable problems [3]. Optimization approaches can be classified into two primary categories: gradient-based techniques and meta-heuristic algorithms (MAs) [4].…”
Section: Introduction
Confidence: 99%
“…Ghafori and Gharehchopogh [18] conducted a comprehensive survey of the Spotted Hyena Optimizer (SHO) and identified directions for improving the SHO algorithm. Meta-heuristic algorithms can provide feasible results for highly complex constrained problems, e.g., Grey Wolf Optimizer (GWO) [19], Harris Hawks Optimization (HHO) [20], Flower Pollination Algorithm (FPA) [21], Social Network Search (SNS) [22], Gravitational Search Algorithm (GSA) [23], Dragonfly Algorithm (DA) [24], Alpine Skiing Optimization (ASO) [25], Elite Opposition-based Learning and Chaotic k-best Gravitational Search Strategy based Grey Wolf Optimizer (EOCSGWO) [26], Artificial Bee Colony (ABC) [27], Teaching–Learning-Based Optimization (TLBO) [28], and the hybrid Moth Flame Optimization and Butterfly Optimization Algorithm (h-MFOBOA) [29].…”
Section: Introduction
Confidence: 99%