2023
DOI: 10.1007/s11831-023-09883-3
Slime Mould Algorithm: A Comprehensive Survey of Its Variants and Applications

Abstract: Meta-heuristic algorithms hold a prominent position among academic researchers in fields such as science and engineering for solving optimization problems, as they can provide high-quality solutions to such problems. This paper investigates a recent meta-heuristic algorithm, the Slime Mould Algorithm (SMA), from different optimization perspectives. The SMA algorithm was inspired by the oscillating behavior of slime mould in nature. It has several new features with a unique mathematical mo…
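The abstract refers to SMA's oscillation-inspired mathematical model. For orientation, the following is a minimal Python sketch of the canonical SMA update rules as commonly described in the literature (Li et al., 2020); the sphere benchmark, function names, and parameter values are illustrative assumptions and are not taken from the survey itself.

```python
# Minimal sketch of the Slime Mould Algorithm (SMA) update rules.
# Illustrative assumptions: sphere benchmark, default parameters, NumPy RNG.
import numpy as np

def sphere(x):
    return np.sum(x ** 2)

def sma(obj, dim=10, pop=30, iters=200, lb=-10.0, ub=10.0, z=0.03, seed=0):
    rng = np.random.default_rng(seed)
    X = rng.uniform(lb, ub, (pop, dim))          # slime mould positions
    best_x, best_f = None, np.inf
    for t in range(iters):
        fit = np.array([obj(x) for x in X])
        order = np.argsort(fit)                  # rank by fitness (minimization)
        bF, wF = fit[order[0]], fit[order[-1]]
        if bF < best_f:
            best_f, best_x = bF, X[order[0]].copy()
        # Weight W models positive/negative feedback of the slime mould
        W = np.ones((pop, dim))
        denom = (bF - wF) - 1e-12                # keep sign; avoid division by zero
        for rank, i in enumerate(order):
            r = rng.random(dim)
            term = np.log10((bF - fit[i]) / denom + 1.0)
            W[i] = 1.0 + r * term if rank < pop // 2 else 1.0 - r * term
        a = np.arctanh(max(1.0 - (t + 1) / iters, 1e-12))  # vb range shrinks over time
        b = 1.0 - (t + 1) / iters                           # vc range shrinks to zero
        for i in range(pop):
            if rng.random() < z:                 # occasional random restart
                X[i] = rng.uniform(lb, ub, dim)
                continue
            p = np.tanh(abs(fit[i] - best_f))    # probability of approaching food
            vb = rng.uniform(-a, a, dim)
            vc = rng.uniform(-b, b, dim)
            if rng.random() < p:
                A, B = rng.integers(0, pop, 2)   # two random individuals
                X[i] = best_x + vb * (W[i] * X[A] - X[B])
            else:
                X[i] = vc * X[i]                 # oscillate around current position
            X[i] = np.clip(X[i], lb, ub)
    return best_x, best_f

if __name__ == "__main__":
    x, f = sma(sphere)
    print("best fitness:", f)
```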

Cited by 71 publications (12 citation statements) | References 167 publications
“…S. F. Gharehchopogh et al. presented an emerging metaheuristic algorithm called the "slime mold algorithm" (SMA) for research and analysis. According to their statistics, by 2020 SMA-based research had been published in the major scientific and technological databases, covering four areas: hybridization, development, deformation, and optimization of the SMA algorithm, with the most widespread application being optimization problem solving [21]. M. Ayar et al. proposed a novel feature selection method based on chaos partitioning for rapid automatic identification of arrhythmias.…”
Section: Related Work
confidence: 99%
“…The strategy of the tunicates and their search mechanism in the process of finding food sources and foraging have been the main inspirations in the design of the Tunicate Swarm Algorithm (TSA) [17]. Some other swarm-based methods are the White Shark Optimizer (WSO) [18], Reptile Search Algorithm (RSA) [19], Raccoon Optimization Algorithm (ROA) [20], African Vultures Optimization Algorithm (AVOA) [21], Farmland Fertility Algorithm (FFA) [22], Slime Mould Algorithm (SMA) [23], Mountain Gazelle Optimizer (MGO) [24], Sparrow Search Algorithm (SSA) [25], Whale Optimization Algorithm (WOA) [26], Artificial Gorilla Troops Optimizer (GTO) [27], and Pelican Optimization Algorithm (POA) [28].…”
Section: Literature Review
confidence: 99%
“…The algorithmic-level approach aims to enhance classification results and reduce data imbalance through optimization techniques. Several studies have investigated algorithm optimization, with notable examples including the improved Harris Hawks optimization and opposition-based learning (IHHOOBL) algorithm, which is specifically designed to detect communities in social networks [11]. Several algorithms have been developed to address optimization problems, including the quantum-learning, Gaussian, Cauchy, and tunicate swarm (QLGCTSA) algorithm, which is a general-purpose algorithm [12]; the slime mould algorithm (SMA), designed to simulate biological wave optimization [13]; the sparrow search algorithm (SSA), developed specifically for optimization problems [14]; the tree seed algorithm (TSA), which identifies tree and seed relationships for optimization [15]; and QC-inspired metaheuristic algorithms, which aim to solve numerical optimization problems [16]. However, previous studies have shown that optimizing these algorithms, whether through feature selection [17] or for improved accuracy, does not adequately address the challenge of imbalanced data arising from oversampling or undersampling.…”
Section: Introduction
confidence: 99%