2023
DOI: 10.1007/978-3-031-36622-2_14

An Atomic Retrospective Learning Bare Bone Particle Swarm Optimization

Cited by 2 publications (1 citation statement) · References 17 publications
“…As the dimension increases, the distance between valid data points becomes extremely dispersed, which affects the convergence speed and stability of the algorithm. When solving high-dimensional optimization problems, although the Honey Badger Algorithm (HBA) [32] and the Atomic Retrospective Learning Bare Bone Particle Swarm Optimization (ARBBPSO) [33] converge fast in local search, their global search capability is insufficient and they easily fall into local optima. Due to its insufficient exploitation ability, the Chimp Optimization Algorithm (ChOA) [34], which was inspired by the behavior of chimps, showed low convergence accuracy on high-dimensional optimization problems.…”
Section: Introduction (mentioning)
confidence: 99%
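For context on the quoted comparison: ARBBPSO builds on the bare-bones particle swarm optimization update, in which each particle resamples its position from a Gaussian centered between its personal best and the global best. The sketch below is a minimal canonical bare-bones PSO baseline, not the cited ARBBPSO itself (its atomic retrospective learning strategy is not described on this page); the function names, parameters, and the sphere test function are illustrative assumptions.

import numpy as np

def bare_bones_pso(objective, dim, n_particles=30, n_iters=200,
                   bounds=(-5.0, 5.0), seed=0):
    """Minimal canonical bare-bones PSO sketch: each coordinate of a particle's
    next position is drawn from a Gaussian whose mean is the midpoint of its
    personal best and the global best, and whose std is their distance."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    pos = rng.uniform(lo, hi, size=(n_particles, dim))
    pbest = pos.copy()
    pbest_val = np.array([objective(p) for p in pos])
    g_idx = np.argmin(pbest_val)
    gbest, gbest_val = pbest[g_idx].copy(), pbest_val[g_idx]

    for _ in range(n_iters):
        mean = 0.5 * (pbest + gbest)          # midpoint of personal and global best
        std = np.abs(pbest - gbest) + 1e-12   # per-dimension spread (avoid zero std)
        pos = np.clip(rng.normal(mean, std), lo, hi)
        vals = np.array([objective(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved] = pos[improved]
        pbest_val[improved] = vals[improved]
        g_idx = np.argmin(pbest_val)
        if pbest_val[g_idx] < gbest_val:
            gbest, gbest_val = pbest[g_idx].copy(), pbest_val[g_idx]
    return gbest, gbest_val

# Example: minimize the 10-dimensional sphere function.
best_x, best_f = bare_bones_pso(lambda x: float(np.sum(x**2)), dim=10)
print(best_f)

In this baseline the sampling spread shrinks as personal bests approach the global best, which is consistent with the quoted observation that bare-bones variants converge quickly in local search but can stagnate in local optima on high-dimensional problems.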