2011
DOI: 10.1016/j.cam.2010.11.021
A rank based particle swarm optimization algorithm with dynamic adaptation

Cited by 61 publications (74 citation statements)
References 17 publications
“…Their definitions as well as source codes can be obtained through online sources. Comparative results for different algorithms are also publicly available. PSO-NBA was applied to the 50- and 100-dimensional instances of the test problems, adopting the experimental setting in [14].…”
Section: Further Experiments (mentioning)
confidence: 99%
“…There are also works that propose rank-based PSO variants in the literature. In [1], the presented algorithm uses only a fraction of the particles to update their velocities. This approach is based solely on the global-best (gbest) PSO model, neglecting neighborhood topologies.…”
Section: Introduction (mentioning)
confidence: 99%
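The rank-based scheme described in the excerpt above can be sketched as follows. This is only an illustration of the general idea: the moving fraction `frac`, the coefficient values, and the choice to move the best-ranked particles are assumptions, not the exact rules of the algorithm in [1].

```python
import numpy as np

def sphere(x):
    """Toy minimization objective used only to exercise the sketch."""
    return float(np.sum(x * x))

def rank_based_gbest_pso(fitness, dim=5, n_particles=20, iters=100,
                         frac=0.5, w=0.7, c1=1.5, c2=1.5, seed=0):
    """gbest PSO in which only a ranked fraction of the swarm updates
    its velocities each iteration (illustrative sketch; parameters and
    ranking rule are assumptions)."""
    rng = rng_init = np.random.default_rng(seed)
    pos = rng.uniform(-5, 5, (n_particles, dim))
    vel = np.zeros((n_particles, dim))
    pbest = pos.copy()
    pbest_val = np.array([fitness(p) for p in pos])
    g = int(np.argmin(pbest_val))
    gbest, gbest_val = pbest[g].copy(), pbest_val[g]
    for _ in range(iters):
        vals = np.array([fitness(p) for p in pos])
        order = np.argsort(vals)                          # best first (minimization)
        movers = order[: max(1, int(frac * n_particles))] # only this fraction moves
        r1 = rng.random((len(movers), dim))
        r2 = rng.random((len(movers), dim))
        vel[movers] = (w * vel[movers]
                       + c1 * r1 * (pbest[movers] - pos[movers])
                       + c2 * r2 * (gbest - pos[movers]))
        pos[movers] += vel[movers]
        new_vals = np.array([fitness(p) for p in pos])
        better = new_vals < pbest_val                     # update personal bests
        pbest[better] = pos[better]
        pbest_val[better] = new_vals[better]
        g = int(np.argmin(pbest_val))                     # update global best
        if pbest_val[g] < gbest_val:
            gbest, gbest_val = pbest[g].copy(), pbest_val[g]
    return gbest, gbest_val
```

Because only the ranked subset moves, the per-iteration function-evaluation and update cost drops relative to a full-swarm gbest PSO, at the price of less exploration by the stationary particles.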
“…A comprehensive list of many of the variants can be found in [21]. The variants adopted for comparison in this work are the rank-based PSO (PSO_rank) in [1], Adaptive Inertia Weight PSO (AIWPSO) in [18], Adaptive PSO (APSO) in [2], Natural Exponential inertia weight PSO (e1-PSO) in [7], Decreasing exponential function PSO (def-PSO) in [8], and Chaotic Random Inertia Weight PSO (CRIW-PSO) in [10]. All of these variants have proved superior to their competitors in their respective capacities.…”
Section: A. PSO Variants (mentioning)
confidence: 99%
“…Over the years, many researchers have made tremendous efforts to improve the effectiveness, efficiency, and robustness of the PSO technique in solving optimization problems. Research in this direction includes the introduction of V_max [6], the inertia weight [23], and the constriction factor [4] into PSO, as well as improvements on the inertia weight [1,3,7,8,11,17,18,22,24], PSO with mutation operators [2,5,11,15,16], hybridization of PSO with other algorithms [19], and the development of other PSO variants [21]. Despite these improvements, virtually none of the existing PSO variants has been able to solve optimization problems with dimensionality as high as 2000.…”
Section: Introduction (mentioning)
confidence: 99%
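Two of the classic mechanisms listed in the excerpt above, the inertia weight [23] and the V_max clamp [6], act directly on the canonical velocity update. A minimal sketch, using the coefficient values commonly quoted in the PSO literature purely for illustration:

```python
import numpy as np

def velocity_update(v, x, pbest, gbest, w=0.729, c1=1.49445, c2=1.49445,
                    v_max=4.0, rng=None):
    """Canonical PSO velocity update with an inertia weight `w` and a
    component-wise V_max clamp. Coefficient defaults are common
    literature choices, not values prescribed by the surveyed papers."""
    rng = rng or np.random.default_rng()
    r1 = rng.random(x.shape)   # stochastic weight on the cognitive term
    r2 = rng.random(x.shape)   # stochastic weight on the social term
    v_new = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    return np.clip(v_new, -v_max, v_max)   # V_max keeps steps bounded
```

The inertia weight scales the memory of the previous velocity (lower values favor exploitation), while the clamp prevents velocity explosion on strongly attractive landscapes.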
“…The neural network classifier is trained using a particle swarm optimization algorithm. In [7,21,22] the corresponding authors demonstrate hybrid PSO-based neural-network training algorithms. A population-based search algorithm is used to search for optimized synaptic weights for a multilayer perceptron.…”
Section: Neuro Swarm Optimization Technique (mentioning)
confidence: 99%
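The hybrid scheme described in the excerpt above treats the flattened synaptic weight vector as the PSO search space and the network's training loss as the fitness function. A deliberately small sketch on the XOR problem; the 2-2-1 architecture, swarm size, and iteration budget are illustrative choices, not the setups used in [7,21,22]:

```python
import numpy as np

# Toy dataset: XOR, a classic non-linearly-separable classification task.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0], dtype=float)

def mlp_loss(w):
    """MSE of a 2-2-1 MLP whose 9 parameters are packed in vector `w`.
    This loss is the PSO fitness: lower is better."""
    W1, b1 = w[:4].reshape(2, 2), w[4:6]     # hidden layer: 2 -> 2
    W2, b2 = w[6:8], w[8]                    # output layer: 2 -> 1
    h = np.tanh(X @ W1 + b1)
    out = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))   # sigmoid output unit
    return float(np.mean((out - y) ** 2))

def pso_train(loss, dim=9, n=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=1):
    """Plain gbest PSO over the weight vector (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    pos = rng.uniform(-2, 2, (n, dim))
    vel = np.zeros_like(pos)
    pbest, pbest_val = pos.copy(), np.array([loss(p) for p in pos])
    g = int(np.argmin(pbest_val))
    gbest, gbest_val = pbest[g].copy(), pbest_val[g]
    for _ in range(iters):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos += vel
        vals = np.array([loss(p) for p in pos])
        better = vals < pbest_val
        pbest[better], pbest_val[better] = pos[better], vals[better]
        g = int(np.argmin(pbest_val))
        if pbest_val[g] < gbest_val:
            gbest, gbest_val = pbest[g].copy(), pbest_val[g]
    return gbest, gbest_val
```

Unlike gradient-based training, this search needs no backpropagation and only evaluates the loss, which is why population-based methods are attractive for non-differentiable or noisy training objectives.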