2006
DOI: 10.1007/11844297_92
A Particle Swarm Optimizer for Constrained Numerical Optimization

Cited by 26 publications (29 citation statements). References 4 publications.
“…The constraint handling was done with an adaptive penalty function. Cagnina, Esquivel and Coello [12] presented a combination of global-local best PSO with inertia weight. The authors also added a dynamic mutation operator to promote diversity.…”
Section: PSO Concepts
confidence: 99%
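The global-local best PSO with inertia weight mentioned above can be sketched as a velocity update that blends the personal, neighborhood (local) and global best attractors. This is a minimal illustrative sketch; the coefficient values and the exact combination are assumptions, not the settings from the cited paper.

```python
import random

def update_velocity(v, x, pbest, lbest, gbest, w=0.7, c1=1.4, c2=1.4, c3=1.4):
    """Global-local best velocity update with inertia weight w.

    Blends three attraction terms: personal best (pbest), local
    neighborhood best (lbest) and global best (gbest). The coefficients
    c1..c3 and w are illustrative defaults, not the paper's values.
    """
    r1, r2, r3 = (random.random() for _ in range(3))
    return (w * v
            + c1 * r1 * (pbest - x)
            + c2 * r2 * (lbest - x)
            + c3 * r3 * (gbest - x))
```

With all three bests ahead of the particle, every attraction term pulls in the same direction, so the resulting velocity is positive.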
“…For the first experiment, we included some representative convergence graphs to show the effect of the modifications in our proposed PSO. The state-of-the-art approaches selected for comparison in the second experiment were the PSO proposed by Toscano & Coello [8], labeled approach "A"; the PSO of Li et al. [11], called approach "B" (they reported results on only seven problems); the Lu & Chen approach [13], named approach "C"; and finally the Cagnina et al. approach [12], called approach "D".…”
Section: Experimental Design
confidence: 99%
“…There exists a tradeoff between the computational cost and the storage capacity of nondominated solutions. The simplest way is to allocate a sufficiently large archive and adopt an elitist policy to update it [43].…”
Section: (2) External Archive
confidence: 99%
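The elitist archive policy described in the excerpt can be illustrated with a Pareto-dominance update: reject a candidate dominated by any archive member, otherwise evict the members it dominates and insert it. This is a sketch under assumed conventions (minimization, a fixed capacity that simply refuses inserts when full); real implementations typically prune by crowding instead.

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def update_archive(archive, candidate, capacity=100):
    """Elitist external-archive update (illustrative sketch).

    Keeps only mutually nondominated solutions: a dominated candidate is
    rejected, a nondominated one replaces any members it dominates.
    """
    if any(dominates(m, candidate) for m in archive):
        return archive
    archive = [m for m in archive if not dominates(candidate, m)]
    if len(archive) < capacity:
        archive.append(candidate)
    return archive
```

A larger capacity keeps more of the nondominated front at the cost of more dominance checks per update, which is exactly the storage/computation tradeoff the excerpt points at.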
“…The nearest-neighbor particles are grouped together according to the distance computed on the first objective function's fitness values, which can also be viewed as a clustering operation; the local optimum among the neighbors is then determined by the fitness value of the second objective function. In [42,43,59], Hu used an extended memory as the candidate pool, with the same selection mechanism.…”
Section: Best Particle Selection
confidence: 99%
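The two-stage selection the excerpt describes, clustering by distance in the first objective's fitness and then picking the neighbor that is best on the second objective, can be sketched as follows. The function name, the parameter k, and the use of absolute fitness distance are assumptions for illustration, not details from the cited works.

```python
def local_best(particles, f1, f2, i, k=3):
    """Pick a local best for particle i (illustrative sketch).

    Stage 1: take the k particles closest to particle i in terms of the
    first objective's fitness value (a simple clustering step).
    Stage 2: among those neighbors, return the one minimizing the
    second objective's fitness value.
    """
    ref = f1(particles[i])
    neighbors = sorted(particles, key=lambda p: abs(f1(p) - ref))[:k]
    return min(neighbors, key=f2)
```

For scalar particles with f1 the identity and f2 a parabola centered at 2, the neighbors of particle 0 are the three smallest values, and the one nearest the parabola's minimum wins.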
“…To that end, our approach contains a constraint-handling technique as well as a mechanism to update the velocity and position of the particles [2], which is extended by adding a bi-population and a shake-mechanism as a way to avoid premature convergence.…”
Section: Introduction
confidence: 99%
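A single PSO step in the spirit of the velocity/position mechanism cited as [2] can be sketched as below. The paper's actual shake-mechanism is not specified in this excerpt, so the uniform perturbation applied on stagnation here is a hypothetical stand-in, as are the coefficient values.

```python
import random

def step(x, v, pbest, gbest, w=0.72, c1=1.49, c2=1.49,
         stagnant=False, shake=0.1):
    """One PSO velocity/position update (illustrative sketch).

    Standard inertia-weight update followed by an optional 'shake':
    when the swarm is flagged as stagnant, the position gets a small
    uniform perturbation (a hypothetical stand-in for the paper's
    shake-mechanism) to help escape premature convergence.
    """
    r1, r2 = random.random(), random.random()
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    x = x + v
    if stagnant:
        x += random.uniform(-shake, shake)  # jolt the particle off its basin
    return x, v
```

Without the shake, the position moves exactly by the new velocity; the perturbation only fires when stagnation is detected, so normal convergence behavior is unaffected.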