Proceedings of the 2004 Congress on Evolutionary Computation (IEEE Cat. No.04TH8753)
DOI: 10.1109/cec.2004.1331061

Covering Pareto-optimal fronts by subswarms in multi-objective particle swarm optimization

Abstract: Covering the whole set of Pareto-optimal solutions is a desired task of multi-objective optimization methods. Because in general it is not possible to determine this set, a restricted number of solutions is typically delivered in the output to decision makers. In this paper, we propose a new method using multi-objective particle swarm optimization to cover the Pareto-optimal front. The method works in two phases. In phase 1 the goal is to obtain a good approximation of the Pareto front. In a second ru…


Cited by 63 publications (24 citation statements) | References 10 publications
“…Clustering is applied to the elite particles in the archive: when the maximum archive size is reached, the extra particles are removed. Hsieh [46] also introduced the clustering method to construct the external archive. In further work, Mostaghim [47] added a multi-level subdivision scheme for iterative division of the search space into subspaces (boxes).…”
Section: (2) External Archive
confidence: 99%
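The archive-truncation idea quoted above can be illustrated with a minimal sketch. The function below is hypothetical (names and the greedy merge rule are illustrative, not taken from [46] or [47]): when a bounded external archive overflows, it repeatedly finds the two closest members in objective space and drops one of them, so one representative per crowded region survives.

```python
def truncate_archive(archive, max_size):
    """Reduce an over-full archive to max_size by greedy clustering:
    repeatedly find the two closest members in objective space and
    drop one of them. Each member is a tuple of objective values.
    A hypothetical sketch of clustering-based archive maintenance."""
    archive = list(archive)  # copy so the caller's list is untouched
    while len(archive) > max_size:
        best = (None, None, float("inf"))
        for i in range(len(archive)):
            for j in range(i + 1, len(archive)):
                # squared Euclidean distance in objective space
                d = sum((a - b) ** 2 for a, b in zip(archive[i], archive[j]))
                if d < best[2]:
                    best = (i, j, d)
        # remove one member of the closest pair (the "extra" particle)
        archive.pop(best[1])
    return archive
```

Dropping only one member of each closest pair keeps isolated (well-spread) solutions in the archive, which is the point of clustering-based truncation.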
“…A turbulence factor, which adds a random value to the current position, is included in the position-update equation. Based on this, a further improvement is proposed in [46]. The optimization process is achieved in two steps [50].…”
Section: Density Metric and Diversity Maintaining
confidence: 99%
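The turbulence factor described above can be sketched as a small random perturbation added to the standard PSO position update. This is a minimal illustration; the function name and the uniform perturbation range are assumptions, not the exact formulation from the cited papers.

```python
import random

def update_position(position, velocity, turbulence=0.1):
    """PSO position update with an added turbulence term: each
    coordinate gets x + v plus a small uniform random perturbation,
    which helps particles escape local optima. Illustrative sketch."""
    return [x + v + random.uniform(-turbulence, turbulence)
            for x, v in zip(position, velocity)]
```

With `turbulence=0.0` the update reduces to the plain `x + v` rule, so the turbulence term is a strict add-on to the standard equation.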
“…If there are multiple (competing) objectives, evaluation of optimality becomes more complicated. There are generally two approaches to multi-objective optimization: 1) combining fitness functions and 2) referring to a Pareto front [9,19,20]. A classical way of combining multiple objectives into a single fitness function is a weighted sum of fitness functions (one from each objective), where the result of the overall optimization can depend on the choice of weighting.…”
Section: Metaheuristic Optimization Methods
confidence: 99%
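The two approaches contrasted above can be sketched side by side, assuming minimization: a weighted sum collapses the objectives into one scalar (whose optimum depends on the chosen weights), while the Pareto approach only compares solutions via dominance. Function names here are illustrative.

```python
def weighted_sum(objectives, weights):
    """Approach 1: combine objective values into a single scalar
    fitness. The resulting optimum depends on the chosen weights."""
    return sum(w * f for w, f in zip(weights, objectives))

def dominates(a, b):
    """Approach 2: Pareto dominance (minimization). True if a is no
    worse than b in every objective and strictly better in at least one."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))
```

Dominance induces only a partial order, which is why Pareto-based methods return a *set* of non-dominated solutions rather than a single optimum.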
“…They also introduced a new method which uses the property of moving particles in MOPSO to divide the population into sub-swarms (Mostaghim and Teich, 2004), trying to cover the gaps between the non-dominated solutions found in the initial run. Wei and Wang (2006) proposed a novel MOPSO algorithm in which a three-parent crossover operator was suggested in order to move the solutions toward the feasible region, and a dynamically changing inertia weight was designed to keep the diversity of the swarm and escape from local optima.…”
Section: Introduction
confidence: 99%
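The sub-swarm idea quoted above — splitting the population so each group refines one region of an approximated front — can be sketched as assigning each particle to its nearest seed solution. This is a hypothetical illustration of the general idea, not the exact scheme of Mostaghim and Teich (2004).

```python
def divide_into_subswarms(particles, seeds):
    """Assign each particle to the sub-swarm of its nearest seed
    (e.g. a non-dominated solution from an initial run), so each
    sub-swarm can search one region of the front. Illustrative sketch."""
    swarms = {i: [] for i in range(len(seeds))}
    for p in particles:
        # index of the seed closest to p (squared Euclidean distance)
        nearest = min(range(len(seeds)),
                      key=lambda i: sum((a - b) ** 2
                                        for a, b in zip(p, seeds[i])))
        swarms[nearest].append(p)
    return swarms
```

Seeding the sub-swarms from gaps between non-dominated solutions would then direct each group toward an uncovered part of the front.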