2007 IEEE Congress on Evolutionary Computation
DOI: 10.1109/cec.2007.4424985

MSOPS-II: A general-purpose Many-Objective optimiser

Abstract: Existing evolutionary methods capable of true Many-Objective optimisation have been limited in their application: for example, either initial search directions need to be specified a priori, or the use of hypervolume limits the search in practice to fewer than 10 objective dimensions. This paper describes two extensions to the Multiple Single Objective Pareto Sampling (MSOPS) algorithm. The first provides automatic target vector generation, removing the requirement for initial a priori designer intervention; and …
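The abstract refers to ranking a population against a set of target vectors. As a rough illustration of the idea behind MSOPS-style aggregation (not the exact MSOPS-II procedure; the function name, the random data, and the uniform targets below are assumptions for illustration), a weighted min-max score can be computed for every candidate against every target vector:

```python
import numpy as np

def weighted_min_max(objectives, targets):
    """Score every candidate against every target vector.

    objectives : (N, M) array of minimisation objective values.
    targets    : (T, M) array of strictly positive weight/target vectors.

    Returns an (N, T) matrix where score[n, t] = max_m targets[t, m] * objectives[n, m];
    lower is better for each target. This is the generic weighted min-max
    aggregation used in MSOPS-style ranking, not necessarily the exact
    MSOPS-II formulation.
    """
    return np.max(objectives[:, None, :] * targets[None, :, :], axis=2)

# Illustrative usage: 5 candidates, 6 objectives, 3 hypothetical target vectors.
rng = np.random.default_rng(0)
F = rng.random((5, 6))
W = rng.random((3, 6)) + 0.1   # assumed strictly positive weights
scores = weighted_min_max(F, W)
# Each column of `scores` can be ranked independently, giving one
# single-objective ranking per target vector.
print(scores.round(3))
```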

Citations: cited by 133 publications (45 citation statements)
References: 6 publications (8 reference statements)
“…is the i-th value of the j-th point in R */ Operation 2: update archive: delete duplicate candidate solutions in A; delete dominated candidate solutions in A. Then, the contributing solutions in A are detected and copied to A_con, where the "contributing solutions" are those closest to at least one point in R according to the definition of IGD-NS. As an example, Fig.…”
Section: A Reference Point Adaptation Method
Mentioning confidence: 99%
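The quoted update operation hinges on identifying "contributing solutions": archive members that are closest to at least one reference point. A minimal sketch of that detection step is given below, assuming Euclidean distance and reusing the names A, R and A_con from the quote; it illustrates the quoted definition, not the cited paper's implementation.

```python
import numpy as np

def contributing_solutions(A, R):
    """Return the indices of archive members that are nearest to at least
    one reference point, i.e. the 'contributing solutions' in the sense of
    the quoted IGD-NS definition. Euclidean distance is an assumption.

    A : (n, m) array of archive objective vectors.
    R : (r, m) array of reference points.
    """
    # dist[i, j] = distance from reference point i to archive member j
    dist = np.linalg.norm(R[:, None, :] - A[None, :, :], axis=2)
    # keep each archive member that is the closest one to some reference point
    return np.unique(np.argmin(dist, axis=1))

# A_con would be A[contributing_solutions(A, R)], computed after duplicate and
# dominated members of A have been removed as in the quoted operation.
```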
“…On one hand, some decomposition based MOEAs such as MOGLS [9], C-MOGA [10], MSOPS-II [11], MOEA/D [12] and RVEA [13] decompose an MOP into a number of SOPs via objective function aggregations, such that the candidate solutions are able to efficiently converge to the optimum of each SOP without considering the conflicts between different objectives. On the other hand, some other decomposition based MOEAs such as MOEA/D-M2M [14], IM-MOEA [15], NSGA-III [16] and SPEA/R [17] decompose an MOP into several simpler MOPs by partitioning the objective space into a number of subspaces.…”
Mentioning confidence: 99%
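For readers unfamiliar with the first family mentioned in the quote, the aggregation step that turns an MOP into scalar sub-problems can be sketched as follows. The Tchebycheff and weighted-sum functions below are the textbook forms used by MOEA/D-style decomposition; they are a generic illustration, not the exact formulations of the algorithms listed above.

```python
import numpy as np

def tchebycheff(f, lam, z_star):
    """Tchebycheff aggregation: g(x | lambda, z*) = max_i lambda_i * |f_i(x) - z*_i|.

    f      : objective vector of one candidate (minimisation).
    lam    : weight vector defining one scalar sub-problem.
    z_star : current ideal (best-so-far) point, one entry per objective.
    """
    return float(np.max(lam * np.abs(f - z_star)))

def weighted_sum(f, lam):
    """Simpler weighted-sum aggregation; only reliable on convex fronts."""
    return float(np.dot(lam, f))

# Each weight vector lam defines one single-objective sub-problem, so a set
# of N weight vectors decomposes the MOP into N sub-problems solved jointly.
```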
“…The parameters of NSGA-II, MSOPS-II, MOEA/D, MOEA/D-DE, and MOEA/D-STM were set according to [15], [20], [25], [28], and [40], respectively (see Tables I and II). The setting of the N weight vectors (λ¹, .…”
Section: B Parameter Settings
Mentioning confidence: 99%
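The quoted settings themselves are not reproduced here, but the N weight vectors λ¹, …, λᴺ referred to in the quote are commonly generated with a simplex-lattice (Das and Dennis) design. The sketch below shows that construction under assumed parameter values; the helper name simplex_lattice is illustrative.

```python
from itertools import combinations
import numpy as np

def simplex_lattice(m, h):
    """Generate weight vectors on the unit simplex via the Das-Dennis
    simplex-lattice design, a common way to obtain N weight vectors
    (the quoted paper's exact settings are not reproduced here).

    m : number of objectives.
    h : number of divisions along each objective axis.
    Returns an (N, m) array with N = C(h + m - 1, m - 1) vectors.
    """
    vectors = []
    # Stars-and-bars: choose m-1 "cut" positions among h+m-1 slots;
    # the gap sizes between cuts give the integer coordinates summing to h.
    for cuts in combinations(range(h + m - 1), m - 1):
        prev, coords = -1, []
        for c in cuts:
            coords.append(c - prev - 1)
            prev = c
        coords.append(h + m - 1 - prev - 1)
        vectors.append([c / h for c in coords])
    return np.array(vectors)

# Example: 3 objectives with 12 divisions gives C(14, 2) = 91 weight vectors.
W = simplex_lattice(3, 12)
assert W.shape == (91, 3) and np.allclose(W.sum(axis=1), 1.0)
```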
“…Based on the above two requirements for selection, the current MOEAs can be categorized into the domination-based (see [15], [42], [44]), the indicator-based (see [2], [3], [24], [43]), and the decomposition-based MOEAs (see [20], [21], [35], [40]). A representative of decomposition-based MOEAs is the MOEA based on decomposition (MOEA/D) [40], which can be regarded as a generalization of cMOGA [31].…”
Section: Introduction
Mentioning confidence: 99%
“…A fundamental example of this is that the dominance-based selection mechanisms that perform well on multi-objective problems (two or three objectives) [18, 20, 79] often do not perform well when four or more objectives are considered, as shown in [3, 36, 37, 41, 46, 50, 63, 73, 103]. Selection based on dominance is inefficient at producing a strong selection pressure toward the Pareto-optimal front in the presence of many objectives, as throughout the optimisation process it is likely that the population will consist entirely of non-dominated solutions.…”
Section: Introduction
Mentioning confidence: 99%
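The loss of selection pressure described in the quote can be made concrete with a small, purely illustrative experiment: for uniformly random objective vectors, the fraction of the population that is mutually non-dominated grows quickly with the number of objectives. The population size and the uniform sampling below are assumptions made only for this illustration.

```python
import numpy as np

def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimisation)."""
    return np.all(a <= b) and np.any(a < b)

def nondominated_fraction(n_points, n_objectives, rng):
    """Fraction of a uniformly random population that is non-dominated."""
    F = rng.random((n_points, n_objectives))
    flags = np.ones(n_points, dtype=bool)
    for i in range(n_points):
        for j in range(n_points):
            if i != j and dominates(F[j], F[i]):
                flags[i] = False
                break
    return flags.mean()

rng = np.random.default_rng(1)
for m in (2, 3, 5, 10, 15):
    # With more objectives nearly the whole population becomes non-dominated,
    # so dominance alone can no longer discriminate between candidates.
    print(m, nondominated_fraction(200, m, rng))
```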