2020
DOI: 10.1109/TEVC.2019.2918140

An Evolutionary Algorithm for Large-Scale Sparse Multiobjective Optimization Problems

Abstract: In the last two decades, a variety of different types of multi-objective optimization problems (MOPs) have been extensively investigated in the evolutionary computation community. However, most existing evolutionary algorithms encounter difficulties in dealing with MOPs whose Pareto optimal solutions are sparse (i.e., most decision variables of the optimal solutions are zero), especially when the number of decision variables is large. Such large-scale sparse MOPs exist in a wide range of applications, for exam…
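As a rough illustration of the sparsity notion in the abstract (a hypothetical toy problem, not one of the paper's benchmarks), a bi-objective problem can be constructed whose Pareto optimal solutions keep only the first decision variable nonzero:

```python
import numpy as np

def toy_sparse_mop(x):
    """Illustrative bi-objective problem (not from the paper): both objectives
    share a penalty on x[1:], so Pareto optimal solutions have x[1:] == 0 and
    trade off f1 against f2 only through x[0] in [0, 1]."""
    tail = float(np.sum(np.asarray(x[1:]) ** 2))  # penalty shared by both objectives
    f1 = x[0] ** 2 + tail
    f2 = (x[0] - 1.0) ** 2 + tail
    return f1, f2

# A sparse Pareto optimal solution with 1000 decision variables:
x = np.zeros(1000)
x[0] = 0.3                  # the only nonzero variable
print(toy_sparse_mop(x))    # -> (0.09, 0.49)
```

Every nonzero entry in x[1:] worsens both objectives simultaneously, so an optimizer that cannot drive most variables to exactly zero never reaches the Pareto front of such a problem.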

Cited by 271 publications (82 citation statements)
References: 69 publications
“…As reported in [42], a hybrid representation of solution is effective for solving sparse LMOPs, where each solution is represented by a binary vector denoting the mask and a real vector denoting the decision variables. Following this idea, the proposed MOEA/PSL adopts RBM and DAE to learn the Pareto optimal subspace from the solutions with hybrid representation.…”
Section: B. Existing Pareto Optimal Subspace Learning Approaches
confidence: 99%
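A minimal sketch of the hybrid representation quoted above, assuming a plain NumPy encoding (the names random_hybrid_solution and effective_variables are illustrative, not APIs from SparseEA or MOEA/PSL):

```python
import numpy as np

rng = np.random.default_rng(seed=0)

def random_hybrid_solution(n_var, lower=0.0, upper=1.0, p_nonzero=0.1):
    """One solution under the hybrid representation described in [42]:
    a binary mask plus a real-valued decision vector (illustrative sketch)."""
    mask = (rng.random(n_var) < p_nonzero).astype(int)  # binary part: which variables are switched on
    dec = rng.uniform(lower, upper, size=n_var)          # real part: the variables' values
    return mask, dec

def effective_variables(mask, dec):
    """Decision vector that is actually evaluated: masked-out entries are zero,
    which keeps candidate solutions sparse."""
    return mask * dec

mask, dec = random_hybrid_solution(n_var=1000)
x = effective_variables(mask, dec)
print(np.count_nonzero(x), "of", x.size, "variables are nonzero")
```

Operating on the mask and the real vector separately is what lets such methods control sparsity directly; the subspace-learning step of MOEA/PSL is then applied on top of this encoding.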
“…Four state-of-the-art MOEAs are selected as baselines in the experiments, namely, LMEA [8], WOF-SMPSO [11], MaOEA-IT [41], and SparseEA [42]. LMEA is a divide-and-conquer MOEA tailored for LMOPs, which divides the decision variables into convergence-related variables and diversity-related variables and optimizes them separately.…”
Section: A. Comparative Algorithms
confidence: 99%
“…There are also some attempts to customize search operators to improve the scalability of certain MOEAs [96,97] . The work in [98] focuses on a specific type of large-scale problems named large-scale sparse MOPs, where most decision variables of the optimal solutions are zero.…”
Section: Enhanced Search-based Large-scale MOEAs
confidence: 99%