2021
DOI: 10.1109/tcyb.2020.2979930

Solving Large-Scale Multiobjective Optimization Problems With Sparse Optimal Solutions via Unsupervised Neural Networks

Abstract: Due to the curse of dimensionality of search space, it is extremely difficult for evolutionary algorithms to approximate the optimal solutions of large-scale multiobjective optimization problems (LMOPs) by using a limited budget of evaluations. If the Pareto optimal subspace is approximated during the evolutionary process, the search space can be reduced and the difficulty encountered by evolutionary algorithms can be highly alleviated. Following the above idea, this paper proposes an evolutionary algorithm to…
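The abstract's key idea is to learn a compact representation of the Pareto optimal subspace so the search can proceed in far fewer dimensions. As a loose, hypothetical illustration of that idea (not the paper's actual model), the sketch below trains a tiny linear autoencoder on a set of promising decision vectors and then samples offspring inside the learned subspace; all names, sizes, and hyperparameters here are assumptions.

```python
import numpy as np

# Hypothetical sketch only: learn a low-dimensional subspace from
# promising decision vectors, so new candidates can be generated in
# that subspace instead of the full high-dimensional search space.

rng = np.random.default_rng(0)

def train_autoencoder(X, latent_dim, epochs=200, lr=0.01):
    """X: (n_solutions, n_vars) array of promising decision vectors."""
    n, d = X.shape
    W_enc = rng.normal(scale=0.1, size=(d, latent_dim))
    W_dec = rng.normal(scale=0.1, size=(latent_dim, d))
    for _ in range(epochs):
        H = X @ W_enc                 # encode into the learned subspace
        err = H @ W_dec - X           # reconstruction error
        # gradient descent on the mean squared reconstruction loss
        W_enc -= lr * X.T @ (err @ W_dec.T) / n
        W_dec -= lr * H.T @ err / n
    return W_enc, W_dec

def sample_offspring(X, W_enc, W_dec, sigma=0.1):
    """Perturb solutions in the latent subspace, then decode back."""
    H = X @ W_enc + rng.normal(scale=sigma, size=(X.shape[0], W_enc.shape[1]))
    return H @ W_dec

# Example: 200 promising 1000-variable solutions compressed to 5 dimensions.
X = rng.random((200, 1000))
X -= X.mean(axis=0)                   # center the data for stable training
W_enc, W_dec = train_autoencoder(X, latent_dim=5)
offspring = sample_offspring(X, W_enc, W_dec)
```

The point of the sketch is only the search-space reduction: mutation happens in the 5-dimensional latent space, and the decoder maps the result back to all 1000 variables.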

Cited by 169 publications (44 citation statements)
References 52 publications
Citation types: 0 supporting, 44 mentioning, 0 contrasting
“…MAST-Net [24] is a memetic algorithm specially designed for NRPs, which has shown superiority on small-scale networks. SparseEA [63] and MOEA/PSL [60] have been validated to be effective on large-scale MOPs, and are implemented on the evolutionary multiobjective optimization platform PlatEMO [64]. For the comparative methods, we use identical parameters suggested in the original papers and codes.…”
Section: F. Application to Real-World Network (mentioning)
confidence: 99%
“…Then, the results can be sorted by fitness value. When the number of noninferior solutions is less than the archive size N, copy the first N − |P_{t+1}| individuals X_j with F(X_j) ≥ 1 from the resulting ordered list to P_{t+1} [11], where |·| denotes the number of elements of a set. (4) Archive truncation procedure: when the number of noninferior solutions exceeds the number of archive…”
Section: Principle of SPEA2 (mentioning)
confidence: 99%
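The excerpt above describes the archive-filling step of SPEA2's environmental selection: noninferior solutions (fitness F < 1) enter the archive first, and if they number fewer than N, the best dominated individuals (F ≥ 1) fill the remainder. A minimal Python sketch of that step, assuming fitness values are already computed and omitting SPEA2's distance-based truncation:

```python
# Minimal sketch of SPEA2 environmental selection as quoted above.
# Assumption: `population` is a list of (solution, F) pairs, where F is
# the precomputed SPEA2 fitness and F < 1 marks noninferior solutions.

def environmental_selection(population, archive_size):
    """Build the next archive P_{t+1} of size N = archive_size."""
    noninferior = [p for p in population if p[1] < 1]
    if len(noninferior) >= archive_size:
        # Too many noninferior solutions: SPEA2 applies its distance-based
        # archive truncation here (step (4) in the excerpt; not shown).
        return noninferior[:archive_size]
    # Fewer noninferior solutions than N: sort the dominated individuals
    # (F >= 1) by fitness and copy the first N - |P_{t+1}| of them.
    dominated = sorted((p for p in population if p[1] >= 1),
                       key=lambda p: p[1])
    return noninferior + dominated[:archive_size - len(noninferior)]

pop = [("a", 0.2), ("b", 1.5), ("c", 0.8), ("d", 2.3)]
print(environmental_selection(pop, 3))  # [('a', 0.2), ('c', 0.8), ('b', 1.5)]
```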
“…To solve the BOCC with high nonlinearity, metaheuristic-based multiobjective optimization algorithms show stronger global search ability than traditional mathematical optimization methods [10][11][12][13][14][15]. Widely used multiobjective algorithms include the improved strength Pareto evolutionary algorithm (SPEA2) [16] and the nondominated sorting genetic algorithm II (NSGA-II) [17].…”
Section: Introduction (mentioning)
confidence: 99%
“…Evolutionary algorithms are among the most popular metaheuristics and have been demonstrated to be effective in various large-scale optimization problems [38,39,46]. Hence, we develop RACO under the framework of the evolutionary algorithm named ant colony optimization (ACO), which has also exhibited effectiveness in solving DVRPs, as suggested in [29].…”
Section: Framework of the Proposed RACO (mentioning)
confidence: 99%