Proceedings of the Genetic and Evolutionary Computation Conference 2019
DOI: 10.1145/3321707.3321729

NSGA-Net: Neural Architecture Search using Multi-Objective Genetic Algorithm

Cited by 339 publications (6 citation statements)
References 14 publications
“…Hu et al. (2021) proposed a new performance-estimation metric named random-weight evaluation (RWE) to quantify the quality of CNNs. Lu et al. (2018) proposed NSGANet, an evolutionary algorithm that combines prior knowledge from handcrafted architectures with exploration via crossover and mutation. Some software packages provide search functions, such as pyGPGO and Optunity (Bergstra et al., 2011), Hyperopt-Sklearn (Bergstra et al., 2015), etc.…”
Section: (C) Other Improved NAS Methods
Mentioning confidence: 99%
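The excerpt above describes NSGA-Net's search as an evolutionary loop over encoded architectures driven by crossover and mutation. Below is a minimal, illustrative sketch of that kind of loop, not the authors' implementation: the bit-string genome length, the truncation selection, and the `evaluate` placeholder (which in a real system would train the CNN decoded from the genome and return its validation error) are all assumptions made here for brevity. NSGA-Net itself ranks candidates with the multi-objective NSGA-II procedure over accuracy and complexity rather than the single-objective sort used here.

```python
# Minimal sketch of an evolutionary architecture-search loop: bit-string genomes,
# one-point crossover, bit-flip mutation, truncation selection.
# `evaluate` is a hypothetical placeholder standing in for "train the decoded CNN
# and return its validation error" (lower is better).
import random

GENOME_LEN = 40     # assumed encoding length, for illustration only
POP_SIZE = 20
GENERATIONS = 10

def evaluate(genome):
    # Placeholder objective: the count of 1-bits stands in for validation error.
    return sum(genome)

def crossover(p1, p2):
    cut = random.randint(1, GENOME_LEN - 1)
    return p1[:cut] + p2[cut:]

def mutate(genome, rate=0.05):
    return [bit ^ 1 if random.random() < rate else bit for bit in genome]

population = [[random.randint(0, 1) for _ in range(GENOME_LEN)] for _ in range(POP_SIZE)]
for gen in range(GENERATIONS):
    scored = sorted(population, key=evaluate)   # single-objective stand-in for NSGA-II ranking
    parents = scored[:POP_SIZE // 2]            # keep the better half
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(POP_SIZE - len(parents))]
    population = parents + children

best = min(population, key=evaluate)
print("best fitness:", evaluate(best))
```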
“…Most neural networks can be divided into three types: (a) manual designs that require professional knowledge, such as VGG (Ferentinos, 2018) and ResNet (He et al., 2016); (b) semiautomatic design methods such as genetic neural networks (Xie and Yuille, 2017), hierarchical evolution, and others; and (c) fully automatic design, such as Google's neural architecture search (NAS) concept (Zoph and Le, 2016), which has received considerable attention (Baker et al., 2016; Lu et al., 2018). NAS can search for hyperparameters that perform better than manual designs.…”
Section: Introduction
Mentioning confidence: 99%
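As a point of reference for the "fully automatic design" category in the excerpt above, the sketch below shows the simplest possible search over a small hyperparameter space: random search with a placeholder evaluation. The search space and the `train_and_validate` function are hypothetical; actual NAS methods replace this loop with an RL controller or evolutionary operators.

```python
# Minimal random-search sketch over a hypothetical hyperparameter space.
# `train_and_validate` is a placeholder for training a candidate network and
# measuring its validation accuracy.
import random

search_space = {
    "num_layers": [4, 8, 12, 16],
    "width": [16, 32, 64, 128],
    "learning_rate": [1e-1, 1e-2, 1e-3],
}

def train_and_validate(config):
    # Placeholder: returns a random score instead of a real validation accuracy.
    return random.random()

best_config, best_acc = None, -1.0
for _ in range(20):                      # 20 random trials, chosen arbitrarily
    config = {name: random.choice(choices) for name, choices in search_space.items()}
    acc = train_and_validate(config)
    if acc > best_acc:
        best_config, best_acc = config, acc

print("best config:", best_config, "accuracy:", round(best_acc, 3))
```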
“…Our data is relevant for researchers developing tools for NN design. Such tools include neural architecture search [6], [7], [8] and methods for NN fitness prediction and training termination [9], [10], [11]. Learning curve data is essential to the development of methods for NN fitness modeling and prediction [3], [12].…”
Section: Specifications Table
Mentioning confidence: 99%
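The excerpt mentions methods for NN fitness prediction and training termination built on learning-curve data. Below is a minimal sketch of the idea, assuming a simple acc(t) ≈ a − b/t extrapolation model and an arbitrary accuracy threshold; neither the model nor the threshold is taken from the cited works.

```python
# Sketch of learning-curve extrapolation for early termination: fit a simple
# acc(t) = a - b/t model to a partial learning curve, extrapolate to the full
# training budget, and stop early if the predicted final accuracy is too low.
import numpy as np

partial_curve = np.array([0.42, 0.55, 0.61, 0.65, 0.68])  # accuracy after epochs 1..5 (example data)
epochs = np.arange(1, len(partial_curve) + 1)

# Least-squares fit of acc = a - b / epoch.
A = np.column_stack([np.ones_like(epochs, dtype=float), -1.0 / epochs])
(a, b), *_ = np.linalg.lstsq(A, partial_curve, rcond=None)

predicted_final = a - b / 100            # extrapolate to epoch 100 (assumed budget)
if predicted_final < 0.70:               # assumed threshold
    print("terminate early, predicted final accuracy:", round(predicted_final, 3))
else:
    print("continue training, predicted final accuracy:", round(predicted_final, 3))
```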
“…Table excerpt (columns, as inferred from the layout: method [ref], accuracy %, parameters in M, search cost in GPU-days, search type):

[23]                   96.25   15.7   300    EA
CNN-GA [38]            96.78   2.9    35     EA
CGP-CNN [35]           94.02   1.7    27     EA
AE-CNN [37]            95.7    2.0    27     EA
NSGANetV1-A2 [27]      97.35   0.9    27     EA
AE-CNN+E2EPP [36]      94.70   4.3    7      EA
NSGA-NET [26]          97.25   (remaining values truncated in the excerpt)

[14]                   77.90   1.7    -      manual
DenseNet-BC [15]       82.82   25.6   -      manual
ShuffleNet [43]        77.14   1.06   -      manual
PNAS [22]              80.47   3.2    225    SMBO
MetaQNN [1]            72.86   11.2   90     RL
ENAS [29]              80.57   4.6    0.45   RL
AmoebaNet-A [30]       81.07   3.2    3150   EA
Large-scale Evo. [31]  77.00   40.4   2750   EA
CNN-GA [38]            79.47   4.1    40     EA
AE-CNN [37]            79.15   5.4    36     EA
NSGANetV1-A2 [27]      82.58   0.9    27     EA
Genetic CNN [39]       70.95   -      17     EA
AE-CNN+E2EPP [36]      77.98   20.9   10     EA
NSGA-NET [26]          79.     (remaining values truncated in the excerpt)

…ment for the selection method. The nearest-neighbors size for the novelty search is 5.…”
Section: Architecture
Mentioning confidence: 99%
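The excerpt reports a nearest-neighbors size of 5 for the novelty search used alongside the selection method. Below is a minimal sketch of how such a novelty score is commonly computed: the mean distance from a candidate's behavior descriptor to its k nearest neighbors in an archive. The descriptor dimensionality and the random data are illustrative only, not taken from the cited work.

```python
# Sketch of a k-nearest-neighbor novelty score (k = 5, as reported in the excerpt).
# A candidate is "novel" if its behavior descriptor lies far from everything seen so far.
import numpy as np

def novelty_score(candidate, archive, k=5):
    """Mean Euclidean distance from `candidate` to its k nearest descriptors in `archive`."""
    distances = np.linalg.norm(archive - candidate, axis=1)
    nearest = np.sort(distances)[:k]
    return nearest.mean()

archive = np.random.rand(50, 8)     # 50 previously seen behavior descriptors (example data)
candidate = np.random.rand(8)       # descriptor of the candidate being scored
print("novelty:", round(novelty_score(candidate, archive, k=5), 3))
```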