Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence 2020
DOI: 10.24963/ijcai.2020/659

NSGA-Net: Neural Architecture Search using Multi-Objective Genetic Algorithm (Extended Abstract)

Abstract: Convolutional neural networks (CNNs) are the backbones of deep learning paradigms for numerous vision tasks. Early advancements in CNN architectures were primarily driven by human expertise and elaborate design. Recently, neural architecture search (NAS) was proposed with the aim of automating the network design process and generating task-dependent architectures. This paper introduces NSGA-Net -- an evolutionary search algorithm that explores a space of potential neural network architectures in three s…

Cited by 112 publications (195 citation statements)
References 29 publications
“…To solve the two-level optimization problem, a genetic algorithm was used for the upper-level optimization and Stochastic Gradient Descent with Momentum (SGD-M) was used for the lower-level optimization. The genetic algorithm has been widely used for NAS and has shown high robustness in many applications [42,43,44,45]. By using the genetic algorithm, the original problem of traversing every structure to select the best one was transformed into an optimization over a large search space, which can be explored efficiently.…”
Section: Methods
confidence: 99%
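The two-level scheme described in this excerpt can be sketched as a toy program: an outer genetic algorithm searches over architecture encodings while an inner routine stands in for SGD-M weight training and returns a fitness score. All names, the bit-string encoding, and the fitness function below are illustrative assumptions, not the cited method's actual search space.

```python
import random

def inner_train(arch):
    # Placeholder for lower-level SGD-with-momentum training; the
    # "validation accuracy" here is just the fraction of 1-bits.
    return sum(arch) / len(arch)

def outer_ga(pop_size=8, arch_len=6, generations=10, seed=0):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(arch_len)] for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=inner_train, reverse=True)
        parents = scored[: pop_size // 2]      # truncation selection
        children = []
        for _ in range(pop_size - len(parents)):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, arch_len)   # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.2:             # bit-flip mutation
                i = rng.randrange(arch_len)
                child[i] ^= 1
            children.append(child)
        pop = parents + children
    return max(pop, key=inner_train)

best = outer_ga()
```

Because the elites are carried over each generation, the best fitness in the population never decreases, which is the robustness property the excerpt alludes to.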
“…where w . The genetic algorithm has been widely used for NAS and has shown high robustness in many applications [42]–[45]. By using the genetic algorithm, the original problem of traversing every structure to select the best one was transformed into an optimization over a large search space, which can be explored efficiently.…”
Section: Glance Network Based on NAS for Speed Improvement
confidence: 99%
“…Rather than generating entire CNNs, the micro search space [46] has also been successfully employed by many recent EA-based NAS algorithms [83]–[87]. Real et al. [85] propose an extension of the large-scale evolution method [73], called AmoebaNet, which for the first time achieved better results on ImageNet than hand-designed architectures.…”
Section: NAS Based on EAs
confidence: 99%
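The aging ("regularized") evolution underlying AmoebaNet can be sketched roughly as follows: the population is a queue, each step samples a tournament, mutates the best sampled model, and retires the oldest member rather than the worst. The encoding, mutation, and fitness below are toy stand-ins for illustration only.

```python
import collections
import random

def fitness(arch):
    return sum(arch)            # toy proxy for validation accuracy

def mutate(arch, rng):
    child = list(arch)
    i = rng.randrange(len(child))
    child[i] ^= 1               # flip one "operation choice"
    return child

def aging_evolution(cycles=50, pop_size=10, sample_size=3, arch_len=8, seed=1):
    rng = random.Random(seed)
    population = collections.deque(
        [rng.randint(0, 1) for _ in range(arch_len)] for _ in range(pop_size)
    )
    history = list(population)
    for _ in range(cycles):
        tournament = rng.sample(list(population), sample_size)
        parent = max(tournament, key=fitness)
        child = mutate(parent, rng)
        population.append(child)
        population.popleft()    # aging: discard the oldest, not the worst
        history.append(child)
    return max(history, key=fitness)

best = aging_evolution()
```

Retiring by age rather than by fitness regularizes the search: no model survives indefinitely, so architectures must re-prove themselves through their offspring.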
“…One of the earliest evolutionary multi-objective methods for designing CNNs is NEMO [96], which simultaneously optimizes the classification performance and inference time of a network based on NSGA-II [97]. Inspired by NEMO, Lu et al. [84] consider classification error and computational complexity as the two objectives. In addition, they empirically test multiple computational-complexity metrics as proxies for the inference time of a network, including the number of active layers, the number of active connections between layers, and the number of floating-point operations (FLOPs).…”
Section: NAS Based on EAs
confidence: 99%
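The two-objective setup this excerpt describes rests on Pareto dominance, the core comparison in NSGA-II. A minimal sketch, with each candidate carrying a made-up (classification_error, FLOPs) pair and both objectives minimized:

```python
def dominates(a, b):
    # a dominates b if it is no worse in every objective and strictly
    # better in at least one (both objectives are minimized).
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(candidates):
    # Keep every candidate that no other candidate dominates.
    return [c for c in candidates
            if not any(dominates(o, c) for o in candidates if o is not c)]

# Hypothetical (error, FLOPs) pairs for demonstration only.
models = [
    (0.08, 600e6),   # low error, expensive
    (0.12, 200e6),   # moderate trade-off
    (0.15, 150e6),   # cheap, higher error
    (0.14, 400e6),   # dominated by (0.12, 200e6)
]
front = pareto_front(models)
```

The fourth model drops out because another model is better on both objectives; the remaining three form the trade-off front from which NSGA-II-style methods pick architectures.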