2020
DOI: 10.48550/arxiv.2006.02903
Preprint

A Comprehensive Survey of Neural Architecture Search: Challenges and Solutions

Abstract: Deep learning has made major breakthroughs in many fields, owing to its powerful capacity for automatic representation learning. It has been shown that the design of the network architecture is crucial to the feature representation of data and to final performance. To obtain good feature representations, researchers have designed a variety of complex network architectures. However, architecture design relies heavily on the researchers' prior knowledge…

Citations: cited by 38 publications (42 citation statements)
References: 128 publications (280 reference statements)
“…ture design is that the search space for CNN architectures is very large [18]. Two fields have derived from the problem: i) neural architecture search (NAS), which develops mechanisms for searching for the best combination of layers [25], and ii) channel number search (CNS), which looks for the best distribution of filters given an initial architecture [2,22].…”
Section: Related Work (mentioning)
confidence: 99%
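The excerpt above contrasts two search problems. As a rough illustration only (the operation and channel lists below are invented for the example, not taken from any of the cited works), the sketch shows how a NAS candidate samples the operation at every position, while a CNS candidate keeps the operations fixed and samples only per-layer channel counts; even this toy space grows exponentially with depth, which is the source of the difficulty noted in the quote.

```python
# Toy sketch contrasting NAS and CNS search spaces (hypothetical choices).
import random

LAYER_CHOICES = ["conv3x3", "conv5x5", "depthwise3x3", "skip"]  # hypothetical ops
CHANNEL_CHOICES = [16, 32, 64, 128]                             # hypothetical widths

def sample_nas_architecture(depth=4):
    """NAS: pick an operation for every position in the network."""
    return [random.choice(LAYER_CHOICES) for _ in range(depth)]

def sample_cns_widths(base_architecture):
    """CNS: keep the operations fixed, search only the channel counts."""
    return [(op, random.choice(CHANNEL_CHOICES)) for op in base_architecture]

if __name__ == "__main__":
    arch = sample_nas_architecture()
    print("NAS sample:", arch)
    print("CNS sample:", sample_cns_widths(arch))
    # Even this toy NAS space has |ops|^depth = 4^4 = 256 combinations;
    # realistic search spaces are vastly larger.
    print("Toy NAS space size:", len(LAYER_CHOICES) ** 4)
```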
“…Inspired by the recent NAS works (Elsken et al., 2019; Ren et al., 2020; Guo et al., 2020), we adopt an evolutionary algorithm (EA) to search the model. Previously, Real et al. (2017) utilized an evolutionary method in NAS, but they trained each candidate model from scratch, which is costly and inefficient.…”
Section: Evolutionary Search (mentioning)
confidence: 99%
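The evolutionary search mentioned in this excerpt can be summarized as a population-based loop: sample candidate architectures, select a strong parent, mutate it, and evaluate the child. The sketch below is a minimal illustration with a placeholder fitness function (a real system would train, or partially train, each candidate); the operation set, tournament selection, and aging-style removal are assumptions for the example, not the exact procedure of the cited papers.

```python
# Minimal evolutionary-NAS loop sketch (toy encoding, placeholder fitness).
import random

OPS = ["conv3x3", "conv5x5", "maxpool", "skip"]  # hypothetical operation set

def random_arch(depth=6):
    return [random.choice(OPS) for _ in range(depth)]

def mutate(arch):
    """Change one randomly chosen position to a different (or same) operation."""
    child = list(arch)
    child[random.randrange(len(child))] = random.choice(OPS)
    return child

def evaluate(arch):
    # Placeholder: a real system would train the candidate and return
    # its validation accuracy. Here we return a random score.
    return random.random()

def evolutionary_search(population_size=20, generations=10, tournament=5):
    population = [(a, evaluate(a)) for a in (random_arch() for _ in range(population_size))]
    for _ in range(generations):
        # Tournament selection: mutate the best of a random subset.
        parent = max(random.sample(population, tournament), key=lambda p: p[1])[0]
        child = mutate(parent)
        population.append((child, evaluate(child)))
        population.pop(0)  # drop the oldest candidate (aging-style update)
    return max(population, key=lambda p: p[1])

if __name__ == "__main__":
    best_arch, best_score = evolutionary_search()
    print("best architecture:", best_arch, "score:", round(best_score, 3))
```

The key cost driver the excerpt points to is the evaluate step: training every child from scratch is what the cited follow-up works try to avoid.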
“…Commonly, this is done with deep neural networks (DNNs) 1-13, parametric models, or kernel models, such as Gaussian Processes (GPs) or kernel ridge regression (KRR) 1,14-25. While both methodologies have proven to be flexible enough, GPs require less "tuning" compared to NNs, where the search for an optimal architecture can be computationally demanding 26,27. One of the advantages of GPs is their ability to quantify the uncertainty in their predictions, which is commonly exploited in applications involving noisy data or the optimization of black-box functions.…”
Section: Introduction (mentioning)
confidence: 99%
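To make the uncertainty point concrete, below is a minimal NumPy sketch of exact GP regression that returns both a posterior mean and a per-point standard deviation, the quantity typically used with noisy data and black-box optimization. The RBF kernel, fixed hyperparameters, and toy 1-D data are assumptions for the example; nothing here comes from the cited paper.

```python
# Exact GP regression sketch: posterior mean and per-point uncertainty.
import numpy as np

def rbf_kernel(A, B, length_scale=0.5):
    """Squared-exponential kernel k(a, b) = exp(-||a - b||^2 / (2 l^2))."""
    sq_dists = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq_dists / (2.0 * length_scale ** 2))

def gp_posterior(X_train, y_train, X_test, noise=1e-6):
    """Return posterior mean and standard deviation at X_test."""
    K = rbf_kernel(X_train, X_train) + noise * np.eye(len(X_train))
    K_s = rbf_kernel(X_train, X_test)
    K_ss = rbf_kernel(X_test, X_test)
    alpha = np.linalg.solve(K, y_train)
    mean = K_s.T @ alpha
    cov = K_ss - K_s.T @ np.linalg.solve(K, K_s)
    return mean, np.sqrt(np.clip(np.diag(cov), 0.0, None))

if __name__ == "__main__":
    X_train = np.array([[-1.0], [0.0], [1.5]])
    y_train = np.sin(X_train).ravel()
    X_test = np.linspace(-2.0, 2.0, 5).reshape(-1, 1)
    mean, std = gp_posterior(X_train, y_train, X_test)
    for x, m, s in zip(X_test.ravel(), mean, std):
        # Uncertainty grows for test points far from the training data.
        print(f"x={x:+.2f}  prediction={m:+.3f}  uncertainty(std)={s:.3f}")
```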