2020
DOI: 10.48550/arxiv.2008.10937
Preprint

A Survey on Evolutionary Neural Architecture Search

Yuqiao Liu, Yanan Sun, Bing Xue, et al.

Abstract: Deep Neural Networks (DNNs) have achieved great success in many applications, such as image classification, natural language processing, and speech recognition. The architecture of a DNN has been shown to play a crucial role in its performance. However, designing architectures for different tasks is a difficult, time-consuming process of trial and error. Neural Architecture Search (NAS), which has received great attention in recent years, can design architectures automatically. Among different kinds of NAS …


Cited by 10 publications (10 citation statements)
References 153 publications
“…A similar idea is also adopted for accelerating ENAS in [46] and [47]. Although early stopping can considerably reduce the computational cost of ENAS approaches, it easily leads to inaccurate estimation of individual quality, especially for complicated architectures, as stated in [48]. This makes it hard for early-stopping-based ENAS approaches to achieve competitive performance in NAS.…”
Section: B. Related Work
confidence: 99%
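The estimation problem this excerpt describes can be illustrated with a toy sketch (all names, curves, and numbers below are hypothetical, not from any surveyed method): each candidate architecture is trained for only a few epochs, and the truncated learning curve serves as a fitness proxy, which can mis-rank a slowly converging but ultimately stronger architecture.

```python
import math

# Toy sketch of early-stopping-based fitness estimation: train each
# candidate for only `budget_epochs` epochs and use the partial accuracy
# as a proxy for final quality.

def learning_curve(arch, epochs):
    """Toy accuracy curve: final_acc * (1 - exp(-rate * epoch))."""
    return [arch["final_acc"] * (1 - math.exp(-arch["rate"] * (e + 1)))
            for e in range(epochs)]

def early_stop_fitness(arch, budget_epochs=3):
    """Fitness proxy taken from a truncated training run."""
    return learning_curve(arch, budget_epochs)[-1]

# A complicated architecture that learns slowly but ends up stronger,
# versus a simple one that converges fast to a weaker optimum.
slow_but_strong = {"final_acc": 0.95, "rate": 0.2}
fast_but_weak = {"final_acc": 0.80, "rate": 1.5}

# The proxy ranks the weaker architecture first, illustrating the
# inaccurate estimation on complicated architectures noted in [48].
proxy_prefers_weak = (early_stop_fitness(fast_but_weak)
                      > early_stop_fitness(slow_but_strong))
```

Under this toy model the three-epoch proxy favors the fast-converging candidate even though the slow one has the higher final accuracy, which is exactly the failure mode the excerpt attributes to early stopping.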
“…In [17], Sun et al. utilized a hashing method to record both the architecture information and the fitness of each individual, so that the fitness value of an individual can be retrieved directly if that individual has already been recorded. Despite avoiding redundant individual evaluations in ENAS, these strategies still incur substantial computational cost to achieve high-quality architectures [48].…”
Section: B. Related Work
confidence: 99%
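The caching idea attributed to Sun et al. [17] can be sketched as follows; the class, the encoding scheme, and the stand-in evaluator are illustrative assumptions, not the paper's actual implementation.

```python
import hashlib

class FitnessCache:
    """Sketch of a hash-based fitness record: each evaluated architecture
    is keyed by a canonical encoding, so a revisited individual skips the
    expensive training run."""

    def __init__(self):
        self._table = {}
        self.evaluations = 0  # counts actual (expensive) evaluations

    def _key(self, arch_encoding):
        # Canonical string of the encoding -> stable hash.
        return hashlib.sha256(repr(arch_encoding).encode()).hexdigest()

    def fitness(self, arch_encoding, evaluate):
        k = self._key(arch_encoding)
        if k not in self._table:
            self.evaluations += 1
            self._table[k] = evaluate(arch_encoding)  # expensive in reality
        return self._table[k]

# Usage with a toy evaluator standing in for a real training run.
cache = FitnessCache()
toy_eval = lambda arch: sum(len(layer) for layer in arch) / 100
arch = (("conv", 32), ("conv", 64), ("dense", 10))
first = cache.fitness(arch, toy_eval)
second = cache.fitness(arch, toy_eval)  # cache hit: no second evaluation
```

The second lookup returns the recorded fitness without re-invoking the evaluator, which is the redundancy the hashing record is meant to eliminate.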
“…To have a real impact, however, the resources involved are colossal: as seen in many papers, they are on the scale of hundreds to thousands of GPUs. Recent studies [9,17,18] even point out that EA and RL may not outperform random search. These drawbacks have led to the development of another paradigm: the idea that a better problem formulation could partially alleviate the hardware constraint and provide a greater chance of success in finding a good architecture.…”
Section: Mutating (Random Information Perturbation)
confidence: 99%
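The random-search baseline that [9,17,18] report can rival EA and RL is simple to state; the search space and objective below are hypothetical stand-ins for a real NAS benchmark.

```python
import random

# Minimal random-search NAS baseline: sample architectures uniformly from
# a discrete search space and keep the best under a fixed evaluation budget.
random.seed(0)

SPACE = {"depth": [2, 4, 8],
         "width": [16, 32, 64],
         "op": ["conv3x3", "conv5x5", "sep3x3"]}

def sample():
    """Draw one architecture uniformly at random from the space."""
    return {k: random.choice(v) for k, v in SPACE.items()}

def evaluate(arch):
    """Toy objective standing in for a full training run."""
    return arch["depth"] * 0.01 + arch["width"] * 0.001

def random_search(budget=20):
    best, best_fit = None, float("-inf")
    for _ in range(budget):
        arch = sample()
        fit = evaluate(arch)
        if fit > best_fit:
            best, best_fit = arch, fit
    return best, best_fit

best, best_fit = random_search()
```

Because it needs no population bookkeeping or controller training, this baseline is the usual control against which EA- and RL-based search costs are judged.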
“…Several recent design spaces for NAS have composed generic neural-network functional components while preserving inductive priors from successful state-of-the-art computer vision or natural language processing architectures (Liu et al., 2020). On the other hand, as applied to dynamical system identification, NAS with neuroevolutionary methods has typically used lower-level components, such as neurons and basic function compositions, in black-box systems modeling without the benefit of physics-based priors embedded in the design space (Al-Mahasneh et al., 2017; Subudhi and Jena, 2011a,b; Yang et al., 2017; Ferariu and Burlacu, 2011; Ayala et al., 2020; Hatanaka et al., 2006; Gaier and Ha, 2019).…”
Section: Introduction
confidence: 99%