2020
DOI: 10.1007/978-3-030-58517-4_32

Single Path One-Shot Neural Architecture Search with Uniform Sampling

Cited by 440 publications (513 citation statements)
References 16 publications
“…It explores a layer-wise space where each layer of the CNN can choose a different block, and learning is done by training the stochastic supernet. SPOS [163] similarly uses the supernet concept to perform NAS, applying constraints such as latency and the number of FLOPs. The HURRICANE framework [164] performs a two-stage search algorithm for automatic hardware-aware NAS.…”
Section: E. Hardware-Aware Neural Architecture Search (mentioning)
confidence: 99%
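To make the SPOS idea concrete, here is a minimal PyTorch sketch of supernet training with uniform single-path sampling: each batch trains exactly one randomly chosen block per layer, so every single-path architecture in the space is trained with equal probability. The toy Linear/ReLU choice blocks, layer count, and random training data are illustrative assumptions, not the paper's actual search space (SPOS uses ShuffleNet-style choice blocks on ImageNet).

```python
import random
import torch
import torch.nn as nn

class MixedLayer(nn.Module):
    """One supernet layer holding several candidate choice blocks."""
    def __init__(self, choices):
        super().__init__()
        self.choices = nn.ModuleList(choices)

    def forward(self, x, idx):
        # Only the sampled block runs; the others receive no gradient.
        return self.choices[idx](x)

class SuperNet(nn.Module):
    def __init__(self, num_layers=4, width=16, num_choices=3):
        super().__init__()
        self.layers = nn.ModuleList([
            MixedLayer([nn.Sequential(nn.Linear(width, width), nn.ReLU())
                        for _ in range(num_choices)])
            for _ in range(num_layers)
        ])
        self.head = nn.Linear(width, 10)

    def forward(self, x, path):
        for layer, idx in zip(self.layers, path):
            x = layer(x, idx)
        return self.head(x)

def sample_path(num_layers, num_choices):
    # Uniform sampling: every single-path architecture is equally likely.
    return [random.randrange(num_choices) for _ in range(num_layers)]

net = SuperNet()
opt = torch.optim.SGD(net.parameters(), lr=0.05)
loss_fn = nn.CrossEntropyLoss()

for step in range(100):              # toy training loop on random data
    x = torch.randn(32, 16)
    y = torch.randint(0, 10, (32,))
    path = sample_path(4, 3)         # one sampled path per batch
    loss = loss_fn(net(x, path), y)  # only that path's weights update
    opt.zero_grad()
    loss.backward()
    opt.step()
```

Decoupling supernet training from architecture selection this way is what lets the search stage later evaluate candidates cheaply by inheriting the shared weights.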
“…Because inefficient search strategies require a large number of GPUs, many NAS methods cannot be run with limited computational resources. To address this, much recent work is dedicated to developing methods that reduce the computational cost of performance evaluation, e.g., surrogate-assisted evolutionary algorithms (SAEAs) [33,42,43], information reuse [44,45], and one-shot neural architecture search [46–50], among many others.…”
Section: Neural Architecture Search (mentioning)
confidence: 99%
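A common second stage after one-shot supernet training is a search under a resource budget, as in SPOS. Below is a minimal, self-contained sketch of an evolutionary search that discards candidates exceeding a FLOPs budget. The per-block costs, budget, population settings, and the stand-in fitness function are all illustrative assumptions; in SPOS, fitness would be the validation accuracy of a path with weights inherited from the trained supernet.

```python
import random

NUM_LAYERS, NUM_CHOICES = 4, 3
# Hypothetical per-choice FLOP costs; a real system would profile each block.
FLOPS = [[1.0, 1.5, 2.2] for _ in range(NUM_LAYERS)]
BUDGET = 6.0

def flops(path):
    return sum(FLOPS[i][c] for i, c in enumerate(path))

def fitness(path):
    # Stand-in score; in SPOS this would be one-shot validation accuracy
    # obtained by running the path with inherited supernet weights.
    return -abs(flops(path) - 5.0)

def random_feasible_path():
    while True:
        path = [random.randrange(NUM_CHOICES) for _ in range(NUM_LAYERS)]
        if flops(path) <= BUDGET:        # enforce the hardware constraint
            return path

def evolutionary_search(fitness, pop=20, gens=10, mutate_p=0.2):
    population = [random_feasible_path() for _ in range(pop)]
    for _ in range(gens):
        parents = sorted(population, key=fitness, reverse=True)[:pop // 2]
        children = []
        while len(children) < pop - len(parents):
            child = list(random.choice(parents))
            for i in range(NUM_LAYERS):
                if random.random() < mutate_p:
                    child[i] = random.randrange(NUM_CHOICES)
            if flops(child) <= BUDGET:   # discard infeasible mutants
                children.append(child)
        population = parents + children
    return max(population, key=fitness)

best = evolutionary_search(fitness)
print(best, flops(best))
```

Because evaluating a path is cheap here, the evolutionary loop can afford many generations, which is exactly the cost reduction the excerpt above describes.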
“…Therefore, various techniques have been suggested to accelerate fitness evaluation, such as information reuse [44,47] and SAEAs [88]. SAEAs have become popular for solving computationally expensive optimization problems; they use cheap classification and regression models, e.g., radial basis function networks (RBFNs) [89,90] and Gaussian process (GP) models [91,92], to replace expensive fitness evaluations [93].…”
Section: NAS Based on EAs (mentioning)
confidence: 99%
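As a sketch of the surrogate idea with a GP model: fit a regressor on the few candidates that have already been evaluated expensively, then use its predictions (plus uncertainty) to decide which new candidates deserve a real evaluation. The 4-dimensional encoding, RBF kernel length scale, and the stand-in expensive_fitness are assumptions for illustration; this is not the specific SAEA of [88].

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def expensive_fitness(x):
    # Stand-in for an expensive evaluation (e.g., training a network).
    return -np.sum((x - 0.3) ** 2)

rng = np.random.default_rng(0)
X = rng.random((10, 4))              # a few already-evaluated encodings
y = np.array([expensive_fitness(x) for x in X])

gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), normalize_y=True)
gp.fit(X, y)

# Screen many candidates cheaply with the surrogate...
candidates = rng.random((200, 4))
mean, std = gp.predict(candidates, return_std=True)
promising = candidates[np.argsort(mean + std)[-5:]]  # optimistic picks

# ...and spend real evaluations only on the promising ones.
true_scores = [expensive_fitness(x) for x in promising]
best = promising[int(np.argmax(true_scores))]
print(best)
```

Ranking candidates by predicted mean plus standard deviation is one simple acquisition choice; RBFN surrogates, as cited in the excerpt, slot into the same loop in place of the GP.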