Stronger NAS with Weaker Predictors
Preprint, 2021
DOI: 10.48550/arxiv.2102.10490

Cited by 2 publications (2 citation statements)
References 0 publications

“…Li et al. propose block-wise progressive NAS [19,20], which builds the architecture from sequential blocks and searches it block by block. SGAS [21], GreedyNAS [36], and [17,41,34] progressively shrink the search space by dropping unpromising candidates. These progressive NAS methods require a much smaller exploration size, but their greedy nature hampers their search accuracy.…”
Section: Explore Wisely. Early Methods Adopt Reinforcement…
confidence: 99%
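
The statement above only gestures at how greedy progressive shrinking works, so here is a minimal Python sketch of the idea under stated assumptions: `estimate_accuracy`, `progressive_shrink`, the toy search space, and all parameters are hypothetical illustrations, not code from SGAS, GreedyNAS, or any of the cited papers.

```python
import random

def estimate_accuracy(arch):
    # Hypothetical stand-in for a cheap accuracy proxy
    # (e.g. a few epochs of training or a supernet lookup).
    return random.random()

def progressive_shrink(search_space, rounds=3, keep_ratio=0.5):
    """Each round scores the remaining candidates and drops the
    least promising ones. Exploration size shrinks quickly, but a
    candidate cut early can never be recovered -- the greedy
    weakness the citing paper points out."""
    candidates = list(search_space)
    for _ in range(rounds):
        scored = sorted(candidates, key=estimate_accuracy, reverse=True)
        keep = max(1, int(len(scored) * keep_ratio))
        candidates = scored[:keep]  # drop unpromising candidates
    return candidates[0]

# Toy search space: architectures encoded as (op, width) tuples.
space = [(op, w) for op in ("conv3", "conv5", "skip") for w in (16, 32, 64)]
print("selected:", progressive_shrink(space))
```

The greedy weakness the statement mentions is visible in the loop: once a candidate is cut in an early round, no later round can bring it back, even if its early proxy score was merely noisy.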
“…In general, reduced training can be found in many NAS works (Pham et al. 2018; Zhou et al. 2020), and different proxies have been proposed, e.g. searching for a model on a smaller dataset and then transferring the architecture to the larger target dataset (Real et al. 2019; Mehrotra et al. 2021), or incorporating a predictor into the search process (Wei et al. 2020; Dudziak et al. 2020; Wu et al. 2021; Wen et al. 2019).…”
Section: Related Work
confidence: 99%
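
Since this statement cites the paper for "incorporating a predictor into the search process", a minimal sketch of that loop may help. Everything here is a hypothetical illustration under stated assumptions: `train_and_evaluate`, `fit_predictor`, and the toy nearest-to-best surrogate are not the method of this paper or any cited work, which fit real regression models (e.g. an MLP or GBDT) on (architecture, accuracy) pairs.

```python
import random

def train_and_evaluate(arch):
    # Hypothetical stand-in for an expensive full training run.
    width, depth = arch
    return width / 100 + depth / 20 + random.gauss(0, 0.02)

def fit_predictor(history):
    """Toy surrogate: rank unseen architectures by closeness to the
    best architecture evaluated so far. A real predictor would be a
    learned regressor refit on all (arch, accuracy) pairs."""
    best_arch, _ = max(history, key=lambda p: p[1])
    return lambda arch: -sum(abs(a - b) for a, b in zip(arch, best_arch))

def predictor_guided_search(space, budget=10, warmup=3):
    # Warm up with a few random evaluations, then let the predictor
    # choose which architecture to spend each remaining evaluation on.
    history = [(a, train_and_evaluate(a)) for a in random.sample(space, warmup)]
    for _ in range(budget - warmup):
        predictor = fit_predictor(history)   # refit on all evidence so far
        seen = {a for a, _ in history}
        pick = max((a for a in space if a not in seen), key=predictor)
        history.append((pick, train_and_evaluate(pick)))
    return max(history, key=lambda p: p[1])

# Toy search space: (width, depth) pairs; only `budget` full evaluations run.
space = [(w, d) for w in range(8, 65, 8) for d in range(2, 9)]
arch, acc = predictor_guided_search(space)
print(f"best found: {arch} (proxy accuracy {acc:.3f})")
```

The point of the pattern is that the expensive `train_and_evaluate` call runs only `budget` times, while the cheap predictor screens the whole space on every iteration.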