2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) 2020
DOI: 10.1109/cvpr42600.2020.00188
A Semi-Supervised Assessor of Neural Architectures

Cited by 52 publications (63 citation statements). References 14 publications.
“…Neural Architecture Search (NAS). NAS is an automated architecture search process that aims to overcome the suboptimality of manual architecture design when exploring an extensive search space. NAS methods can be roughly categorized into reinforcement learning-based methods (Zoph & Le, 2017; Pham et al., 2018), evolutionary algorithm-based methods (Lu et al., 2020), and gradient-based methods (Cai et al., 2019; Luo et al., 2018; Dong & Yang, 2019b; Chen et al., 2021; Xu et al., 2020; Fang et al., 2020). Among existing approaches, perhaps the most relevant to ours is NAO (Luo et al., 2018), which maps architecture DAGs onto a continuous latent embedding space.…”
Section: Related Work
confidence: 99%
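The idea of mapping discrete architectures into a continuous embedding space, as attributed to NAO above, can be illustrated with a minimal sketch. The op vocabulary, embedding size, mean-pooling encoder, and linear surrogate below are hypothetical stand-ins for illustration only; the actual NAO model uses a learned LSTM encoder-decoder, not this simplification:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical vocabulary of candidate operations.
OPS = ["conv3x3", "conv5x5", "maxpool", "skip"]
EMB_DIM = 8

# Per-op embedding table (random here; learned in practice).
op_table = rng.normal(size=(len(OPS), EMB_DIM))

def encode(arch):
    """Map a discrete architecture (a list of op names) to one
    fixed-size continuous vector by mean-pooling op embeddings."""
    idx = [OPS.index(op) for op in arch]
    return op_table[idx].mean(axis=0)

# Linear surrogate predicting performance from the embedding;
# searching in this continuous space (e.g. by gradient ascent on
# the surrogate) is the NAO-style trick the quote refers to.
w = rng.normal(size=EMB_DIM)

def predict(arch):
    return float(encode(arch) @ w)

z = encode(["conv3x3", "skip", "maxpool"])
```

The point of the continuous encoding is that architectures of different discrete structure become comparable points in one vector space, where a smooth predictor can guide the search.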
“…GDAS (Dong & Yang, 2019b) tackles this by optimizing sampled sub-graphs of the DAG. PC-DARTS (Xu et al., 2020) reduces GPU overhead and search time by partially selecting channel connections. However, due to the task-specific nature of those methods, they must be retrained from scratch for each new unseen task, and each search takes a few GPU hours.…”
Section: Related Work
confidence: 99%
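The partial channel connection attributed to PC-DARTS above can be sketched as follows: only a 1/k fraction of the channels is routed through the softmax-weighted mixture of candidate operations, while the remaining channels bypass it unchanged. This is an illustrative numpy sketch, not the actual PC-DARTS implementation (which also shuffles channels and adds edge normalization):

```python
import numpy as np

rng = np.random.default_rng(1)

def partial_mixed_op(x, ops, alphas, k=4):
    """PC-DARTS-style partial connection (sketch): apply the
    architecture-weighted mix of candidate ops to the first 1/k
    of the channels; concatenate the untouched remainder back."""
    c = x.shape[0]
    c_sel = c // k
    x_sel, x_bypass = x[:c_sel], x[c_sel:]
    weights = np.exp(alphas) / np.exp(alphas).sum()  # softmax over ops
    mixed = sum(w * op(x_sel) for w, op in zip(weights, ops))
    return np.concatenate([mixed, x_bypass], axis=0)

# Toy element-wise "operations" standing in for conv/pool candidates.
ops = [lambda t: t, lambda t: 2 * t, lambda t: t ** 2]
alphas = np.array([0.1, 0.5, -0.2])  # architecture parameters
x = rng.normal(size=(8, 4))          # (channels, features)
y = partial_mixed_op(x, ops, alphas, k=4)
```

Because only c/k channels pass through every candidate op, memory and compute per search step drop roughly by a factor of k, which is the GPU-overhead reduction the quote describes.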