2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
DOI: 10.1109/cvpr46437.2021.01082

Prioritized Architecture Sampling with Monto-Carlo Tree Search

Cited by 34 publications (8 citation statements)
References 15 publications
“…Neural architecture search (NAS) methods [18,22,23,32,36] achieve remarkable performance improvements by automatic architecture designing. However, they are computationally expensive in training intermediate architectures.…”
Section: Neural Architecture Search
confidence: 99%
“…Benchmarks that provide a macro search space enable the evaluation of NAS methods beyond traditional operation sampling, extending to decisions about node connections, operation parameters, or the architecture skeleton. One such benchmark is NAS-Bench-Macro [50], which proposes a search space with 6,561 architectures. The goal of this benchmark is to search for 8 layers with a pool of 3 possible blocks.…”
Section: Neural Architecture Search Benchmarks Overview
confidence: 99%
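To make the search-space size quoted above concrete: 8 searchable layers with 3 candidate blocks per layer gives 3^8 = 6,561 architectures. The sketch below enumerates such a macro space; it is an illustration only, not the NAS-Bench-Macro code, and the block labels are placeholders.

```python
# Minimal sketch of a NAS-Bench-Macro-style macro search space:
# 8 layers, 3 candidate blocks per layer -> 3**8 = 6,561 architectures.
# Block labels below are illustrative placeholders, not the benchmark's API.
from itertools import product

NUM_LAYERS = 8
CANDIDATE_BLOCKS = ["Identity", "MB3_K3", "MB6_K5"]  # assumed per-layer choices

# Each architecture is a tuple of per-layer block indices.
all_architectures = list(product(range(len(CANDIDATE_BLOCKS)), repeat=NUM_LAYERS))
assert len(all_architectures) == 3 ** 8 == 6561

def encode(arch):
    """Encode an architecture as a compact string, e.g. '01220112'."""
    return "".join(str(choice) for choice in arch)

print(len(all_architectures), "architectures; first:", encode(all_architectures[0]))
```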
“…Guo et al [7] converted the search strategy into a single path one-shot (SPOS) framework with a uniform sampler to construct a simplified supernet. GreedyNAS [35] eased the burden of a supernet by encouraging it to focus more on evaluation of those potentially-good ones with a greedy search strategy while MCT-NAS [20] leveraged a Monte-Carlo tree to record the history of path quality. K-shot NAS [23] enhanced the evaluation ability by resorting to multiple supernets.…”
Section: Related Work
confidence: 99%
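The statement above contrasts uniform single-path sampling (SPOS) with Monte-Carlo-tree-guided sampling that records path quality (MCT-NAS). The sketch below illustrates that contrast in the abstract: a uniform sampler versus a UCT-style tree over path prefixes. The reward function, hyper-parameters, and update rule are hypothetical stand-ins for supernet evaluation, not the papers' actual methods.

```python
# Sketch: uniform single-path sampling vs. a Monte-Carlo tree over path prefixes
# that accumulates path-quality statistics. Toy reward only; assumptions are
# marked in comments and do not reproduce SPOS / MCT-NAS implementations.
import math
import random
from collections import defaultdict

NUM_LAYERS, NUM_OPS = 8, 3

def sample_uniform_path():
    """SPOS-style sampler: each layer's operation is drawn uniformly."""
    return tuple(random.randrange(NUM_OPS) for _ in range(NUM_LAYERS))

# Monte-Carlo tree statistics keyed by path prefix (tuple of op indices).
visits = defaultdict(int)     # how often a prefix has been visited
value = defaultdict(float)    # running mean reward of paths through a prefix

def uct_select(prefix, c=1.4):
    """Pick the next operation for `prefix` by a UCT-style score."""
    total = sum(visits[prefix + (op,)] for op in range(NUM_OPS)) + 1
    best_score, best_op = -float("inf"), 0
    for op in range(NUM_OPS):
        child = prefix + (op,)
        if visits[child] == 0:
            return op  # expand unvisited children first
        score = value[child] + c * math.sqrt(math.log(total) / visits[child])
        if score > best_score:
            best_score, best_op = score, op
    return best_op

def sample_mct_path():
    """Grow a full path layer by layer using the recorded statistics."""
    path = ()
    while len(path) < NUM_LAYERS:
        path = path + (uct_select(path),)
    return path

def backup(path, reward):
    """Update the running mean reward of every prefix of the evaluated path."""
    for depth in range(1, NUM_LAYERS + 1):
        prefix = path[:depth]
        visits[prefix] += 1
        value[prefix] += (reward - value[prefix]) / visits[prefix]

def toy_reward(path):
    # Hypothetical stand-in for evaluating the sampled path with a supernet.
    return sum(path) / (NUM_LAYERS * (NUM_OPS - 1))

for _ in range(200):
    p = sample_mct_path()
    backup(p, toy_reward(p))

print("uniform sample:   ", sample_uniform_path())
print("MCT-guided sample:", sample_mct_path())
```

In this toy setup the tree-guided sampler concentrates on higher-reward paths over time, which is the qualitative behavior the quoted statement attributes to recording path-quality history; K-shot NAS's multiple supernets are outside the scope of this sketch.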