2020
DOI: 10.1007/978-3-030-58526-6_39
Neural Predictor for Neural Architecture Search

Cited by 107 publications (130 citation statements)
References 7 publications
“…Another way to speed up the submodel evaluation that has not yet been explored for Graph-NAS is to use performance prediction [68][69][70] instead of training all generated models to obtain the performance metrics. However, this method entails collecting hundreds of GNN performance measurements and graph data characteristics to build a neural predictor.…”
Section: Performance Evaluation
confidence: 99%
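
The predictor-based evaluation loop this statement describes can be sketched concretely. Below is a minimal illustration, assuming a hypothetical fixed-length architecture encoding and scikit-learn's MLPRegressor; it is not any cited paper's exact method, only the general idea: fit a regressor on a small set of fully trained architectures, then rank a large untrained pool cheaply.

```python
# Minimal sketch of predictor-based model evaluation (illustrative, not the
# cited papers' exact method): train a regressor on a small set of fully
# evaluated architectures, then score a large candidate pool without training.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Hypothetical data: each architecture is a fixed-length feature vector
# (e.g., one-hot operation choices) paired with its measured accuracy.
train_encodings = rng.random((200, 32))   # 200 fully trained architectures
train_accuracy = rng.random(200)          # their measured accuracies

predictor = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000)
predictor.fit(train_encodings, train_accuracy)

# Score 10,000 untrained candidates in milliseconds instead of GPU-days,
# then keep only the top-ranked ones for full training.
candidates = rng.random((10_000, 32))
scores = predictor.predict(candidates)
top_k = np.argsort(scores)[::-1][:50]
```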
“…Recent works explored performance prediction based on architectural properties, i.e., the network topology and the model size (Liu et al., 2018; Long et al., 2019; Wen et al., 2020; Ning et al., 2020). For instance, Hardware-Aware Transformer (HAT) encoded architectures into feature vectors and predicted the latency with a Multilayer Perceptron (MLP) for the target hardware.…”
Section: Related Work
confidence: 99%
“…Our method significantly accelerates NAS through pairwise ranking and search space pruning. Earlier acceleration approaches include performance prediction (Baker et al., 2018; Wen et al., 2020; Wei et al., 2020) and searching over a continuous space (Liu et al., 2019). Unfortunately, these approaches still suffer from the high cost of predicting the performance of each candidate architecture.…”
Section: Introduction
confidence: 99%
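
A minimal sketch of the pairwise-ranking idea this statement mentions, using PyTorch's margin ranking loss; the scorer architecture and data are illustrative assumptions. Training on relative comparisons sidesteps regressing absolute accuracies: the scorer only needs to order architectures correctly.

```python
# Illustrative sketch of pairwise ranking for NAS predictors: train a scorer
# so the better of two architectures receives the higher score. All shapes
# and the scorer architecture are assumptions for demonstration.
import torch
import torch.nn as nn

scorer = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 1))

# Hypothetical batch of architecture pairs where `better` outperformed `worse`
# when both were actually trained and evaluated.
better = torch.rand(16, 32)
worse = torch.rand(16, 32)

# Margin ranking loss pushes score(better) above score(worse) by a margin;
# target = 1 indicates the first input should be ranked higher.
s_better = scorer(better).squeeze(-1)
s_worse = scorer(worse).squeeze(-1)
target = torch.ones_like(s_better)
loss = nn.functional.margin_ranking_loss(s_better, s_worse, target, margin=0.1)
loss.backward()
```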
“…Weight sharing algorithms have become popular due to their computational efficiency [2,38,10,79,77,51,78]. Recent advances in performance prediction [65,46,59,74,39,69,55,67] and other iterative techniques [14,45] have reduced the runtime gap between iterative and weight sharing techniques. For detailed surveys on NAS, we suggest referring to [13,71].…”
Section: Related Work
confidence: 99%