2020
DOI: 10.1007/978-3-030-58601-0_12

A Generic Graph-Based Neural Architecture Encoding Scheme for Predictor-Based NAS

Cited by 44 publications (34 citation statements)
References 4 publications
“…These benchmarks offer the opportunities of training predictors, i.e., the input is an encoded neural architecture and the output is the network accuracy [366]. There have been efforts of exploring the internal relationship between architectures (an effective way is to design a good encoding method for the architectures [369], [371], [380]) for better prediction performance, in particular under a limited number of sampled architectures [75], [385]. The trained predictors can inspire algorithms beyond the toy search spaces [171], [371].…”
Section: Predictor-based Search Methods (mentioning)
Confidence: 99%
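The excerpt above summarizes the predictor setup: an encoded architecture is the input and a predicted accuracy is the output. As an illustration only (not the encoding scheme of the paper indexed here), the short NumPy sketch below encodes a cell given as an adjacency matrix plus per-node operation ids with one graph-convolution step, mean-pools the node features into a fixed-size embedding, and scores it with a tiny untrained MLP; all sizes and weights are hypothetical placeholders.

# Minimal illustrative sketch; not the paper's actual encoding scheme.
import numpy as np

rng = np.random.default_rng(0)

def encode_architecture(adj, op_ids, num_ops=5, hidden=16):
    # one-hot operation features per node: shape (n, num_ops)
    x = np.eye(num_ops)[op_ids]
    # adjacency with self-loops, symmetrically normalized
    a = adj + np.eye(adj.shape[0])
    d = np.diag(1.0 / np.sqrt(a.sum(axis=1)))
    a_norm = d @ a @ d
    w = rng.normal(scale=0.1, size=(num_ops, hidden))  # placeholder weights
    h = np.maximum(a_norm @ x @ w, 0.0)  # ReLU(A_hat X W)
    return h.mean(axis=0)  # graph-level embedding, shape (hidden,)

def predict_score(embedding, hidden=16):
    # untrained two-layer head; in practice it is fit on benchmark data
    w1 = rng.normal(scale=0.1, size=(hidden, hidden))
    w2 = rng.normal(scale=0.1, size=(hidden, 1))
    logit = np.maximum(embedding @ w1, 0.0) @ w2
    return float(1.0 / (1.0 + np.exp(-logit[0])))  # squashed to (0, 1)

# toy 4-node cell: a chain 0 -> 1 -> 2 -> 3 plus a skip edge 0 -> 3
adj = np.array([[0, 1, 0, 1],
                [0, 0, 1, 0],
                [0, 0, 0, 1],
                [0, 0, 0, 0]], dtype=float)
op_ids = np.array([0, 2, 3, 1])  # hypothetical operation indices
print(predict_score(encode_architecture(adj, op_ids)))

A real predictor would train these weights on (architecture, accuracy) pairs from a benchmark and then rank unseen candidate architectures by predicted score.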
“…Predictor-based Search: [140], [191], [366], [73], [75], [210], [236], [367], [368], [369], [370], [371], [372], [373], [374], [375], [376], [377], [378], [379], [380] (Table 2: Summary of search strategies). We categorize the individual heuristic search strategies into three parts, i.e., using reinforcement learning, using evolutionary algorithms, and using other methods (e.g., Bayesian optimization, etc.).…”
Section: Individual - Reinforcement (mentioning)
Confidence: 99%
“…Recent works explored performance prediction based on architectural properties, i.e., the network topology and the model size (Liu et al., 2018; Long et al., 2019; Wen et al., 2020; Ning et al., 2020). For instance, Hardware-Aware Transformer (HAT) encoded architectures into feature vectors and predicted the latency with a Multilayer Perceptron (MLP) for the target hardware.…”
Section: Related Work (mentioning)
Confidence: 99%
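The HAT description above (flat feature vector in, latency out) maps naturally onto an off-the-shelf regressor. The sketch below is a generic stand-in, not HAT's actual pipeline: it fits scikit-learn's MLPRegressor on synthetic (feature vector, latency) pairs, with invented dimensions and random data in place of latencies profiled on the target hardware.

# Illustrative only: a flat architecture feature vector mapped to latency.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

num_archs, feat_dim = 200, 24          # hypothetical sizes
X = rng.random((num_archs, feat_dim))  # stand-in encoded architectures
y = X @ rng.random(feat_dim) + 0.05 * rng.standard_normal(num_archs)  # fake latencies

predictor = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)
predictor.fit(X[:150], y[:150])        # train on 150 sampled architectures
print(predictor.score(X[150:], y[150:]))  # R^2 on the held-out 50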
“…Also, some authors have explored techniques to speed up the performance evaluation [8,9]. The mixed search space problem has been faced from multiple perspectives, ranging from tailored encoding [33,34], and specific operations [25], to mixed (hybrid) approaches [35].…”
Section: Neural Architecture Search (mentioning)
Confidence: 99%