2023
DOI: 10.1109/access.2023.3234810

Multi-Fidelity Neural Architecture Search With Knowledge Distillation

Abstract: Neural architecture search (NAS) targets at finding the optimal architecture of a neural network for a problem or a family of problems. Evaluations of neural architectures are very time-consuming. One of the possible ways to mitigate this issue is to use low-fidelity evaluations, namely training on a part of a dataset, fewer epochs, with fewer channels, etc. In this paper, we propose a Bayesian multi-fidelity method for neural architecture search: MF-KD. The method relies on a new approach to low-fidelity eval…
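The low-fidelity idea in the abstract — estimating an architecture's quality cheaply by training on a fraction of the data, for fewer epochs, or with fewer channels — can be sketched as follows. This is a minimal illustrative sketch, not the paper's actual MF-KD method; the fidelity parameters, cost model, and function names are all assumptions made for illustration.

```python
# Illustrative fidelity levels: each trades estimate quality for evaluation
# cost. The specific fractions/epochs/scales are hypothetical, not from MF-KD.
FIDELITIES = [
    {"data_fraction": 0.1, "epochs": 5,  "channel_scale": 0.25},  # cheapest
    {"data_fraction": 0.5, "epochs": 20, "channel_scale": 0.5},
    {"data_fraction": 1.0, "epochs": 90, "channel_scale": 1.0},   # full fidelity
]

def evaluation_cost(fidelity):
    # Rough cost proxy: grows with data seen, epochs run, and channels kept
    # (compute scales roughly quadratically with channel width).
    return (fidelity["data_fraction"] * fidelity["epochs"]
            * fidelity["channel_scale"] ** 2)

def cheap_rank(architectures, score_fn, fidelity):
    # Rank candidates by a low-fidelity score; the multi-fidelity assumption
    # is that this cheap ranking correlates with the full-fidelity one.
    return sorted(architectures, key=lambda a: score_fn(a, fidelity),
                  reverse=True)
```

A search procedure would screen many candidates at the cheap fidelities and spend full-fidelity training only on the survivors.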

Cited by 6 publications (1 citation statement)
References 53 publications
“…This approach minimizes the need to evaluate multiple models, thereby accelerating the search process. Knowledge distillation without an ensemble of teachers was applied for NAS in [25] for image classification.…”
Section: Search Methods
Confidence: 99%