Proceedings of the Thirtieth International Joint Conference on Artificial Intelligence 2021
DOI: 10.24963/ijcai.2021/392
Accelerating Neural Architecture Search via Proxy Data

Abstract: Despite the increasing interest in neural architecture search (NAS), the significant computational cost of NAS is a hindrance to researchers. Hence, we propose to reduce the cost of NAS using proxy data, i.e., a representative subset of the target data, without sacrificing search performance. Even though data selection has been used across various fields, our evaluation of existing selection methods for NAS algorithms offered by NAS-Bench-1shot1 reveals that they are not always appropriate for NAS and a new se…
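The core idea in the abstract, running the search phase of NAS on a representative subset of the target data instead of the full dataset, can be illustrated with a minimal sketch. The selection rule below (class-balanced random sampling) and the helper names (cifar10_train, run_nas, Subset) are illustrative stand-ins, not the selection method the paper actually proposes:

```python
# Minimal sketch of the proxy-data idea: select a small, class-balanced
# subset of the target dataset and run the NAS search phase on it.
# The selection rule here is a placeholder; the paper evaluates and
# proposes specific selection methods.
import random
from collections import defaultdict

def class_balanced_subset(labels, ratio):
    """Pick `ratio` of the indices from each class uniformly at random."""
    by_class = defaultdict(list)
    for idx, y in enumerate(labels):
        by_class[y].append(idx)
    subset = []
    for idxs in by_class.values():
        k = max(1, int(len(idxs) * ratio))
        subset.extend(random.sample(idxs, k))
    return subset

# Hypothetical usage: search on 10% proxy data, then retrain on full data.
# labels = [y for _, y in cifar10_train]                  # assumed dataset
# proxy_idx = class_balanced_subset(labels, 0.1)
# best_arch = run_nas(Subset(cifar10_train, proxy_idx))   # assumed NAS call
```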

Citations: Cited by 9 publications (5 citation statements)
References: 0 publications
“…The current state of the art in the automatic design of CSNN architectures are the works of Kim et al [19] and AutoSNN by Na et al [32]. Both works focus on Neural Architecture Search (NAS), with an evolutionary search component implemented in AutoSNN, and attain state-of-the-art performances in the CIFAR-10, CIFAR-100 [21], and TinyImageNet datasets.…”
Section: Related Work (mentioning)
confidence: 99%
“…EcoNAS [66] explores four common reduction factors (the number of channels, the resolution of input images, the number of training epochs, and the sample ratio of the full training set) and determines which of these proxies can be used to reliably estimate the final test accuracy. Na et al [44] show that it is possible to use only a subset of the target dataset to execute NAS, and propose a novel proxy dataset selection algorithm.…”
Section: Appendices, A.1 Related Work, A.1.1 Neural Architecture Search (mentioning)
confidence: 99%
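As a reading aid for the statement above, here is a hedged sketch of EcoNAS-style cost reduction: each of the four factors scales one axis of training cost, and the cheapened configuration is used to rank candidates before the winner is retrained in full. The factor values and field names are illustrative assumptions, not the settings from [66]:

```python
# Sketch of EcoNAS-style reduced-cost proxies: each factor scales one
# axis of training cost. The multipliers below are illustrative only.
from dataclasses import dataclass, replace

@dataclass
class TrainConfig:
    channels: int = 36         # network width
    resolution: int = 32       # input image side length (pixels)
    epochs: int = 600          # training epochs
    sample_ratio: float = 1.0  # fraction of the training set used

def apply_proxy(cfg: TrainConfig, c=0.25, r=0.5, e=0.05, s=0.25) -> TrainConfig:
    """Scale the four reduction factors by the given multipliers."""
    return replace(
        cfg,
        channels=max(1, int(cfg.channels * c)),
        resolution=max(1, int(cfg.resolution * r)),
        epochs=max(1, int(cfg.epochs * e)),
        sample_ratio=cfg.sample_ratio * s,
    )

# proxy_cfg = apply_proxy(TrainConfig())  # cheap config for ranking candidates
```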
“…The family of architecture comparators replaces the deterministic evaluation of neural architectures with a relativistic approach that compares two architectures and determines which one yields better performance [14,60]. Apart from weight sharing and performance prediction, some works propose more general proxies for architecture evaluation [1,37,42,44,66]. White et al [56] offer a comprehensive survey of performance predictors in NAS, and in the Appendix, we discuss related works in more detail.…”
Section: Introduction (mentioning)
confidence: 99%
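The pairwise, relativistic evaluation mentioned in the statement above can be sketched as a small learned comparator: rather than predicting an absolute accuracy per architecture, the model predicts which of two encoded architectures is better. The encoding, dimensions, and class name below are illustrative assumptions, not the models of [14, 60]:

```python
# Sketch of an architecture comparator: given two architecture encodings,
# predict the logit of P(arch_a outperforms arch_b).
import torch
import torch.nn as nn

class PairwiseComparator(nn.Module):
    def __init__(self, arch_dim: int, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2 * arch_dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, 1),  # logit: arch_a better than arch_b
        )

    def forward(self, arch_a: torch.Tensor, arch_b: torch.Tensor) -> torch.Tensor:
        return self.net(torch.cat([arch_a, arch_b], dim=-1))
```

Trained with binary labels (1 when the first architecture outperformed the second), such a comparator can rank a candidate pool with pairwise queries instead of training every candidate to convergence.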
“…Recently, Kim et al (2022) explored a cell-based neural architecture search method on SNNs, but did not involve large-scale recurrent connections. Na et al (2022) introduced a spike-aware NAS framework called AutoSNN to investigate the impact of architectural components on SNNs' performance and energy efficiency. Overall, NAS for RSNNs is still rarely explored.…”
Section: Introduction (mentioning)
confidence: 99%