2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
DOI: 10.1109/cvpr42600.2020.01238

Meta-Learning of Neural Architectures for Few-Shot Learning

Cited by 111 publications (55 citation statements)
References 14 publications

“…Pasunuru et al. [295] introduced a continual architecture search (CAS) approach enabling lifelong learning. In addition, as the core idea of AutoML is to learn to learn, it is natural to find a growing body of research that combines meta-learning and AutoML, particularly for NAS improvement [296,297]. AutoML has also been studied in few-shot learning scenarios; for instance, Elsken et al. [297] applied NAS to few-shot learning to overcome data scarcity, although they search only for the most promising architecture and optimize it to work across multiple few-shot learning tasks.…”
Section: Discussion (mentioning)
confidence: 99%
“…In addition, as the core idea of AutoML is to learn to learn, it is natural to find a growing body of research that combines meta-learning and AutoML, particularly for NAS improvement [296,297]. AutoML has also been studied in few-shot learning scenarios; for instance, Elsken et al. [297] applied NAS to few-shot learning to overcome data scarcity, although they search only for the most promising architecture and optimize it to work across multiple few-shot learning tasks. Recently, the idea of unsupervised AutoML has begun to be explored: Liu et al. [298] proposed a general problem setup, namely unsupervised neural architecture search (UnNAS), to explore whether labels are necessary for NAS.…”
Section: Discussion (mentioning)
confidence: 99%
“…Meta-training across multiple datasets may lead to improved cross-task generalization of architectures [131]. Finally, one can also define NAS meta-objectives to train an architecture suitable for few-shot learning [243]. Similar to fast-adapting initial-condition meta-learning approaches such as MAML [19], one can train good initial architectures [130] or architecture priors [131] that are easy to adapt to specific tasks.…”
Section: Neural Architecture Search (NAS) (mentioning)
confidence: 99%
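The preceding excerpt summarizes the recipe these cited works build on: a MAML-like inner loop adapts both the network weights and the architecture parameters on a task's support set, while an outer loop meta-learns a shared initialization across tasks. Below is a minimal sketch of that idea, assuming PyTorch ≥ 2.0; the names MixedOp, sample_task, and inner_adapt, as well as all hyperparameters, are illustrative stand-ins and not the implementations of [19], [130], [131], or [243].

```python
# Minimal sketch (not the cited papers' implementations): MAML-style meta-learning
# of both network weights and DARTS-style architecture parameters.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MixedOp(nn.Module):
    """DARTS-style mixed operation: softmax-weighted sum of candidate ops.

    The mixture logits `alpha` are the architecture parameters; here they are
    meta-learned alongside the ordinary weights."""

    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.ops = nn.ModuleList([
            nn.Linear(in_dim, out_dim),                            # candidate op 1
            nn.Sequential(nn.Linear(in_dim, out_dim), nn.ReLU()),  # candidate op 2
        ])
        self.alpha = nn.Parameter(torch.zeros(len(self.ops)))      # architecture logits

    def forward(self, x):
        w = F.softmax(self.alpha, dim=0)
        return sum(wi * op(x) for wi, op in zip(w, self.ops))


def sample_task(n_way=5, k_shot=1, n_query=5, dim=32):
    """Toy episode generator with random data; a stand-in for a real few-shot sampler."""
    support_x = torch.randn(n_way * k_shot, dim)
    support_y = torch.arange(n_way).repeat_interleave(k_shot)
    query_x = torch.randn(n_way * n_query, dim)
    query_y = torch.arange(n_way).repeat_interleave(n_query)
    return support_x, support_y, query_x, query_y


def inner_adapt(model, x, y, steps=1, lr=0.1):
    """MAML inner loop: a few gradient steps on the support set, updating weights
    AND architecture parameters functionally so the meta-gradient can flow back
    to the shared initialization."""
    params = dict(model.named_parameters())
    for _ in range(steps):
        loss = F.cross_entropy(torch.func.functional_call(model, params, (x,)), y)
        grads = torch.autograd.grad(loss, list(params.values()), create_graph=True)
        params = {name: p - lr * g for (name, p), g in zip(params.items(), grads)}
    return params


# Outer loop: meta-learn an initialization of weights + architecture across tasks.
model = nn.Sequential(MixedOp(32, 64), nn.ReLU(), nn.Linear(64, 5))
meta_opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for it in range(100):
    sx, sy, qx, qy = sample_task()
    adapted = inner_adapt(model, sx, sy)
    meta_loss = F.cross_entropy(torch.func.functional_call(model, adapted, (qx,)), qy)
    meta_opt.zero_grad()
    meta_loss.backward()
    meta_opt.step()
```

The key design choice in this sketch is that the architecture logits `alpha` are ordinary model parameters, so the same second-order MAML update that yields fast-adapting weights also yields a fast-adapting architecture prior.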
“…Currently, active research areas include the use of meta-learning techniques to improve the sample efficiency of AutoML-based approaches [212,213], the development of better search strategies [142,143], and computationally efficient techniques for model generation [214,215]. In the foreseeable future, however, better NAS algorithms and improvements in hardware performance, especially cloud-based computational resources, will enable the development of NAS techniques that can be applied in challenging domains to solve demanding machine vision problems.…”
Section: Emerging Trends and Future Research Directions (mentioning)
confidence: 99%