2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
DOI: 10.1109/cvpr.2019.01199
Task Agnostic Meta-Learning for Few-Shot Learning

Abstract: Meta-learning approaches have been proposed to tackle the few-shot learning problem. Typically, a meta-learner is trained on a variety of tasks in the hope of generalizing to new tasks. However, the generalizability of a meta-learner on new tasks can be fragile when it is over-trained on existing tasks during the meta-training phase. In other words, the initial model of a meta-learner could be too biased towards existing tasks to adapt to new tasks, especially when only very few examples are available to…
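The truncated abstract points at the paper's core idea: regularize the meta-learned initialization so it does not favour the training tasks. As a rough illustration, the sketch below adds an entropy-maximization term on the initial (pre-adaptation) predictions to a MAML-style outer loop, one reading of the task-agnostic regularization described here; the toy task generator, linear model, and hyperparameters are assumptions made purely to keep the example runnable, not the authors' setup.

```python
# Hedged sketch: entropy-regularized, MAML-style meta-training.
# Task sampling, architecture, and hyperparameters are illustrative assumptions.
import torch
import torch.nn.functional as F

def forward(x, w, b):
    """Tiny linear classifier; stands in for the few-shot backbone."""
    return x @ w + b

def sample_task(n_way=5, k_shot=1, dim=32):
    """Hypothetical task generator: random Gaussian class prototypes."""
    protos = torch.randn(n_way, dim)
    xs = protos.repeat_interleave(k_shot, 0) + 0.1 * torch.randn(n_way * k_shot, dim)
    ys = torch.arange(n_way).repeat_interleave(k_shot)
    xq = protos.repeat_interleave(15, 0) + 0.1 * torch.randn(n_way * 15, dim)
    yq = torch.arange(n_way).repeat_interleave(15)
    return xs, ys, xq, yq

n_way, dim, inner_lr, meta_lr, lam = 5, 32, 0.4, 1e-2, 0.1
w0 = torch.zeros(dim, n_way, requires_grad=True)   # shared initialization
b0 = torch.zeros(n_way, requires_grad=True)
meta_opt = torch.optim.Adam([w0, b0], lr=meta_lr)

for step in range(100):
    meta_opt.zero_grad()
    meta_loss = 0.0
    for _ in range(4):                               # meta-batch of tasks
        xs, ys, xq, yq = sample_task(n_way, 1, dim)
        # Entropy of the *initial* model's predictions on this task: maximizing it
        # pushes the initialization to stay unbiased (task-agnostic) before adaptation.
        p0 = F.softmax(forward(xs, w0, b0), dim=1)
        entropy0 = -(p0 * p0.clamp_min(1e-8).log()).sum(1).mean()
        # One inner gradient step from the shared initialization (MAML-style).
        inner_loss = F.cross_entropy(forward(xs, w0, b0), ys)
        gw, gb = torch.autograd.grad(inner_loss, (w0, b0), create_graph=True)
        w1, b1 = w0 - inner_lr * gw, b0 - inner_lr * gb
        # Outer objective: query loss after adaptation, minus the entropy bonus.
        meta_loss = meta_loss + F.cross_entropy(forward(xq, w1, b1), yq) - lam * entropy0
    (meta_loss / 4).backward()
    meta_opt.step()
```

Swapping the toy task generator for episodic sampling from a real few-shot benchmark leaves the outer loop unchanged; only `sample_task` and `forward` would need to change.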

Cited by 357 publications (126 citation statements)
References 8 publications (4 reference statements)
“…The development of the method is of great significance to optimization-based meta-learning methods. Recently, an extended task-agnostic meta-learning algorithm was proposed to enhance the generalizability of the meta-learner across a variety of tasks; it achieves outstanding performance on few-shot classification and reinforcement learning tasks [179].…”
Section: Optimization in Meta-Learning (mentioning)
confidence: 99%
“…Gradient-descent-based methods:

Method          1-shot              5-shot
MAML [3]        48.70 ± 1.84%       63.11 ± 0.92%
Reptile [33]    49.97 ± 0.32%       65.99 ± 0.58%
meta-SGD [20]   50.47 ± 1.87%       64.03 ± 0.94%
LEO [6]         61.76 ± 0.15%       77.46 ± 0.12%
MTL [7]         61.20 ± 1.80%       75.50 ± 0.80%
TAML [21]       51.73 ± 1.88%       66.05 ± 0.85%…”
Section: Metric Based (mentioning)
confidence: 99%
“…Based on MAML, probabilistic model-agnostic meta-learning [17] introduces a probabilistic treatment of the initial model, while Kim [18] and Gupta [19] proposed Bayesian extensions of MAML that relate the posterior results to the prior data through a Bayesian prior probability. Meta-SGD [20] and Task-Agnostic Meta-Learning (TAML) [21] instead try to make the meta-learner learn how to learn better, rather than merely memorizing the initial knowledge.…”
Section: Introduction (mentioning)
confidence: 99%
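The statement above contrasts MAML's learned initialization with Meta-SGD, which additionally meta-learns the inner-loop step sizes. Below is a minimal, hedged sketch of that inner update in PyTorch; the tensor shapes, toy data, and the name `alpha` are illustrative assumptions, not code from either paper.

```python
# Hedged sketch of a Meta-SGD-style inner step [20]: alongside the initialization,
# a per-parameter learning rate "alpha" is meta-learned. Names and shapes are illustrative.
import torch
import torch.nn.functional as F

dim, n_way = 32, 5
w0 = torch.zeros(dim, n_way, requires_grad=True)            # meta-learned initialization
alpha = torch.full((dim, n_way), 0.1, requires_grad=True)   # meta-learned step sizes

xs, ys = torch.randn(n_way, dim), torch.arange(n_way)       # toy 1-shot support set
inner_loss = F.cross_entropy(xs @ w0, ys)
(g,) = torch.autograd.grad(inner_loss, (w0,), create_graph=True)

# MAML uses a fixed scalar step size here; Meta-SGD replaces it with the learned alpha,
# applied elementwise, so both w0 and alpha receive meta-gradients from the query loss.
w_adapted = w0 - alpha * g
```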
“…These systems can offer strong generalization to a corresponding target set. A successful exploitation of the above k-shot learning cases is provided by meta-learning techniques, which can be used to deliver effective solutions [6]. In this work, we propose a new classification model based on the zero-shot philosophy, named MAME-ZsL. The significant advantages of the proposed algorithm are that it reduces computational cost and training time; it avoids potential overfitting by enhancing the learning of features that do not cause exploding or vanishing gradients; and it offers improved training stability, high generalization performance and remarkable classification accuracy.…”
mentioning
confidence: 99%