2021 · Preprint
DOI: 10.48550/arxiv.2111.14806

Coarse-To-Fine Incremental Few-Shot Learning

Abstract: Different from fine-tuning models pre-trained on a large-scale dataset of preset classes, class-incremental learning (CIL) aims to recognize novel classes over time without forgetting pre-trained classes. However, a given model may be challenged by test images with finer-grained classes, e.g., a basenji is at most recognized as a dog. Such images form a new training set (i.e., support set), from which the incremental model is expected to recognize a basenji (i.e., query) as a basenji next time. This paper formulates…

Cited by 1 publication (2 citation statements)
References 44 publications

“…Coarse and Fine Learning Coarse and fine learning has been an important topic in computer vision and machine learning. Numerous methods [2,7,8,26,27,29,35,40,42] and theoretical studies [10] have been proposed to address this problem, with the goal of leveraging coarse-grained labeled data to improve fine-grained recognition. On the one hand, several methods have been proposed to tackle the coarse and fine learning problem.…”
Section: Related Work
confidence: 99%
“…For instance, Stretcu et al. [26] proposed a coarse-to-fine curriculum learning method that dynamically generates training-sample sequences based on task difficulty and data distribution, thereby accelerating model convergence and improving generalization. Xiang et al. [40] proposed a coarse-to-fine incremental few-shot learning method that uses coarse-grained labels for contrastive learning on the embedding space and then uses fine-grained labels to normalize and freeze the classifier weights, thereby addressing the class-incremental problem. Sun et al. [27] developed a dynamic metric learning method that adaptively adjusts the metric space according to different semantic scales, thereby improving the performance of multi-label classification and retrieval.…”
Section: Related Work
confidence: 99%
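The "normalize and freeze the classifier weights" step attributed to Xiang et al. [40] above can be sketched in a few lines. This is a minimal illustration, not the paper's actual implementation: it assumes an embedding space has already been shaped (e.g., by contrastive learning on coarse labels), and it builds each fine class's classifier weight as the L2-normalized mean of that class's support embeddings, which is then left fixed. The function names and the cosine-similarity readout are our own illustrative choices.

```python
import numpy as np

def build_frozen_classifier(embeddings, fine_labels):
    """Build fixed classifier weights from fine-grained support embeddings.

    Each fine class gets one weight vector: the L2-normalized mean of its
    support embeddings. These weights are then frozen (never updated),
    mirroring the normalize-and-freeze idea described in the citation.
    """
    classes = np.unique(fine_labels)
    weights = []
    for c in classes:
        proto = embeddings[fine_labels == c].mean(axis=0)
        weights.append(proto / np.linalg.norm(proto))  # unit-normalize
    return classes, np.stack(weights)

def classify(query, classes, weights):
    """Assign the query embedding to the fine class whose frozen weight
    has the highest cosine similarity with it."""
    q = query / np.linalg.norm(query)
    return classes[int(np.argmax(weights @ q))]
```

Because the weights are unit-normalized class means, adding a new fine class later only appends a row to the weight matrix and cannot overwrite old classes, which is one intuition for why freezing helps the class-incremental setting.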