2022
DOI: 10.1109/tpami.2021.3132021

Bridging the Gap Between Few-Shot and Many-Shot Learning via Distribution Calibration

Abstract: A major gap between few-shot and many-shot learning is the data distribution empirically observed by the model during training. In few-shot learning, the learned model can easily become over-fitted based on the biased distribution formed by only a few training examples, while the ground-truth data distribution is more accurately uncovered in many-shot learning to learn a well-generalized model. In this paper, we propose to calibrate the distribution of these few-sample classes to be more unbiased to alleviate …

Cited by 19 publications (13 citation statements)
References 26 publications (37 reference statements)
“…To evaluate the performance of the proposed information fusion rectification, the results of 5-way 1-shot and 5-way 5-shot classification are compared with related image augmentation and other state-of-the-art meta-learning methods, including DC 13, MAP 14, CQD 15, LEO 22, SIB 23, E3BM 24, and S2M2 25; the feature extractor of each method is set to WideResNet.…”
Section: Comparison Results
confidence: 99%
“…Zhang et al 12 utilize an object detection algorithm to separate the foreground and background of an image, and then randomly combine the foregrounds and backgrounds of different images to generate more images. Distribution Calibration (DC) 13 uses the distribution statistics (mean and covariance) of the base-class data to calibrate the statistics of a new class and extends the features of the new class through sampling. Wu et al 14 propose to obtain the distribution statistics of a new class through the maximum a posteriori (MAP) of the base classes and extend the features of the new class through sampling.…”
Section: Data Augmentation
confidence: 99%
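The calibrate-then-sample procedure these excerpts describe (estimate a novel class's mean and covariance from nearby base classes, then draw extra features from the calibrated Gaussian) can be sketched as follows. The toy base-class statistics, the Euclidean nearest-neighbor selection, and the hyperparameters `k` and `alpha` are illustrative assumptions here, not the paper's exact settings.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy base-class statistics; in practice these would be the per-class
# mean and covariance of base-class features from a pre-trained extractor.
dim, n_base = 8, 20
base_means = rng.normal(size=(n_base, dim))
base_covs = np.stack([np.eye(dim) * rng.uniform(0.5, 1.5)
                      for _ in range(n_base)])

def calibrate_and_sample(support_feat, k=2, alpha=0.2, n_samples=100):
    """Calibrate a novel class's statistics with its k nearest base
    classes, then sample extra features from the calibrated Gaussian."""
    # Select the k base classes whose means are closest to the support feature.
    dists = np.linalg.norm(base_means - support_feat, axis=1)
    nearest = np.argsort(dists)[:k]
    # Calibrated mean: average of the support feature and selected base means.
    mean = np.vstack([base_means[nearest], support_feat]).mean(axis=0)
    # Calibrated covariance: average of the selected base covariances,
    # plus a small alpha term for extra dispersion.
    cov = base_covs[nearest].mean(axis=0) + alpha * np.eye(dim)
    return rng.multivariate_normal(mean, cov, size=n_samples)

features = calibrate_and_sample(rng.normal(size=dim))
print(features.shape)  # (100, 8)
```

The sampled features, together with the original support features, would then be used to train an ordinary classifier, which is what lets the few-shot task be treated like a many-shot one.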
“…The IFR-based feature augmentation method contains three important modules: the Tukey’s ladder of powers transformation module, the nearest-neighbor prototype matching module, and the information fusion rectification module. The Tukey’s ladder of powers transformation has been shown to be effective in reducing the deviation of the distributions by DC 24, MAP 25, and CQD 26. Hence, the Tukey’s ladder of powers transformation is adopted in the IFR-based feature augmentation method.…”
Section: Methods
confidence: 99%
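The Tukey's ladder of powers transformation mentioned here is a simple element-wise power transform applied to (non-negative) features to make their distribution less skewed and more Gaussian-like. A minimal sketch, where the choice λ = 0.5 is only an illustrative default rather than a value taken from these papers:

```python
import numpy as np

def tukey_transform(x, lam=0.5):
    """Tukey's ladder of powers: x**lam for lam != 0, log(x) for lam == 0.
    Applied element-wise; assumes non-negative inputs (positive when lam == 0)."""
    x = np.asarray(x, dtype=float)
    if lam == 0:
        return np.log(x)
    return np.power(x, lam)

feats = np.array([0.0, 1.0, 4.0, 9.0])
print(tukey_transform(feats))  # [0. 1. 2. 3.]
```

Smaller λ compresses large feature values more aggressively; λ = 1 leaves the features unchanged, which is why λ is typically tuned per dataset.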
“…Zhang et al 23 utilize an object detection algorithm to separate the foreground and background of an image, and then randomly combine the foregrounds and backgrounds of different images to generate more images. Distribution Calibration (DC) 24 uses the distribution statistics (mean and covariance) of the base-class data to calibrate the statistics of a new class and extends the features of the new class through sampling. Wu et al 25 propose to obtain the distribution statistics of a new class through the maximum a posteriori (MAP) of the base classes and extend the features of the new class through sampling.…”
Section: Introduction
confidence: 99%
“…Specifically, FSCC freezes the parameters of the feature extractor of the pre-trained model. Then, based on distribution calibration (Yang et al, 2021b), FSCC retrains the classifier with a limited number of labeled macromolecules from novel classes. Distribution calibration is a kind of domain adaptation method (Sun and Saenko, 2016).…”
Section: Introduction
confidence: 99%