2020
DOI: 10.48550/arxiv.2007.05181
Preprint

Sample-based Regularization: A Transfer Learning Strategy Toward Better Generalization

Abstract: Training a deep neural network with a small amount of data is a challenging problem, as the network is vulnerable to overfitting. However, a practical difficulty we often face is collecting many samples. Transfer learning is a cost-effective solution to this problem: by using a source model trained on a large-scale dataset, the target model can alleviate the overfitting that originates from the lack of training data. Drawing on the generalization ability of the source model, several methods have proposed …

Cited by 2 publications (2 citation statements)
References 14 publications (21 reference statements)
“…The original paper showed state-of-the-art performance for DELTA on Caltech 256-30; however, they used mostly the same datasets as the original L2-SP paper [64], and for the two additional datasets they showed that L2-SP outperformed the baseline L2 regularization. It has since been shown that, like L2-SP, DELTA can also hinder performance when the source and target datasets are less similar [12,45,53].…”
Section: Regularization Based Technique Advances
confidence: 99%
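The L2-SP idea referenced above (regularizing toward the pretrained starting point rather than toward zero) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the dict-based interface, the function name, and the default `alpha`/`beta` values are assumptions; only the alpha/beta split between shared and new layers follows the L2-SP formulation.

```python
import numpy as np

def l2_sp_penalty(weights, source_weights, alpha=0.1, beta=0.01):
    """Sketch of an L2-SP-style penalty (illustrative, not the paper's code).

    weights        : dict mapping layer name -> ndarray of current weights
    source_weights : dict mapping layer name -> ndarray of pretrained weights;
                     layers missing here are treated as new, task-specific layers
    alpha          : strength of the starting-point (-SP) term
    beta           : strength of plain L2 decay on task-specific layers
    """
    penalty = 0.0
    for name, w in weights.items():
        if name in source_weights:
            # -SP term: penalize distance from the pretrained source weights
            penalty += alpha * np.sum((w - source_weights[name]) ** 2)
        else:
            # plain L2 decay on freshly initialized task-specific layers
            penalty += beta * np.sum(w ** 2)
    return penalty
```

The point of the -SP term is visible in the shared branch: pulling weights toward the source model, rather than toward zero, is what can hinder performance when source and target datasets are dissimilar, as the statement above notes.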
“…Sample-based regularization [45] proposes regularizing using the distance between the feature maps of pairs of inputs from the same class, in addition to weight regularization. The model was tested using a ResNet-50, transferring from ImageNet 1K and Places365 to a number of different fine-grained classification tasks.…”
Section: Regularization Based Technique Advances
confidence: 99%