2020
DOI: 10.48550/arxiv.2005.05232
Preprint

On the Transferability of Winning Tickets in Non-Natural Image Datasets

Abstract: We study the generalization properties of pruned neural networks that are the winners of the lottery ticket hypothesis on datasets of natural images. We analyse their potential under conditions in which training data is scarce and comes from a non-natural domain. Specifically, we investigate whether pruned models found on the popular CIFAR-10/100 and Fashion-MNIST datasets generalize to seven different datasets from the fields of digital pathology and digital heritage. Our results show that…
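The "winning ticket" procedure the abstract builds on can be illustrated with a minimal magnitude-pruning sketch: keep the largest-magnitude weights after training, then rewind the survivors to their initial values. This is a toy NumPy illustration, not the paper's implementation; the function name and the example weights are hypothetical.

```python
import numpy as np

def magnitude_prune_mask(weights, sparsity):
    """Boolean mask keeping the largest-magnitude (1 - sparsity) fraction of weights."""
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)  # number of weights to prune
    if k == 0:
        return np.ones(weights.shape, dtype=bool)
    # k-th smallest magnitude becomes the pruning threshold
    threshold = np.partition(flat, k - 1)[k - 1]
    return np.abs(weights) > threshold

# Toy example: initial and trained weights of one layer
init    = np.array([[0.50, -0.10], [0.05, -0.90]])
trained = np.array([[0.70, -0.02], [0.30, -1.20]])

mask = magnitude_prune_mask(trained, sparsity=0.5)  # prune half the weights
ticket = init * mask  # rewind surviving weights to their initial values
```

Transferring a ticket across tasks, as the paper studies, amounts to reusing `mask` (and optionally `init`) when training on a new dataset.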

Cited by 4 publications (7 citation statements)
References 18 publications
“…We call lottery tickets sourced on a specific task bespoke lottery tickets. We confirm previous findings of ticket transferability [39,45], and we aim to analyze whether similarity of masks can trivially explain this property.…”
Section: Mask Similarity Across Tasks (supporting)
confidence: 85%
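The "similarity of masks" this statement refers to can be quantified in several ways; one simple option is the Jaccard overlap of two boolean pruning masks. A sketch with toy masks (not data from the cited work):

```python
import numpy as np

def mask_jaccard(mask_a, mask_b):
    """Jaccard overlap: |weights kept in both| / |weights kept in either|."""
    a, b = mask_a.astype(bool), mask_b.astype(bool)
    union = np.logical_or(a, b).sum()
    if union == 0:
        return 1.0  # two empty masks are trivially identical
    return np.logical_and(a, b).sum() / union

# Toy masks found on two different tasks
m1 = np.array([1, 1, 0, 0, 1], dtype=bool)
m2 = np.array([1, 0, 0, 1, 1], dtype=bool)
sim = mask_jaccard(m1, m2)  # intersection of 2 kept weights, union of 4
```

A high overlap would suggest mask similarity alone could explain transfer; a low overlap would not.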
“…On the other hand, lottery tickets have been shown to possess generalization properties that allow for their reuse across similar tasks, thus reducing the computational cost of finding dataset-dependent sparse sub-networks [38,45]. However, the mechanism that powers their transferability properties is not yet clearly understood, and is therefore subject to further study in this work.…”
Section: Related Work (mentioning)
confidence: 99%
“…Frankle et al. (2020a) proposed late rewinding as a solution. Morcos et al. (2019) and Sabatelli, Kestemont, and Geurts (2020) showed that LTs trained on large datasets transfer to smaller ones, but not vice versa. Mostafa and Wang (2019) proposed replacing low-magnitude parameters with random connections and reported improved generalization.…”
Section: Related Work (mentioning)
confidence: 99%
“…Soelen et al [24] show that the winning tickets from one task are transferable to another related task. Sabatelli et al [25] show further that the trained winning tickets can be transferred with minimal retraining on new tasks, and sometimes may even generalize better than models trained from scratch specifically for the new task.…”
Section: Introduction (mentioning)
confidence: 99%