2022
DOI: 10.48550/arxiv.2205.13147
Preprint

Matryoshka Representations for Adaptive Deployment

Abstract: Learned representations are a central component in modern ML systems, serving a multitude of downstream tasks. When training such representations, it is often the case that computational and statistical constraints for each downstream task are unknown. In this context, rigid fixed-capacity representations can be either over or under-accommodating to the task at hand. This leads us to ask: can we design a flexible representation that can adapt to multiple downstream tasks with varying computational resources? O…
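
The abstract describes a single learned representation whose nested prefixes can serve downstream tasks with different compute budgets. Below is a minimal PyTorch sketch of that nested-granularity training idea; the embedding dimension, the nesting sizes, and the separate per-prefix linear heads are illustrative assumptions, not the authors' exact recipe.

```python
# Sketch: train one embedding so that every nested prefix remains usable,
# in the spirit of the adaptive "matryoshka" representations in the abstract.
import torch
import torch.nn as nn
import torch.nn.functional as F

class NestedClassifier(nn.Module):
    def __init__(self, embed_dim=2048, num_classes=1000,
                 nest_dims=(8, 16, 64, 256, 2048)):  # assumed granularities
        super().__init__()
        self.nest_dims = nest_dims
        # One linear head per nested prefix of the embedding (an assumption;
        # heads could also share weights).
        self.heads = nn.ModuleList([nn.Linear(m, num_classes) for m in nest_dims])

    def forward(self, z):
        # z: (batch, embed_dim) embedding from any backbone.
        return [head(z[:, :m]) for m, head in zip(self.nest_dims, self.heads)]

def nested_loss(logits_list, labels):
    # Sum the task loss over all granularities so every prefix stays useful.
    return sum(F.cross_entropy(logits, labels) for logits in logits_list)
```

At deployment, the embedding can then be truncated to the smallest prefix that meets the accuracy/compute budget, e.g. `z[:, :64]` paired with the matching head.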


Cited by 1 publication (1 citation statement)
References 58 publications
“…As suggested in previous studies [31], we train all the models using the efficient dataloaders of FFCV [32]. We train the models for 40 epochs with the batch size of 512 on ImageNet-1K, and for 88 epochs with the batch size of 512 on ImageNet-100.…”
Section: B. Setup for Training Image Classifiers
Citation type: mentioning
Confidence: 99%
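
The cited setup trains image classifiers with FFCV dataloaders at a batch size of 512. A minimal sketch of such a loader is shown below, assuming an ImageNet split already written to FFCV's .beton format; the file path, worker count, crop size, and normalization constants are assumptions, not the citing paper's configuration.

```python
# Sketch of an FFCV training loader at the cited batch size of 512.
import numpy as np
import torch
from ffcv.loader import Loader, OrderOption
from ffcv.transforms import ToTensor, ToDevice, ToTorchImage, NormalizeImage, Squeeze
from ffcv.fields.decoders import IntDecoder, RandomResizedCropRGBImageDecoder

# Standard ImageNet statistics, scaled to 0-255 pixel range (an assumption).
IMAGENET_MEAN = np.array([0.485, 0.456, 0.406]) * 255
IMAGENET_STD = np.array([0.229, 0.224, 0.225]) * 255
device = torch.device("cuda:0")

train_loader = Loader(
    "imagenet1k_train.beton",   # assumed path to a pre-written FFCV dataset
    batch_size=512,             # batch size reported in the citation statement
    num_workers=8,              # assumed worker count
    order=OrderOption.RANDOM,
    pipelines={
        "image": [
            RandomResizedCropRGBImageDecoder((224, 224)),
            ToTensor(),
            ToDevice(device, non_blocking=True),
            ToTorchImage(),
            NormalizeImage(IMAGENET_MEAN, IMAGENET_STD, np.float32),
        ],
        "label": [IntDecoder(), ToTensor(), Squeeze(), ToDevice(device)],
    },
)
```

The training loop then iterates this loader for 40 epochs on ImageNet-1K (or 88 epochs on ImageNet-100), per the citing paper's reported schedule.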