2022
DOI: 10.48550/arxiv.2205.00671
Preprint

Jack and Masters of All Trades: One-Pass Learning of a Set of Model Sets from Foundation Models

Abstract: For deep learning, size is power. Massive neural nets trained on broad data for a spectrum of tasks are at the forefront of artificial intelligence. These foundation models, or 'Jacks of All Trades' (JATs), when fine-tuned for downstream tasks, are gaining importance in driving deep learning advancements. However, environments with tight resource constraints, changing objectives and intentions, or varied task requirements could limit the real-world utility of a singular JAT. Hence, in tandem with current trend…

Cited by 1 publication
References 46 publications