2022
DOI: 10.1007/978-3-031-20044-1_25
Improving Few-Shot Learning Through Multi-task Representation Learning Theory

Cited by 1 publication (2 citation statements)
References: 16 publications
“…Multi-task learning approaches may provide a good representation for few-shot learning [6,11]. [11] proposed a theoretical analysis of learning a good common representation between source and target tasks with multi-task learning, aiming to reveal the maximum possible reduction in sample size.…”
Section: Related Work
confidence: 99%
“…[11] proposed a theoretical analysis of learning a good common representation between source and target tasks with multi-task learning, aiming to reveal the maximum possible reduction in sample size. [6] explores the framework of multi-task representation (MTR) learning, which aims to leverage source tasks to acquire a representation that reduces the sample complexity of solving a target task. The aforementioned works point out that the representation is crucial for connecting the source and target tasks, but do not provide an approach to directly evaluate the benefit of a source task for a target few-shot task.…”
Section: Related Work
confidence: 99%
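
The citation statements above describe the general MTR recipe: train a shared representation on several source tasks, then reuse it so the target few-shot task only needs to fit a small head. The following is a minimal sketch of that recipe, not the implementation from [6] or [11]; the network sizes, task counts, and synthetic data are illustrative assumptions.

# Minimal sketch (assumed setup, not the cited papers' code): a shared encoder is
# trained jointly on several source tasks with task-specific heads, then frozen
# and reused for a target task with only a few labelled examples.
import torch
import torch.nn as nn

torch.manual_seed(0)
IN_DIM, REP_DIM, N_CLASSES = 20, 16, 5

# Shared representation learned across source tasks.
encoder = nn.Sequential(nn.Linear(IN_DIM, 64), nn.ReLU(), nn.Linear(64, REP_DIM))
# One linear head per source task (here: 3 synthetic source tasks).
heads = nn.ModuleList(nn.Linear(REP_DIM, N_CLASSES) for _ in range(3))

opt = torch.optim.Adam(list(encoder.parameters()) + list(heads.parameters()), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Multi-task pre-training on synthetic source data.
source_data = [(torch.randn(200, IN_DIM), torch.randint(0, N_CLASSES, (200,)))
               for _ in heads]
for _ in range(200):
    opt.zero_grad()
    loss = sum(loss_fn(head(encoder(x)), y)
               for head, (x, y) in zip(heads, source_data))
    loss.backward()
    opt.step()

# Few-shot adaptation: freeze the representation, fit only a new target head
# on 5 examples per class.
for p in encoder.parameters():
    p.requires_grad_(False)
target_head = nn.Linear(REP_DIM, N_CLASSES)
x_few = torch.randn(5 * N_CLASSES, IN_DIM)
y_few = torch.arange(N_CLASSES).repeat(5)
opt_t = torch.optim.Adam(target_head.parameters(), lr=1e-2)
for _ in range(100):
    opt_t.zero_grad()
    loss_fn(target_head(encoder(x_few)), y_few).backward()
    opt_t.step()

The sample-complexity argument discussed in the citations corresponds to the second stage: only the low-dimensional target head is estimated from the few labelled target examples, while the high-capacity encoder is paid for by the source tasks.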