2020
DOI: 10.48550/arxiv.2006.07002
Preprint

Double Double Descent: On Generalization Errors in Transfer Learning between Linear Regression Tasks

Abstract: We study the transfer learning process between two linear regression problems. An important and timely special case is when the regressors are overparameterized and perfectly interpolate their training data. We examine a parameter transfer mechanism whereby a subset of the parameters of the target task solution are constrained to the values learned for a related source task. We analytically characterize the generalization error of the target task in terms of the salient factors in the transfer learning archite…
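The abstract describes a parameter transfer mechanism in which a subset of the target task's parameters is constrained to the values learned on a related source task, while the remaining parameters are fit to the target data (in the overparameterized regime, by a minimum-norm least-squares solution that interpolates the training set). The sketch below illustrates that kind of mechanism; the dimensions, data model, noise levels, and coordinate split are illustrative assumptions, not the paper's exact setup or analysis.

```python
# Minimal sketch (illustrative assumptions, not the paper's exact setup):
# freeze a subset of the target task's parameters to the source-task
# solution and fit the remaining coordinates with the minimum-norm
# least-squares solution, which interpolates the target training data
# in the overparameterized regime.
import numpy as np

rng = np.random.default_rng(0)
p, n_src, n_tgt, k = 50, 80, 30, 10   # p - k > n_tgt: free part is overparameterized

# Source task: well-determined linear regression solved by least squares.
X_src = rng.standard_normal((n_src, p))
beta_true = rng.standard_normal(p)
y_src = X_src @ beta_true + 0.1 * rng.standard_normal(n_src)
beta_src = np.linalg.pinv(X_src) @ y_src

# Target task: few samples, parameter vector related to the source's.
X_tgt = rng.standard_normal((n_tgt, p))
y_tgt = X_tgt @ (beta_true + 0.05 * rng.standard_normal(p)) \
        + 0.1 * rng.standard_normal(n_tgt)

# Parameter transfer: the first k coordinates are constrained to the source
# solution; the remaining coordinates solve the residual problem with minimum norm.
frozen, free = np.arange(k), np.arange(k, p)
residual = y_tgt - X_tgt[:, frozen] @ beta_src[frozen]
beta_free = np.linalg.pinv(X_tgt[:, free]) @ residual   # min-norm interpolator

beta_tgt = np.empty(p)
beta_tgt[frozen] = beta_src[frozen]
beta_tgt[free] = beta_free

# In this regime the target training data is interpolated (near-zero residual);
# the paper's contribution is characterizing the resulting generalization error.
print("target training residual:", np.linalg.norm(X_tgt @ beta_tgt - y_tgt))
```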

Cited by 3 publications (16 citation statements)
References 12 publications (26 reference statements)
“…However, TOPML goes well beyond the linear regression task. Overparameterization naturally arises in diverse ML tasks, such as classification (e.g., Muthukumar et al, 2020a), subspace learning for dimensionality reduction (Dar et al, 2020), data generation (Luzi et al, 2021), and dictionary learning for sparse representations (Sulam et al, 2020). In addition, overparameterization arises in various learning settings that are more complex than elementary fully supervised learning: unsupervised and semi-supervised learning (Dar et al, 2020), transfer learning (Dar and Baraniuk, 2020), pruning of learned models (Chang et al, 2021), and others.…”
Section: Contents of This Paper (mentioning)
Confidence: 99%
“…The study by Dar et al (2020) on overparameterized subspace learning was one of the first to explore interpolation phenomena beyond the realm of regression and classification. Dar et al (2020) start from considering an overparameterized version of a linear subspace fitting problem, which is commonly addressed via principal component analysis (PCA) of the training data. Recall that PCA is an unsupervised task, in contrast to the supervised nature of regression and classification.…”
Section: Overparameterized Subspace Learning via PCA (mentioning)
Confidence: 99%
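As background for the quoted passage, the sketch below shows the baseline linear subspace fitting task that is commonly addressed via PCA, in a regime with fewer samples than ambient dimensions. The dimensions and subspace rank here are illustrative assumptions, not the setting analyzed by Dar et al. (2020).

```python
# Minimal sketch of linear subspace fitting via PCA (the unsupervised
# baseline referred to above). n < d mimics an overparameterized regime;
# all dimensions are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
n, d, k = 20, 100, 5                     # fewer samples than ambient dimensions

X = rng.standard_normal((n, d))          # rows are unlabeled samples
X_centered = X - X.mean(axis=0)

# Top-k principal directions from the SVD of the centered data matrix.
_, _, Vt = np.linalg.svd(X_centered, full_matrices=False)
U_k = Vt[:k].T                           # d x k orthonormal basis of the fitted subspace

# Quality of fit: relative error of projecting the data onto the subspace.
X_proj = X_centered @ U_k @ U_k.T
rel_err = np.linalg.norm(X_centered - X_proj) / np.linalg.norm(X_centered)
print("relative reconstruction error:", rel_err)
```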