2021
DOI: 10.48550/arxiv.2111.09785
Preprint

DIVA: Dataset Derivative of a Learning Task

Abstract: We present a method to compute the derivative of a learning task with respect to a dataset. A learning task is a function from a training set to the validation error, which can be represented by a trained deep neural network (DNN). The "dataset derivative" is a linear operator, computed around the trained model, that informs how perturbations of the weight of each training sample affect the validation error, usually computed on a separate validation dataset. Our method, DIVA (Differentiable Validation) hinges …
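A minimal sketch of the idea in the abstract (not the authors' implementation): treat the validation error as a function of per-sample training weights, train a simple weighted model around the nominal weights, and estimate the "dataset derivative" by finite differences. The weighted ridge model and all names here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_train, n_val, d = 20, 10, 3
true_theta = np.array([1.0, -2.0, 0.5])
X_tr = rng.normal(size=(n_train, d))
y_tr = X_tr @ true_theta + 0.1 * rng.normal(size=n_train)
X_va = rng.normal(size=(n_val, d))
y_va = X_va @ true_theta

def val_loss(sample_w, lam=1e-3):
    """Train weighted ridge regression with per-sample weights,
    return the validation mean squared error."""
    theta = np.linalg.solve(
        X_tr.T @ (sample_w[:, None] * X_tr) + lam * np.eye(d),
        X_tr.T @ (sample_w * y_tr),
    )
    return np.mean((X_va @ theta - y_va) ** 2)

# Central finite differences around the nominal weights w = 1:
# entry i approximates d(validation error)/d(weight of sample i).
w = np.ones(n_train)
eps = 1e-6
eye = np.eye(n_train)
dataset_derivative = np.array([
    (val_loss(w + eps * eye[i]) - val_loss(w - eps * eye[i])) / (2 * eps)
    for i in range(n_train)
])
```

Negative entries mark training samples whose up-weighting would reduce the validation error; positive entries mark samples that hurt it.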

Cited by 1 publication (1 citation statement) | References 46 publications (65 reference statements)
“…These algorithms can be implemented easily with the NTK tool [Novak et al 2019]. The approach of [Dukler et al 2021] treats the instance weights as the outer variable and assumes a pretrained model with a linear representation, yielding a closed-form solution for the inner-level task. Besides assuming the inner level is ridge regression or least squares, some works [Ghadimi and M. Wang 2018;…”
Section: Closed-form Update
confidence: 99%
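The closed-form update described in the statement above can be sketched as follows (an illustrative reconstruction, not code from [Dukler et al 2021]): with instance weights w as the outer variable and weighted ridge regression as the inner task, the inner solution θ*(w) is available in closed form, so the gradient of the validation loss with respect to w can be written analytically via one adjoint solve.

```python
import numpy as np

rng = np.random.default_rng(1)
n, d, lam = 15, 4, 1e-2
X = rng.normal(size=(n, d)); y = rng.normal(size=n)
Xv = rng.normal(size=(8, d)); yv = rng.normal(size=8)

def inner_solution(w):
    """Closed-form inner solution:
    theta*(w) = (X^T diag(w) X + lam I)^{-1} X^T diag(w) y."""
    A = X.T @ (w[:, None] * X) + lam * np.eye(d)
    return np.linalg.solve(A, X.T @ (w * y))

def outer_grad(w):
    """Analytic gradient of the validation MSE w.r.t. the
    instance weights w, through theta*(w)."""
    A = X.T @ (w[:, None] * X) + lam * np.eye(d)
    theta = np.linalg.solve(A, X.T @ (w * y))
    r = Xv @ theta - yv                      # validation residuals
    g_theta = 2.0 / len(yv) * Xv.T @ r       # dL/dtheta at theta*(w)
    u = np.linalg.solve(A, g_theta)          # adjoint solve: u = A^{-1} g_theta
    # Since dtheta/dw_i = A^{-1} x_i (y_i - x_i^T theta), the chain
    # rule gives dL/dw_i = (x_i^T u)(y_i - x_i^T theta):
    return (X @ u) * (y - X @ theta)

w0 = np.ones(n)
g = outer_grad(w0)
```

Because the inner problem is solved exactly, no unrolled inner-loop differentiation is needed; one extra linear solve of the same d-by-d system yields the full outer gradient.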