2022
DOI: 10.48550/arxiv.2205.11071
Preprint

Self-distilled Knowledge Delegator for Exemplar-free Class Incremental Learning

Abstract: Exemplar-free incremental learning is extremely challenging due to the inaccessibility of data from old tasks. In this paper, we attempt to exploit the knowledge encoded in a previously trained classification model to handle the catastrophic forgetting problem in continual learning. Specifically, we introduce a so-called knowledge delegator, which is capable of transferring knowledge from the trained model to a randomly reinitialized new model by generating informative samples. Given the previous model only, the d…
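
The abstract describes transferring knowledge from a frozen, previously trained classifier to a randomly reinitialized model via generated samples rather than stored exemplars. The sketch below is a minimal illustration of that general data-free distillation idea under stated assumptions, not the authors' exact delegator: the Generator architecture, the temperature T, and the training step are all illustrative choices of ours.

# Minimal sketch (assumptions ours, not the paper's exact method) of distilling a
# frozen "previous" classifier into a freshly initialized one using generated
# samples instead of stored exemplars.
import torch
import torch.nn as nn
import torch.nn.functional as F

class Generator(nn.Module):
    """Maps random noise to synthetic inputs (hypothetical architecture)."""
    def __init__(self, noise_dim=64, out_dim=784):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(noise_dim, 256), nn.ReLU(),
            nn.Linear(256, out_dim), nn.Tanh(),
        )

    def forward(self, z):
        return self.net(z)

def distill_step(old_model, new_model, generator, optimizer,
                 batch=128, noise_dim=64, T=2.0):
    """One transfer step: generated samples are soft-labeled by the frozen old
    model, and the new model is trained to match those predictions."""
    old_model.eval()
    z = torch.randn(batch, noise_dim)
    with torch.no_grad():
        x = generator(z)               # synthetic "informative" samples
        teacher_logits = old_model(x)  # soft targets from the previous model
    student_logits = new_model(x)
    loss = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

In the paper's setting the generated samples come from the previous model itself (self-distillation) and the transfer must also accommodate newly arriving classes; this sketch omits both and shows only the logit-matching transfer step.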

Cited by 0 publications
References 38 publications