2019
DOI: 10.1007/978-3-030-20876-9_1

Revisiting Distillation and Incremental Classifier Learning

Abstract: One of the key differences between the learning mechanism of humans and Artificial Neural Networks (ANNs) is the ability of humans to learn one task at a time. ANNs, on the other hand, can only learn multiple tasks simultaneously. Any attempts at learning new tasks incrementally cause them to completely forget about previous tasks. This lack of ability to learn incrementally, called Catastrophic Forgetting, is considered a major hurdle in building a true AI system. In this paper, our goal is to isolate the tru…
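The distillation in the title refers to the knowledge-distillation loss commonly used to preserve an old model's outputs while training on new classes. As a minimal sketch, assuming PyTorch; the temperature T and mixing weight alpha are illustrative defaults, not values taken from the paper:

```python
# Hinton-style knowledge distillation blended with new-task cross-entropy,
# the usual recipe for incremental classifier learning. Illustrative only.
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, T=2.0):
    """KL divergence between temperature-softened teacher and student outputs."""
    log_p_student = F.log_softmax(student_logits / T, dim=1)
    p_teacher = F.softmax(teacher_logits / T, dim=1)
    # Scale by T^2 so gradient magnitudes stay comparable across temperatures.
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * (T * T)

def incremental_loss(student_logits, teacher_logits, labels, alpha=0.5, T=2.0):
    """Blend new-class cross-entropy with distillation from the old model."""
    ce = F.cross_entropy(student_logits, labels)
    kd = distillation_loss(student_logits, teacher_logits, T)
    return (1 - alpha) * ce + alpha * kd
```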

Cited by 38 publications (54 citation statements) | References 23 publications
“…Since none of these networks has been proposed in a class-incremental scenario, to be as fair as possible we fine-tune them using all the data from the new classes and the limited exemplar set P available. We also compare our proposal with the basic iCaRL version (that is, Single Task and Single Classifier) and two more strategies proposed in [20], referred to as S-Classifier and GS-Classifier. Again, to ensure a fair comparison, we use XceptionNet as the feature extractor for all these methods.…”
Section: GAN-detection Results
confidence: 99%
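A minimal sketch of the fine-tuning protocol this statement describes, assuming PyTorch; the dataset objects, batch size, and optimizer settings are illustrative stand-ins, since the quoted work does not give its training code:

```python
# Fine-tune a pretrained detector on all new-class data mixed with the
# limited exemplar set P, as in the comparison protocol described above.
import torch
from torch.utils.data import ConcatDataset, DataLoader

def finetune_on_new_classes(model, new_class_dataset, exemplar_set_P,
                            epochs=10, lr=1e-4):
    """Each batch draws from both the new classes and the stored exemplars."""
    loader = DataLoader(ConcatDataset([new_class_dataset, exemplar_set_P]),
                        batch_size=32, shuffle=True)
    optimizer = torch.optim.SGD(model.parameters(), lr=lr, momentum=0.9)
    model.train()
    for _ in range(epochs):
        for images, labels in loader:
            optimizer.zero_grad()
            loss = torch.nn.functional.cross_entropy(model(images), labels)
            loss.backward()
            optimizer.step()
    return model
```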
“…Second, the Nearest-Mean-of-Exemplars (NME) classifier is used for classification, rather than a deep network's output layer learned through back-propagation. Javed et al. proposed a dynamic threshold moving (DTM) [4] algorithm. In general, threshold moving is often an effective tool for solving data-imbalance problems.…”
Section: B. Deep Learning Approaches
confidence: 99%
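The NME classifier mentioned above can be sketched as follows, assuming PyTorch and a generic feature_extractor callable; the interface and L2 normalization follow the usual iCaRL recipe, not code from the cited work. Each class is represented by the mean of its exemplars' feature vectors, and a test sample gets the label of the nearest mean:

```python
# Nearest-Mean-of-Exemplars (NME) classification: no back-propagated
# classifier head, just distances to per-class exemplar means.
import torch
import torch.nn.functional as F

def class_means(feature_extractor, exemplars_per_class):
    """exemplars_per_class: dict mapping class id -> tensor of exemplar images."""
    means = {}
    with torch.no_grad():
        for cls, images in exemplars_per_class.items():
            feats = F.normalize(feature_extractor(images), dim=1)
            means[cls] = F.normalize(feats.mean(dim=0), dim=0)
    return means

def nme_predict(feature_extractor, x, means):
    """Assign each sample in x to the class whose exemplar mean is nearest."""
    with torch.no_grad():
        feats = F.normalize(feature_extractor(x), dim=1)
        classes = list(means.keys())
        centers = torch.stack([means[c] for c in classes])  # (C, d)
        dists = torch.cdist(feats, centers)                 # (N, C)
        return [classes[i] for i in dists.argmin(dim=1).tolist()]
```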
“…To address these problems, researchers have proposed various solutions [4]–[7]. However, the problem of catastrophic forgetting persists.…”
Section: Introduction
confidence: 99%
“…In the family of single-network methods, previous works have explored regularization methods [1,4,14,18,33], parameter isolation methods [21,22], and memory rehearsal methods [3,5,12,20,27]. The regularization methods add a penalty term to the loss function to constrain the parameters when updating for new tasks.…”
Section: Introduction
confidence: 99%
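The penalty-term idea in this last statement can be sketched in the style of Elastic Weight Consolidation, one representative regularization method; the fisher importance weights and lam coefficient below are illustrative assumptions, not details from the cited works:

```python
# EWC-style quadratic penalty: parameters important to old tasks are
# anchored near their old values while the model trains on a new task.
import torch

def regularized_loss(task_loss, model, old_params, fisher, lam=100.0):
    """task_loss + (lam / 2) * sum_i F_i * (theta_i - theta_i_old)^2."""
    penalty = torch.zeros((), device=task_loss.device)
    for name, p in model.named_parameters():
        if name in old_params:
            penalty = penalty + (fisher[name] * (p - old_params[name]) ** 2).sum()
    return task_loss + (lam / 2.0) * penalty
```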