2019
DOI: 10.1007/978-3-030-11012-3_11
DeeSIL: Deep-Shallow Incremental Learning

Abstract: Incremental Learning (IL) is an interesting AI problem when the algorithm is assumed to work on a budget. This is especially true when IL is modeled using a deep learning approach, where two complex challenges arise: limited memory, which induces catastrophic forgetting, and delays related to the retraining needed in order to incorporate new classes. Here we introduce DeeSIL, an adaptation of a known transfer learning scheme that combines a fixed deep representation used as feature extractor and learning …
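Since the abstract is truncated, a minimal sketch of the deep-shallow idea it describes may help: a frozen, pre-trained CNN serves as a fixed feature extractor, and only cheap shallow classifiers are trained as new classes arrive, sidestepping deep retraining. The ResNet-18 backbone, the LinearSVC learners, and all names below are illustrative assumptions, not the authors' exact implementation.

# Minimal sketch of a deep-shallow incremental learner in the spirit of
# DeeSIL (assumed details: torchvision ResNet-18 backbone, scikit-learn
# LinearSVC as the shallow per-class learners).
import numpy as np
import torch
import torchvision.models as models
from sklearn.svm import LinearSVC

# Fixed deep representation: freeze a pre-trained CNN and drop its head.
backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
backbone.fc = torch.nn.Identity()      # expose penultimate features
backbone.eval()

@torch.no_grad()
def extract_features(images):
    """Map a batch of images (N, 3, 224, 224) to fixed 512-d features."""
    return backbone(images).numpy()

class DeepShallowIL:
    """Grows one shallow one-vs-rest classifier per class, incrementally."""
    def __init__(self):
        self.classifiers = {}          # class id -> trained LinearSVC

    def add_increment(self, feats, labels):
        """Train a linear SVM per new class on this batch only;
        the deep network is never retrained."""
        for cls in np.unique(labels):
            self.classifiers[int(cls)] = LinearSVC().fit(
                feats, (labels == cls).astype(int))

    def predict(self, feats):
        """Score against all classes seen so far and take the argmax."""
        ids = list(self.classifiers)
        scores = np.stack(
            [self.classifiers[c].decision_function(feats) for c in ids],
            axis=1)
        return [ids[i] for i in scores.argmax(axis=1)]

Because each increment only fits linear models on precomputed features, adding classes is fast and previously learned classifiers are left untouched, which is the budget-friendly property the abstract alludes to.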

Cited by 40 publications (49 citation statements)
References 17 publications
“…The performance exhibited by our approach is comparable to that of generative methods, while we rely on less restrictive hypotheses than generative methods, since they need information about all unseen classes to generate the samples required to learn the discriminative models. An alternative to the generative approach is to use incremental learning systems [29,5], but this usually leads to a significant drop in performance. Hence, our proposal has practical interest for real systems that aim to recognize unseen classes whose number increases regularly.…”
Section: Discussion (mentioning)
confidence: 99%
“…Incremental learning methods attempt to mimic the human cognitive system, that is, to learn concepts sequentially [32]. By gradually extending the acquired knowledge, incremental learning methods can adapt to data streams from different domains or tasks.…”
Section: B. Incremental Learning (mentioning)
confidence: 99%
“…We study the common continual learning paradigm in which pre-training precedes continual learning [28,29,55,10,71,32,33,34,68,4]. Formally, given a pre-training dataset {(X_i, y_i)} ∈ A, with M images X_i and their corresponding labels y_i ∈ Y, a set of parameters θ is learned for a CNN using A in an offline manner, i.e., the learner can shuffle the data to simulate independent and identically distributed data and loop over it as many times as it desires.…”
Section: Problem Formulation (mentioning)
confidence: 99%
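To make the offline regime contrasted in this formulation concrete, here is a small hypothetical sketch (standard PyTorch calls; function and parameter names are assumptions) of what "shuffle and loop over the data as many times as desired" means during pre-training:

# Hypothetical sketch of the offline pre-training phase on dataset A:
# shuffling simulates i.i.d. data, and the learner may take unlimited
# passes (epochs), unlike the subsequent continual phase.
import torch
from torch.utils.data import DataLoader

def pretrain(model, dataset_A, epochs=90, lr=0.1):
    loader = DataLoader(dataset_A, batch_size=256, shuffle=True)  # i.i.d. simulation
    opt = torch.optim.SGD(model.parameters(), lr=lr, momentum=0.9)
    loss_fn = torch.nn.CrossEntropyLoss()
    model.train()
    for _ in range(epochs):            # loop over A as often as desired
        for images, labels in loader:
            opt.zero_grad()
            loss_fn(model(images), labels).backward()
            opt.step()
    return model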
“…For all experiments, we use ResNet-18 as the CNN. For continual learning with the full-resolution ImageNet dataset, ResNet-18 has been adopted as the universal standard CNN architecture by the community [55,68,28,32,21,28,10,5,4].…”
Section: Algorithms (mentioning)
confidence: 99%