2019 IEEE/CVF International Conference on Computer Vision (ICCV)
DOI: 10.1109/iccv.2019.00067

IL2M: Class Incremental Learning With Dual Memory

Abstract: This paper presents a class incremental learning (IL) method which exploits fine tuning and a dual memory to reduce the negative effect of catastrophic forgetting in image recognition. First, we simplify the current fine tuning based approaches which use a combination of classification and distillation losses to compensate for the limited availability of past data. We find that the distillation term actually hurts performance when a memory is allowed. Then, we modify the usual class IL memory component. Simila…

Cited by 212 publications (175 citation statements)
References 19 publications
“…The architecture adaptation is data-driven: variational inference is used to learn which neurons need to be trained and the maximum adaptation applied to those neurons. The Incremental Learning With Dual Memory (IL2M) method (Belouadah and Popescu, 2019) consists of two parts: (1) a deep learning model that is incrementally trained on new samples and a constant number of representative past samples, and (2) an additional memory that stores previous task statistics, which are periodically used to rectify the network. The Growing Dual Memory (GDM) approach of Parisi et al. (2018) is an architectural and generative replay approach.…”
Section: Related Work (mentioning, confidence: 99%)
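The two-part design described in this excerpt lends itself to a short sketch. The Python below is a minimal illustration under our own assumptions, not the authors' code; the names (DualMemory, ClassStats, add_class) are hypothetical. It pairs a bounded exemplar pool with a compact per-class statistics store.

```python
from dataclasses import dataclass, field

@dataclass
class ClassStats:
    """Second memory: statistics saved when the class was first learned."""
    init_mean_score: float   # mean correct-prediction score at initial learning
    init_model_mean: float   # mean top score of the model state that learned it

@dataclass
class DualMemory:
    """First memory: a bounded pool of representative past images.
    Second memory: compact statistics later used to rectify predictions."""
    capacity: int
    exemplars: dict = field(default_factory=dict)  # class id -> list of images
    stats: dict = field(default_factory=dict)      # class id -> ClassStats

    def add_class(self, cls, images, mean_score, model_mean):
        # Keep the total exemplar budget constant by shrinking each share.
        per_class = self.capacity // (len(self.exemplars) + 1)
        for c in self.exemplars:
            self.exemplars[c] = self.exemplars[c][:per_class]
        self.exemplars[cls] = images[:per_class]
        self.stats[cls] = ClassStats(mean_score, model_mean)
```

Keeping the statistics separate from the exemplar pool is the point of the dual-memory design: the statistics cost a few floats per class, so they can be retained for every past class even when most of its images cannot.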
“…However, when there is a strong correlation between the additional classes, the results of this method are far from ideal. Class Incremental Learning With Dual Memory (IL2M) [5] proposes to construct a training set of exemplar and new class data to fine-tune the previous network. At test time, classification results are produced by rectifying the prediction scores of past classes based on stored class statistics.…”
Section: Related Work (mentioning, confidence: 99%)
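A minimal sketch of this test-time rectification, as we read it from the IL2M paper: past-class scores are scaled by the ratio of the class's initial to current mean score and by the ratio of the current to initial model confidence, and only when the raw prediction falls on a new class. Function and argument names here are our own assumptions.

```python
import numpy as np

def rectify(scores, past, init_cls_mean, curr_cls_mean,
            init_model_mean, curr_model_mean):
    """scores: raw prediction scores for one image (1-D array).
    past: indices of past classes. The *_mean arguments are stored
    statistics, per past class except curr_model_mean (a scalar)."""
    out = scores.copy()
    if int(np.argmax(scores)) in past:
        return out  # raw prediction is already a past class: leave it
    for p in past:
        out[p] = scores[p] \
            * (init_cls_mean[p] / curr_cls_mean[p]) \
            * (curr_model_mean / init_model_mean[p])
    return out
```

The first ratio compensates for the drift of each past class's scores since it was learned; the second normalizes for the overall confidence of the model state that learned it.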
“…Following the benchmark protocol of other class incremental methods [5,53], the classes are arranged in a fixed order. The first two classes are selected to train an initial network at the beginning.…”
Section: Task Setting (mentioning, confidence: 99%)
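The fixed-order protocol described in this excerpt amounts to slicing a class permutation into an initial batch plus equal-sized increments. A small, self-contained illustration (split sizes here are arbitrary examples, not from the paper):

```python
def make_tasks(class_order, first, step):
    """Split a fixed class order into an initial task plus incremental tasks."""
    tasks = [class_order[:first]]
    tasks += [class_order[i:i + step]
              for i in range(first, len(class_order), step)]
    return tasks

# e.g. 10 classes, an initial task of 2 classes, then 2 new classes per state
print(make_tasks(list(range(10)), first=2, step=2))
# [[0, 1], [2, 3], [4, 5], [6, 7], [8, 9]]
```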