2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW)
DOI: 10.1109/cvprw50498.2020.00131
Rehearsal-Free Continual Learning over Small Non-I.I.D. Batches

Cited by 41 publications (38 citation statements)
References 18 publications
“…In general, this approach shows satisfactory performance on problems that involve few tasks; however, as the number of tasks increases, accumulated drift in weight values and interference among them make this approach difficult to scale. Alternatives that decrease the interference between tasks are presented in [35], [39], where the authors propose to completely freeze previously trained weights, eliminating interference but inhibiting information transfer between tasks.…”
Section: A. Catastrophic Forgetting (mentioning)
Confidence: 99%
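The freezing strategy mentioned in the statement above can be illustrated with a minimal PyTorch-style sketch. The model and helper names below are hypothetical, and this is not the exact method of [35] or [39]: it only shows the basic trade-off of disabling gradients on previously trained parameters, which removes interference from later tasks but also stops any further adaptation of those weights.

```python
import torch
import torch.nn as nn

# Minimal sketch of hard parameter freezing, assuming a simple
# backbone-plus-head network (hypothetical architecture).
class FrozenBackboneNet(nn.Module):
    def __init__(self, in_dim=128, hidden=256, n_classes=10):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
        )
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, x):
        return self.head(self.backbone(x))

def freeze_previous_weights(module: nn.Module):
    """Disable gradients for already-trained parameters.

    Freezing eliminates interference from later tasks, but it also
    blocks further information transfer into these weights.
    """
    for p in module.parameters():
        p.requires_grad = False

model = FrozenBackboneNet()
# ... train on the first task here ...
freeze_previous_weights(model.backbone)  # backbone is now fixed
# Subsequent tasks only update the parameters that still require grad.
optimizer = torch.optim.SGD(
    [p for p in model.parameters() if p.requires_grad], lr=0.01
)
```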
“…Even though these scenarios propose alternative learning settings, none of them really addresses and discusses the assumptions and constraints that characterize the class-incremental scenario. In Section 3, we will present a few notable exceptions (Stojanov et al, 2019; Lomonaco et al, 2020; Thai et al, 2021) that work on continual learning scenarios for classification based on the repetition of previously seen classes. We now turn our attention to some of the limitations caused by the no-repetition constraint of class-incremental learning.…”
Section: Class-Incremental Scenarios (mentioning)
Confidence: 99%
“…Their scenario with repeated exposures made it possible to bridge the gap with joint-training performance without any explicit replay of previous patterns, relying only on the natural replay occurring in the environment. The work by Lomonaco et al. (2020) proposes a flexible setup for CIR (see Figure 1 for a depiction of the resulting scenario). The authors based their experiments on the New Instances and Classes (NIC) continual learning scenario together with the CORe50 dataset, both introduced in Lomonaco and Maltoni (2017).…”
Section: Class-Incremental With Repetition Scenarios (mentioning)
Confidence: 99%
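As a rough illustration of the NIC-style stream described in the statement above, the sketch below builds a sequence of small non-i.i.d. batches in which previously seen classes can reappear with new instances. All names here (`make_nic_like_stream`, the toy data) are hypothetical, and the actual CORe50/NIC protocol defined in Lomonaco and Maltoni (2017) is more structured than this random construction.

```python
import random
from collections import defaultdict

def make_nic_like_stream(samples_by_class, n_batches=20,
                         classes_per_batch=2, seed=0):
    """Sketch of a class-incremental-with-repetition stream.

    `samples_by_class` maps class id -> list of instance ids.
    Each batch contains a few classes; a class may be new or a
    repetition of an already-seen class with fresh instances.
    """
    rng = random.Random(seed)
    all_classes = list(samples_by_class)
    seen = []                    # classes already introduced
    cursors = defaultdict(int)   # next unused instance per class
    stream = []
    for _ in range(n_batches):
        batch_classes = []
        for _ in range(classes_per_batch):
            unseen = [c for c in all_classes if c not in seen]
            # With some probability, repeat an old class (repetition);
            # otherwise introduce a new one, if any remain.
            if seen and (not unseen or rng.random() < 0.5):
                batch_classes.append(rng.choice(seen))
            else:
                c = rng.choice(unseen)
                seen.append(c)
                batch_classes.append(c)
        batch = []
        for c in batch_classes:
            pool = samples_by_class[c]
            i = cursors[c] % len(pool)
            batch.append((c, pool[i]))  # a fresh instance of class c
            cursors[c] += 1
        stream.append(batch)
    return stream

# Toy usage: 5 classes with 4 instances each.
toy = {c: [f"c{c}_img{i}" for i in range(4)] for c in range(5)}
for t, b in enumerate(make_nic_like_stream(toy, n_batches=6)):
    print(f"batch {t}: {b}")
```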