2021 International Joint Conference on Neural Networks (IJCNN)
DOI: 10.1109/ijcnn52387.2021.9534361
Memory-Efficient Semi-Supervised Continual Learning: The World is its Own Replay Buffer

Cited by 14 publications (10 citation statements); References 6 publications
“…Some variants of supervised CL exist that alleviate the need for annotated data. Annotation requirements can be reduced by restricting access to task labels, as in task-agnostic CL [48,49], or by limiting label availability, as in continual active learning [50,51] or semi-supervised continual learning [52]. As described in Sec.…”
Section: Towards a Generalization of the Default Setting (mentioning)
confidence: 99%
“…CL with missing labels. While most CL research focuses on the supervised setting, recent works consider CL where few or no labels are available [17,18,19]. However, none of them addresses the OGCL setting except the STAM architecture [20], whose authors present an online clustering method suited to unsupervised OGCL.…”
Section: Continual Learning (mentioning)
confidence: 99%
“…Even though other semi-supervised CL methods exist, none respects the OSSGCL setting and thus cannot be used in the comparison [14,18,26].…”
Section: Baselines (mentioning)
confidence: 99%
“…However, recent works challenge some of these common assumptions, such as fixed labels [6,2], uncorrelated data streams [5,42], labeled new data [16], or even the absence of task metadata and of any information about the future [11].…”
Section: Questioning Our Assumptions (mentioning)
confidence: 99%
“…As well-annotated data eases training on a task, the definition of tasks and the continual evaluation can be investigated more clearly. Note, however, that while the most common approach in continual image classification relies on full supervision, semi-supervised approaches also exist [42].…”
Section: Image Classification (mentioning)
confidence: 99%