Online Continual Learning on Sequences (2020)
DOI: 10.1007/978-3-030-43883-8_8
Abstract: Online continual learning (OCL) refers to the ability of a system to learn over time from a continuous stream of data without having to revisit previously encountered training samples. Learning continually in a single data pass is crucial for agents and robots operating in changing environments and required to acquire, fine-tune, and transfer increasingly complex representations from non-i.i.d. input distributions. Machine learning models that address OCL must alleviate catastrophic forgetting in which hidden …

Cited by 19 publications (12 citation statements)
References 66 publications
“…They also recommend some desiderata and guidelines for future CL research. [27] emphasizes the importance of online CL and discusses recent advances in this setting. Although these three surveys descriptively review the recent development of CL and provide practical guidelines, they do not perform any empirical comparison between methods.…”
Section: Related Work
Mentioning confidence: 99%
“…We consider the supervised image classification problem with an online (potentially infinite) non-i.i.d. stream of data, following the recent CL literature [17,18,27,14]. Formally, we define a data stream of unknown distributions D = {D_1, …}.…”
Section: Problem Definition and Evaluation Settings
Mentioning confidence: 99%
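The single-pass, non-i.i.d. stream described in the excerpt above can be sketched as follows; the toy task datasets and the `make_stream` helper are hypothetical illustrations, not code from the cited papers:

```python
import random

def make_stream(tasks, seed=0):
    """Yield (x, y) pairs task-by-task: a single-pass, non-i.i.d. stream.

    `tasks` is a list of datasets, one per distribution D_t in the
    stream D = {D_1, ...}; every sample is yielded exactly once, so a
    learner consuming this generator cannot revisit earlier samples.
    """
    rng = random.Random(seed)
    for task in tasks:
        samples = list(task)
        rng.shuffle(samples)  # i.i.d. within a task, non-i.i.d. overall
        for sample in samples:
            yield sample

# Two toy "distributions": all class-0 samples arrive before class 1.
tasks = [[(x, 0) for x in range(5)], [(x, 1) for x in range(5)]]
labels = [y for _, y in make_stream(tasks)]
```

Because the distributions arrive in sequence, the label order exposes exactly the non-stationarity that makes OCL hard: every class-0 sample is gone before class 1 appears.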
“…We define and evaluate the proactive training for continuously training DL models. In a setting different from ours, this problem is similar to continual learning [21][22][23][24]. In that framework, a model learns tasks one after another with the goal of learning the latest task without catastrophic forgetting [25], that is, losing the ability to perform on previous tasks.…”
Section: Related Work
Mentioning confidence: 99%
“…In this section, we discuss the recent state-of-the-art OCL strategies shown in Fig. 2, which are developed from combinations of three basic CL strategies, i.e., regularization, architectural, and rehearsal [8]. We also summarize the key characteristics of these methods in Table 1.…”
Section: Online Continual Learning Strategies
Mentioning confidence: 99%
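As an illustration of the rehearsal strategy mentioned above, a minimal replay buffer based on reservoir sampling (a common choice in the OCL literature; this is a sketch, not a specific method from the survey) might look like:

```python
import random

class ReplayBuffer:
    """Fixed-capacity rehearsal buffer filled by reservoir sampling,
    so every sample seen so far is retained with equal probability."""

    def __init__(self, capacity, seed=0):
        self.capacity = capacity
        self.rng = random.Random(seed)
        self.data = []
        self.seen = 0

    def add(self, sample):
        """Insert one stream sample, evicting uniformly at random once full."""
        self.seen += 1
        if len(self.data) < self.capacity:
            self.data.append(sample)
        else:
            # Keep the new sample with probability capacity / seen.
            j = self.rng.randrange(self.seen)
            if j < self.capacity:
                self.data[j] = sample

    def sample(self, k):
        """Draw up to k stored samples for interleaved rehearsal."""
        return self.rng.sample(self.data, min(k, len(self.data)))

buf = ReplayBuffer(capacity=10)
for i in range(100):  # a toy stream of 100 samples
    buf.add(i)
```

During training, a batch drawn with `buf.sample(k)` would be mixed with the incoming batch so that gradient updates rehearse earlier distributions alongside the current one.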
“…A major challenge in realizing a real-world autonomous system capable of continuously learning and adapting over time is to prevent catastrophic forgetting. The learning model needs to maintain a balance between plasticity (the ability to adapt to new knowledge) and stability (the ability to retain prior knowledge) [7], [8]. Excessive plasticity can cause forgetting of previously learned information while learning a new task.…”
Section: Introduction
Mentioning confidence: 99%