ESANN 2022 Proceedings
DOI: 10.14428/esann/2022.es2022-4

Tutorial - Continual Learning beyond classification

Abstract: Continual Learning (CL, sometimes also termed incremental learning) is a flavor of machine learning where the usual assumption of a stationary data distribution is relaxed or omitted. When naively applying, e.g., DNNs to CL problems, changes in the data distribution can cause the so-called catastrophic forgetting (CF) effect: an abrupt loss of previous knowledge. Although many significant contributions to enabling CL have been made in recent years, most works address supervised (classification) problems. This ar…
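As a concrete illustration of the CF effect the abstract describes, the following minimal sketch trains a small network on one synthetic task and then naively fine-tunes it on a second task with a conflicting decision boundary; accuracy on the first task typically collapses. The data, architecture, and hyperparameters are illustrative assumptions, not material from the tutorial itself.

```python
# Minimal sketch of catastrophic forgetting on synthetic 2-D data (PyTorch).
# Everything here is an illustrative assumption, not the tutorial's setup.
import torch
import torch.nn as nn

torch.manual_seed(0)

def make_task(mean0, mean1, n=200):
    # Two Gaussian blobs per task; the means determine the decision boundary.
    x = torch.cat([torch.randn(n, 2) + torch.tensor(mean0),
                   torch.randn(n, 2) + torch.tensor(mean1)])
    y = torch.cat([torch.zeros(n, dtype=torch.long),
                   torch.ones(n, dtype=torch.long)])
    return x, y

def train(model, x, y, epochs=200):
    opt = torch.optim.Adam(model.parameters(), lr=1e-2)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss_fn(model(x), y).backward()
        opt.step()

def accuracy(model, x, y):
    with torch.no_grad():
        return (model(x).argmax(dim=1) == y).float().mean().item()

model = nn.Sequential(nn.Linear(2, 32), nn.ReLU(), nn.Linear(32, 2))

# Task A separates classes along the x-axis, task B along the y-axis,
# so naively fine-tuning on B overwrites what was learned for A.
xa, ya = make_task([-3.0, 0.0], [3.0, 0.0])
xb, yb = make_task([0.0, -3.0], [0.0, 3.0])

train(model, xa, ya)
print("task A accuracy after A:", accuracy(model, xa, ya))  # ~1.0
train(model, xb, yb)  # naive sequential training, no CL mechanism
print("task A accuracy after B:", accuracy(model, xa, ya))  # typically near chance
```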

Cited by 3 publications (4 citation statements)
References 68 publications (82 reference statements)
“…Furthermore, unlike conventional statistical methods that are static, ML can improve over time when new data are provided, a process known as incremental learning [39,40]. These findings demonstrate that artificial intelligence models outperform conventional statistical methods in predicting MSFN and encourage the use of ML to provide individualized risk assessment.…”
Section: Discussion (mentioning)
confidence: 76%
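As a hedged illustration of "incremental learning" in the sense this quote uses (a model improving as new data arrive rather than being refit from scratch), the sketch below uses scikit-learn's partial_fit interface on synthetic batches. It assumes a recent scikit-learn version and is not drawn from the cited papers.

```python
# Sketch of incremental updates via scikit-learn's partial_fit interface.
# Synthetic data; assumes a recent scikit-learn (loss="log_loss").
import numpy as np
from sklearn.linear_model import SGDClassifier

rng = np.random.default_rng(0)
clf = SGDClassifier(loss="log_loss", random_state=0)
classes = np.array([0, 1])

for batch in range(5):                       # data arriving over time
    X = rng.normal(size=(100, 4))
    y = (X[:, 0] > 0).astype(int)            # simple synthetic labeling rule
    clf.partial_fit(X, y, classes=classes)   # update the model in place
    print(f"batch {batch}: train acc = {clf.score(X, y):.2f}")
```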
“…3) Incremental Learning: Typical deep-learning approaches suffer from catastrophic forgetting when the model is trained on new data. In contrast, incremental learning approaches [20], [21], [22] aim to retain the performance on old categories when new categories are added incrementally. Some FSOD approaches also incorporate incremental learning techniques.…”
Section: Related Work on Training with Limited Data (mentioning)
confidence: 99%
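The quoted passage does not spell out the cited approaches [20]-[22], so the sketch below shows one common family of incremental-learning techniques, exemplar replay: keep a small reservoir of old-class samples and mix them into batches when training on new categories, so gradients keep covering old classes. All names and sizes are illustrative assumptions.

```python
# Sketch of exemplar replay with a reservoir buffer (names/sizes assumed).
import random
import torch

class ReplayBuffer:
    def __init__(self, capacity=500):
        self.capacity = capacity
        self.n_seen = 0
        self.data = []                        # (x, y) exemplars of old classes

    def add(self, x, y):
        # Reservoir sampling keeps a uniform subsample of everything seen.
        for xi, yi in zip(x, y):
            self.n_seen += 1
            if len(self.data) < self.capacity:
                self.data.append((xi, yi))
            else:
                j = random.randrange(self.n_seen)
                if j < self.capacity:
                    self.data[j] = (xi, yi)

    def sample(self, k):
        batch = random.sample(self.data, min(k, len(self.data)))
        return (torch.stack([b[0] for b in batch]),
                torch.stack([b[1] for b in batch]))

# Usage sketch: when training on a new category, mix replayed exemplars
# into each batch so old categories are not overwritten:
#   x_old, y_old = buffer.sample(32)
#   loss = loss_fn(model(torch.cat([x_new, x_old])),
#                  torch.cat([y_new, y_old]))
```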
“…'Concept drift', a term used in the field of machine learning, means that the statistical characteristics of a target variable change over time [45,46]. If the concept drift occurs in the ST ANN, re-training using the entire learning data may not reflect the concept drift.…”
Section: Incremental Learning Strategies (mentioning)
confidence: 99%
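To make the retraining decision concrete, here is a minimal sketch (not taken from the cited works) of a drift monitor: it compares the recent error rate of the deployed model against a frozen post-training baseline and flags drift, i.e., a change in the target's statistics, when the gap exceeds a threshold. The window size and threshold are illustrative assumptions.

```python
# Sketch of a drift monitor: flag drift when the recent error rate departs
# from a frozen post-training baseline. Window/threshold are assumptions.
from collections import deque

class DriftMonitor:
    def __init__(self, window=100, threshold=0.15):
        self.errors = deque(maxlen=window)   # 1 = misclassified, 0 = correct
        self.baseline = None                 # error rate right after training
        self.threshold = threshold

    def update(self, error):
        self.errors.append(error)
        rate = sum(self.errors) / len(self.errors)
        if self.baseline is None:
            if len(self.errors) == self.errors.maxlen:
                self.baseline = rate         # freeze the reference rate
            return False
        # True means the target statistics have changed: a re-training or
        # incremental update of the model is due.
        return rate - self.baseline > self.threshold
```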
“…Figure 5 shows an example of the update process from t − 1 to t. Finding the proper window length is challenging. For example, a short sliding window can lead to large differences and high variance in the next sequence, whereas a long one leads to a heavy computational load and decreased reactivity of the system [46,51]. To determine an optimal window size between these two extremes, an empirical experiment testing error statistics for update cycles from the 1st to the 14th day, at one-day intervals, has been conducted [51,52].…”
Section: Incremental Learning Strategies (mentioning)
confidence: 99%
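A minimal sketch of the sliding-window update scheme the quote describes, assuming a generic refit routine: the learner keeps only the most recent window of samples and refits on each update cycle, so the window length directly trades reactivity against variance and compute. The names and the default window length are assumptions, not the cited authors' implementation.

```python
# Sketch of a sliding-window update: refit only on the most recent samples.
# `fit_fn` and the default window length are illustrative assumptions.
from collections import deque

class SlidingWindowLearner:
    def __init__(self, fit_fn, window=7 * 24):   # e.g. one week of hourly data
        self.buffer = deque(maxlen=window)        # older samples fall out
        self.fit_fn = fit_fn                      # any refit routine
        self.model = None

    def observe(self, x, y):
        self.buffer.append((x, y))

    def update(self):
        # Short window: reactive but high-variance fits.
        # Long window: smoother fits but heavier computation, slower reaction.
        xs = [b[0] for b in self.buffer]
        ys = [b[1] for b in self.buffer]
        self.model = self.fit_fn(xs, ys)
        return self.model
```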