2021
DOI: 10.48550/arxiv.2111.01549
Preprint

Overcoming Catastrophic Forgetting in Incremental Few-Shot Learning by Finding Flat Minima

Abstract: This paper considers incremental few-shot learning, which requires a model to continually recognize new categories with only a few examples provided. Our study shows that existing methods severely suffer from catastrophic forgetting, a well-known problem in incremental learning, which is aggravated by data scarcity and imbalance in the few-shot setting. Our analysis further suggests that to prevent catastrophic forgetting, action needs to be taken in the primitive stage: the training of base classes, instead of …
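The abstract is truncated here, but the title indicates that the method seeks flat minima of the base-training objective so that later few-shot updates stay inside a low-loss region. As a rough, self-contained sketch of one generic flat-minima-seeking technique (randomized weight perturbation with gradient averaging), the snippet below is an illustration only, not the authors' exact procedure; the model, sigma, and n_samples are assumed values.

```python
import torch
import torch.nn as nn

def flat_minima_step(model, loss_fn, x, y, optimizer, sigma=0.01, n_samples=4):
    """Average gradients over randomly perturbed copies of the weights,
    nudging optimization toward flat regions of the loss surface."""
    optimizer.zero_grad()
    params = [p for p in model.parameters() if p.requires_grad]
    for _ in range(n_samples):
        # Add Gaussian noise to every weight, backprop, then restore weights.
        noise = [torch.randn_like(p) * sigma for p in params]
        with torch.no_grad():
            for p, z in zip(params, noise):
                p.add_(z)
        loss = loss_fn(model(x), y) / n_samples
        loss.backward()  # .grad buffers accumulate across the perturbed copies
        with torch.no_grad():
            for p, z in zip(params, noise):
                p.sub_(z)
    optimizer.step()

# Toy usage on random data (shapes and hyperparameters are arbitrary).
model = nn.Linear(64, 10)
opt = torch.optim.SGD(model.parameters(), lr=0.1)
x, y = torch.randn(32, 64), torch.randint(0, 10, (32,))
flat_minima_step(model, nn.functional.cross_entropy, x, y, opt)
```

Because the injected noise is constant during each backward pass, the accumulated gradient approximates the gradient of the expected loss over a neighborhood of the current weights, which penalizes sharp minima.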

Cited by 1 publication (1 citation statement)
References 22 publications
“…However, as data-driven algorithms rely on the type, scale, and quality of training data, maintaining coherence, generality, and adaptability across different tasks and environments is a great challenge. The challenge for AIS addressed in this paper is the ability to remember previous tasks when learning new ones, i.e., to avoid catastrophic forgetting (Shi et al., 2021). Catastrophic forgetting refers to the phenomenon where a neural network loses previously learned information after training on subsequent tasks, resulting in a drastic performance drop on those earlier tasks (Serra et al., 2018).…”
Section: Introduction
Citation type: mentioning (confidence: 99%)
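To make the quoted definition concrete, here is a minimal sketch that reproduces the effect on synthetic data (the two "tasks", the architecture, and all hyperparameters are invented for illustration and come from neither paper): sequentially fine-tuning on a second task typically collapses accuracy on the first.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

def make_task(shift):
    # Hypothetical binary task; the input distribution moves with `shift`.
    x = torch.randn(512, 20) + shift
    y = (x[:, 0] > shift).long()
    return x, y

def accuracy(model, x, y):
    with torch.no_grad():
        return (model(x).argmax(1) == y).float().mean().item()

def train(model, x, y, steps=200):
    opt = torch.optim.SGD(model.parameters(), lr=0.1)
    for _ in range(steps):
        opt.zero_grad()
        nn.functional.cross_entropy(model(x), y).backward()
        opt.step()

model = nn.Sequential(nn.Linear(20, 32), nn.ReLU(), nn.Linear(32, 2))
xa, ya = make_task(0.0)
xb, yb = make_task(4.0)

train(model, xa, ya)
print("task A accuracy after training on A:", accuracy(model, xa, ya))
train(model, xb, yb)  # sequential training on task B overwrites task A
print("task A accuracy after training on B:", accuracy(model, xa, ya))
```

With no replay or regularization, the second print typically shows near-chance accuracy on task A, which is exactly the drastic performance drop the citing authors describe.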