2023
DOI: 10.3390/electronics12102265

An Efficient Strategy for Catastrophic Forgetting Reduction in Incremental Learning

Abstract: Deep neural networks (DNNs) have made outstanding achievements in a wide variety of domains. For deep learning tasks, large enough datasets are required for training efficient DNN models. However, big datasets are not always available, and they are costly to build. Therefore, balanced solutions for DNN model efficiency and training data size have caught the attention of researchers recently. Transfer learning techniques are the most common for this. In transfer learning, a DNN model is pre-trained on a large e…
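As a concrete illustration of the transfer-learning setup the abstract describes (pre-train on a large dataset, then adapt to a smaller target task), below is a minimal fine-tuning sketch in PyTorch. It is not the paper's method; the target task size (10 classes) and the DataLoader `small_target_loader` are hypothetical placeholders.

# Minimal transfer-learning sketch (illustrative, not the paper's method):
# reuse a backbone pre-trained on a large dataset, freeze it, and train
# only a new classification head on a small target dataset.
import torch
import torch.nn as nn
from torchvision import models

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)

# Freeze the pre-trained feature extractor.
for param in model.parameters():
    param.requires_grad = False

# Replace the classification head for the new, smaller task
# (10 classes here is an arbitrary assumption).
model.fc = nn.Linear(model.fc.in_features, 10)

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

for images, labels in small_target_loader:  # hypothetical DataLoader
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()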

Cited by 2 publications (1 citation statement)
References 44 publications
“…To mitigate catastrophic forgetting, the use of a memory buffer is studied [6,7] in the literature. This family of methods stores a subset of samples from previous tasks in a memory buffer and replays them interleaved with new samples.…”
Section: Introduction (mentioning)
Confidence: 99%
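The replay strategy described in the quoted statement can be sketched as follows. This is an illustrative reservoir-sampling buffer, not an implementation from the cited works [6,7]; all class and function names are assumptions.

# Sketch of rehearsal with a memory buffer: keep a subset of old-task
# samples and replay them interleaved with new-task samples. Reservoir
# sampling keeps the buffer an approximately uniform subset of all
# samples seen so far.
import random

import torch
import torch.nn as nn


class ReplayBuffer:
    """Fixed-size buffer filled by reservoir sampling."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.data = []   # list of (x, y) pairs from earlier tasks
        self.seen = 0    # total samples observed so far

    def add(self, x, y):
        self.seen += 1
        if len(self.data) < self.capacity:
            self.data.append((x, y))
        else:
            # Replace a stored sample with probability capacity / seen.
            idx = random.randrange(self.seen)
            if idx < self.capacity:
                self.data[idx] = (x, y)

    def sample(self, batch_size):
        batch = random.sample(self.data, min(batch_size, len(self.data)))
        xs, ys = zip(*batch)
        return torch.stack(xs), torch.stack(ys)


def train_task(model, loader, buffer, optimizer, criterion, replay_bs=32):
    """Train on one task while replaying buffered samples from earlier tasks."""
    for x_new, y_new in loader:
        optimizer.zero_grad()
        loss = criterion(model(x_new), y_new)
        if buffer.data:
            # Interleave a replayed batch of old-task samples.
            x_old, y_old = buffer.sample(replay_bs)
            loss = loss + criterion(model(x_old), y_old)
        loss.backward()
        optimizer.step()
        # Store individual samples from the new task for future replay.
        for x, y in zip(x_new, y_new):
            buffer.add(x, y)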