DOI: 10.1109/tnnls.2022.3214573
Deep Class-Incremental Learning From Decentralized Data

Cited by 7 publications (3 citation statements)
References 47 publications
“…Rehearsal-based methods tackle catastrophic forgetting either by keeping a small set of old training examples in memory (Tao et al 2020a; Dong et al 2021; Liu et al 2022; Yang et al 2022a,b) or by using synthesized data produced by generative models (Shin et al 2017). By using the rehearsal buffer for knowledge distillation and regularization, rehearsal-based methods have achieved state-of-the-art results on various benchmarks (Douillard et al 2022; Joseph et al 2022; Zhang et al 2022). However, the performance of rehearsal-based methods generally deteriorates with a smaller buffer size (Mai et al 2022).…”
Section: Related Work
confidence: 99%
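
The statement above describes replay from a small memory of old training examples. Below is a minimal sketch of such a rehearsal buffer in Python/PyTorch, assuming reservoir sampling; the class and method names are illustrative and not any cited paper's actual implementation.

    # Minimal rehearsal (replay) buffer sketch; reservoir sampling and
    # naming are assumptions for illustration, not a cited method.
    import random
    from typing import List, Tuple

    import torch


    class RehearsalBuffer:
        """Keeps a bounded memory of old (input, label) pairs for replay."""

        def __init__(self, capacity: int):
            self.capacity = capacity
            self.data: List[Tuple[torch.Tensor, int]] = []
            self.seen = 0

        def add(self, x: torch.Tensor, y: int) -> None:
            # Reservoir sampling keeps an approximately uniform sample
            # over all examples seen so far.
            self.seen += 1
            if len(self.data) < self.capacity:
                self.data.append((x, y))
            else:
                idx = random.randrange(self.seen)
                if idx < self.capacity:
                    self.data[idx] = (x, y)

        def sample(self, batch_size: int) -> Tuple[torch.Tensor, torch.Tensor]:
            # Assumes all stored inputs share the same shape.
            batch = random.sample(self.data, min(batch_size, len(self.data)))
            xs, ys = zip(*batch)
            return torch.stack(xs), torch.tensor(ys)

In an incremental training loop, a batch sampled from this buffer would typically be mixed with current-task data for knowledge distillation or regularization; as the excerpt notes, performance tends to degrade as the buffer capacity shrinks.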
“…Therefore, simply performing element-wise averaging of model parameters, as most FL algorithms adopt, would not produce an ideal global model to serve all clients 8. In addition, some related schemes have been used in federated learning, e.g., data poisoning attacks 9 and federated class-incremental learning 10, 11, which improve on the original federated averaging algorithm. But they mainly address the problem of model training in specific federated learning scenarios.…”
Section: Introduction
confidence: 99%
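
The statement above refers to element-wise averaging of model parameters as done by the original federated averaging algorithm. The sketch below illustrates that aggregation step in Python/PyTorch under stated assumptions: client updates arrive as state dicts and are weighted by hypothetical per-client sample counts. It is a generic FedAvg-style sketch, not any of the schemes cited in the excerpt.

    # Element-wise weighted averaging of client model parameters
    # (FedAvg-style sketch); weighting by sample counts is an assumption.
    from typing import Dict, List

    import torch


    def average_state_dicts(
        client_states: List[Dict[str, torch.Tensor]],
        client_sizes: List[int],
    ) -> Dict[str, torch.Tensor]:
        """Return the weighted element-wise average of client parameters."""
        total = float(sum(client_sizes))
        global_state: Dict[str, torch.Tensor] = {}
        for name in client_states[0]:
            global_state[name] = sum(
                (n / total) * state[name].float()
                for state, n in zip(client_states, client_sizes)
            )
        return global_state

The excerpt's point is that this simple averaging breaks down when clients hold non-identical, evolving class distributions, which is the setting the reviewed paper targets.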
“…In recent years, deep learning has achieved remarkable success in a wide range of applications, including image classification [1], [2], speech recognition [3], [4], and machine translation [5], [6]. These high-capacity models benefit from capturing complex patterns of the underlying data distribution.…”
Section: Introduction
confidence: 99%