2021 IEEE/CVF International Conference on Computer Vision (ICCV)
DOI: 10.1109/iccv48922.2021.01480
Ensemble Attention Distillation for Privacy-Preserving Federated Learning

Cited by 73 publications (37 citation statements). References 30 publications.
“…In addition, FedDG [13] enhances the generalization ability of the FL framework on unseen datasets via Fourier transform-based image synthesis and episodic learning strategies. Furthermore, [40] proposed a distillation-based FL method that avoids sharing model parameters, which further enhances data safety. Although these methods are effective in many medical imaging scenarios, they have not considered the weighting strategies for global aggregation and local training, which are crucial for FL MS segmentation.…”
Section: B. Federated Learning (mentioning, confidence: 99%)
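The distillation-based idea cited above, in which clients share model outputs rather than parameters, can be illustrated with a minimal sketch. Everything here is hypothetical: the clients are stand-in logit arrays on a shared batch, and the server-side student is a plain softmax regression, not the attention-distillation model of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def xent(p, q):
    # cross-entropy of predictions p against soft targets q
    return -np.mean((q * np.log(p + 1e-9)).sum(axis=1))

# Hypothetical setup: 3 clients expose only logits on a shared batch;
# raw data and model parameters never leave the clients.
n_clients, batch, n_classes = 3, 8, 5
client_logits = [rng.normal(size=(batch, n_classes)) for _ in range(n_clients)]

# Server-side ensemble target: average of the clients' soft predictions.
ensemble_target = np.mean([softmax(l) for l in client_logits], axis=0)

# Distill into a toy linear student (illustrative, not the paper's model).
X = rng.normal(size=(batch, 4))     # shared inputs behind the logits
W = np.zeros((4, n_classes))        # student parameters
loss_before = xent(softmax(X @ W), ensemble_target)
for _ in range(200):                # gradient descent on the distillation loss
    probs = softmax(X @ W)
    W -= 0.5 * X.T @ (probs - ensemble_target) / batch
loss_after = xent(softmax(X @ W), ensemble_target)
print(loss_before, loss_after)
```

The privacy argument is that only the ensemble of predictions crosses the network; the distillation loss then transfers that ensemble's knowledge into a single server model.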
“…This simple paradigm suffers from performance degradation when there exists data heterogeneity [20,25]. Numerous studies have been conducted for label space heterogeneity, i.e., class distributions are imbalanced across different clients, by regularizing local update with proximal term [26], personalizing client models [2,8,37,27], utilizing shared local data [44,30,10], introducing additional proxy datasets [24,29,11], or performing data-free knowledge distillation [32] in the input space [13,42,43] or the feature space [15,48]. However, there are only limited studies addressing the heterogeneity in feature space, i.e., non-IID features.…”
Section: Related Work (mentioning, confidence: 99%)
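The proximal-term regularization of local updates mentioned above (a FedProx-style objective) can be sketched as follows. The model, data, and hyperparameters are all illustrative, with a linear least-squares loss standing in for the real local task loss.

```python
import numpy as np

def local_update(w_global, X, y, mu=0.1, lr=0.05, steps=100):
    """Gradient descent on a FedProx-style local objective:
    task loss + (mu/2) * ||w - w_global||^2.
    The proximal term keeps the local model near the global one
    when client data is heterogeneous. Names are illustrative."""
    w = w_global.copy()
    for _ in range(steps):
        task_grad = X.T @ (X @ w - y) / len(y)   # least-squares task loss
        prox_grad = mu * (w - w_global)          # proximal regularizer
        w -= lr * (task_grad + prox_grad)
    return w

rng = np.random.default_rng(1)
X = rng.normal(size=(32, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=32)
w_global = np.zeros(3)

w_small_mu = local_update(w_global, X, y, mu=0.0)   # plain local SGD
w_large_mu = local_update(w_global, X, y, mu=10.0)  # strongly regularized
# larger mu keeps the local solution closer to w_global
print(np.linalg.norm(w_small_mu - w_global), np.linalg.norm(w_large_mu - w_global))
```

With `mu=0` this reduces to an ordinary local update; increasing `mu` trades local fit for proximity to the global model, which is the mechanism [26] uses against client drift.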
“…Numerous research papers have addressed data heterogeneity (i.e., non-IID data among local clients) in FL [1,7,13,23,31,39,41], for example by improving client sampling fairness [27], using adaptive optimization [9,28,37,38], and correcting the local update [16,20,33]. Federated learning has also been extended to real-life applications [8,24].…”
Section: Federated Learning (mentioning, confidence: 99%)
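The baseline these citation statements build on is FedAvg-style global aggregation, where each client's parameters are weighted by its local sample count; the first quote argues that such fixed weighting is insufficient for FL MS segmentation. A minimal sketch (the helper name `fedavg_aggregate` is invented):

```python
import numpy as np

def fedavg_aggregate(client_weights, client_sizes):
    """FedAvg-style global aggregation (sketch): average client
    parameter vectors weighted by local dataset size."""
    sizes = np.asarray(client_sizes, dtype=float)
    coeffs = sizes / sizes.sum()                 # normalized weights
    return sum(c * w for c, w in zip(coeffs, client_weights))

# Three toy clients with 2-parameter models and different data volumes.
clients = [np.array([1.0, 1.0]), np.array([3.0, -1.0]), np.array([0.0, 2.0])]
sizes = [10, 30, 60]
print(fedavg_aggregate(clients, sizes))  # -> [1. 1.]
```

Adaptive-weighting methods replace the fixed `coeffs` with learned or performance-driven coefficients, which is the gap the quoted works point to.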