2022
DOI: 10.48550/arxiv.2211.04742
Preprint
Knowledge Distillation for Federated Learning: a Practical Guide

Abstract: Federated Learning (FL) enables the training of Deep Learning models without centrally collecting possibly sensitive raw data. This paves the way for stronger privacy guarantees when building predictive models. The most widely used algorithms for FL are parameter-averaging schemes (e.g., Federated Averaging) that, however, have well-known limits: (i) clients must implement the same model architecture; (ii) transmitting model weights and model updates implies high communication cost, which scales up with the num…
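The parameter-averaging scheme the abstract mentions (Federated Averaging) can be sketched as a sample-size-weighted mean of client model parameters. The function name and the toy two-client setup below are illustrative assumptions, not code from the paper:

```python
import numpy as np

def fedavg(client_weights, client_sizes):
    """Weighted parameter averaging in the style of FedAvg.

    client_weights: one list of numpy arrays per client; all clients
    must share the same model architecture (limit (i) in the abstract).
    client_sizes: number of local training samples per client, used as
    the averaging weight.
    """
    total = sum(client_sizes)
    n_layers = len(client_weights[0])
    averaged = []
    for layer in range(n_layers):
        acc = np.zeros_like(client_weights[0][layer], dtype=float)
        for weights, n in zip(client_weights, client_sizes):
            acc += (n / total) * weights[layer]
        averaged.append(acc)
    return averaged

# Toy example: two clients, one 2-parameter "layer" each.
clients = [[np.array([1.0, 3.0])], [np.array([3.0, 5.0])]]
sizes = [1, 3]
print(fedavg(clients, sizes)[0])  # pulled toward the larger client
```

Note that every client must ship its full weight vector to the server each round, which is the communication-cost limit (ii) the abstract points to.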

Cited by 3 publications (2 citation statements)
References 42 publications
“…Knowledge Distillation (KD) was introduced in [84] for model compression: it allows knowledge to be transferred from a larger network (the teacher) to a smaller one (the student). It has been widely used in continual learning and has recently been increasingly employed in FL algorithms [162] to reduce catastrophic forgetting, tackle data heterogeneity, and enable model heterogeneity.…”
Section: Methods
confidence: 99%
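The teacher-to-student transfer this statement describes is commonly implemented with a temperature-softened distillation loss in the style of Hinton et al. The numpy sketch below is illustrative only; the function names and temperature value are assumptions, not code from the cited works:

```python
import numpy as np

def softmax(logits, T=1.0):
    """Numerically stable softmax with temperature T."""
    z = np.asarray(logits, dtype=float) / T
    e = np.exp(z - z.max())
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, T=2.0):
    """KL divergence between temperature-softened teacher targets and
    student predictions, scaled by T**2 so gradients keep a comparable
    magnitude across temperatures."""
    p = softmax(teacher_logits, T)  # soft targets from the teacher
    q = softmax(student_logits, T)  # student predictions
    return (T ** 2) * np.sum(p * (np.log(p) - np.log(q)))

# Identical logits give zero loss; diverging logits give a positive loss.
print(distillation_loss([2.0, 1.0, 0.1], [2.0, 1.0, 0.1]))
print(distillation_loss([0.1, 1.0, 2.0], [2.0, 1.0, 0.1]) > 0)
```

In KD-based FL algorithms, exchanging such soft outputs (rather than full weight vectors) is what relaxes the shared-architecture and communication-cost limits noted in the abstract.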
“…To the best of our knowledge, this paper is the first work to investigate the application of knowledge distillation in federated edge learning. Different from existing surveys [33,40,21,26], we use the challenges faced by FEL as the organizing thread, introducing existing FEL approaches based on diverse forms of KD techniques and providing guidance for both future research directions and real deployment. Specifically, the remainder of this paper is organized as follows.…”
Section: Introduction
confidence: 99%