2023 · Preprint
DOI: 10.48550/arxiv.2301.05849

Survey of Knowledge Distillation in Federated Edge Learning

Abstract: The increasing demand for intelligent services and privacy protection of mobile and Internet of Things (IoT) devices motivates the wide application of Federated Edge Learning (FEL), in which devices collaboratively train on-device Machine Learning (ML) models without sharing their private data. Constrained by device hardware, diverse user behaviors, and network infrastructure, the algorithm design of FEL faces challenges related to resources, personalization, and network environments. Fortunately, Knowledge Distilla…

Cited by 2 publications (4 citation statements); references 35 publications (112 reference statements).
“…The clients then use their labeled and unlabeled data to train their respective student models, which are subsequently sent back to the server for aggregation. Last but not least, the study in [25] provides a review of the use of knowledge distillation in federated edge learning, which is a distributed learning approach that involves extending the FL paradigm to edge devices like smartphones and IoT devices. The review focuses on previous research on knowledge distillation in this context and highlights the potential benefits, challenges, and future directions of the proposed approaches.…”
Section: B Knowledge Distillation For Iot Traffic Classificationmentioning
confidence: 99%
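The statement above describes clients training local student models and sending them back to the server for aggregation. A minimal sketch of that server-side step, assuming a FedAvg-style weighted average of parameters (the function name, dict layout, and toy values are illustrative, not taken from the cited work):

```python
import numpy as np

def aggregate_students(client_weights, client_sizes):
    """Weighted average of client student-model parameters (FedAvg-style).

    client_weights: list of dicts mapping layer name -> np.ndarray
    client_sizes: number of local training samples per client
    """
    total = sum(client_sizes)
    agg = {}
    for name in client_weights[0]:
        # Each client's contribution is weighted by its share of the data.
        agg[name] = sum(
            (n / total) * w[name] for w, n in zip(client_weights, client_sizes)
        )
    return agg

# Two toy clients, each with a single "fc" layer.
clients = [{"fc": np.array([1.0, 2.0])}, {"fc": np.array([3.0, 4.0])}]
sizes = [1, 3]
global_model = aggregate_students(clients, sizes)
# 0.25 * [1, 2] + 0.75 * [3, 4] = [2.5, 3.5]
```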
“…As noted in [25], knowledge distillation has emerged as an important technique for addressing challenges in federated edge learning. [23], [29] leverage knowledge distillation as a communication protocol for exchanging model representations among devices and the edge server, enabling communication-efficient training over heterogeneous models.…”
Section: Knowledge Distillation In Federated Edge Learningmentioning
confidence: 99%
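The communication-protocol view above relies on the standard distillation loss: devices and the edge server exchange model outputs (logits) rather than full parameters, and each side matches the other's temperature-softened predictions. A minimal sketch of that loss, following Hinton et al.'s formulation; the helper names and toy logits are illustrative:

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-softened softmax over a 1-D array of logits."""
    z = np.asarray(z, dtype=float) / T
    z -= z.max()  # numerical stability
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, T=2.0):
    """KL(teacher || student) on temperature-T softened distributions,
    scaled by T^2 so gradients keep a comparable magnitude across T."""
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return T ** 2 * float(np.sum(p * (np.log(p) - np.log(q))))

teacher = [2.0, 0.5, 0.1]   # e.g. logits shared by the edge server
student = [1.5, 0.8, 0.2]   # on-device model's logits for the same input
loss = distillation_loss(student, teacher)
```

Exchanging logits of this form costs one vector per sample regardless of model size, which is why the protocol accommodates heterogeneous client architectures.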
“…In addition, we consider 4 different model architectures, where {A_C1, A_C2, A_C3} are for clients and A_S is for the server; the main configurations of the four adopted models are shown in TABLE 4. It is worth noting that the model on the server does not contain the foremost Conv+Batch+ReLU layers to fit the training requirements of [24], [25]. Moreover, both client-side model homogeneity and heterogeneity are considered in our experiments.…”
Section: Modelsmentioning
confidence: 99%