2023
DOI: 10.1109/jstsp.2022.3224597
FedBKD: Heterogenous Federated Learning via Bidirectional Knowledge Distillation for Modulation Classification in IoT-Edge System

Cited by 19 publications (12 citation statements)
References 40 publications
“…Recently, several works [23,24,33,34] have been proposed to implement data-free knowledge distillation for FL. FedGKD [34] prevents local model drift by guiding local model training through knowledge distillation between historical global and local models.…”
Section: Knowledge Distillation In Federated Learning
confidence: 99%
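The distillation-based guidance described in this statement can be illustrated with a minimal sketch of the standard temperature-scaled KD loss, where a local (student) model's outputs are pulled toward a global (teacher) model's softened outputs. The function names and example logits below are illustrative assumptions, not the FedGKD or FedBKD implementation:

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax; higher T softens the distribution.
    z = np.asarray(z, dtype=float) / T
    z = z - z.max()  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def kd_loss(student_logits, teacher_logits, T=2.0):
    # KL(teacher || student) on temperature-softened outputs,
    # scaled by T^2 as in the classic distillation formulation.
    p = softmax(teacher_logits, T)  # teacher, e.g. the global model
    q = softmax(student_logits, T)  # student, e.g. a local model
    return float(T * T * np.sum(p * (np.log(p) - np.log(q))))

# A local model whose predictions stay close to the global model
# incurs a smaller distillation penalty than one that has drifted.
teacher = [2.0, 0.5, -1.0]
aligned = [2.1, 0.4, -0.9]
drifted = [-1.0, 2.0, 0.5]
assert kd_loss(aligned, teacher) < kd_loss(drifted, teacher)
```

In a federated setting this term is typically added to the local cross-entropy objective, so that client updates are regularized toward the (historical) global model's behavior without sharing any data.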
“…Recent trends suggest that KD has great potential to apply to various learning processes in FEL as an important tool for knowledge transfer and collaborative model training in diversity-constrained mobile edge networks. Specifically, the technical characteristics of KD meet the core demands of FEL, and the roles it can play include but are not limited to compressing large-scale edge models for on-device deployment [25], transferring locally adaptive knowledge to on-device models for personalization [39,17,44], and helping establish novel FL frameworks that support heterogeneous devices [30,43,41]. Representative works that apply KD in FEL are summarized in Tables 1 and 2.…”
Section: Why Concern Knowledge Distillation In Federated Edge
confidence: 99%
“…Edge-end Collaboration: FD [14], FedGEMS [6]; Limited Computation: FedGKT [10], FedBKD [30]; Heterogeneous Computation: CMFD [34]; Decentralization …”
Section: Model Representation
confidence: 99%