2021 17th International Conference on Mobility, Sensing and Networking (MSN)
DOI: 10.1109/msn53354.2021.00043

FedHe: Heterogeneous Models and Communication-Efficient Federated Learning

Abstract: Federated learning (FL) enables edge devices to cooperatively train a globally shared model while keeping the training data local and private. However, a common but impractical assumption in FL is that the participating edge devices possess the same resources and share an identical global model architecture. In this study, we propose a novel FL method called Federated Intermediate Layers Learning (FedIN), which supports heterogeneous models without using any public dataset. The training models …



Cited by 14 publications (9 citation statements)
References 28 publications
“…The client selection scheme is used to select the clients that can contribute more to enhance the global model, which results in a reduction in the communication rounds [147,185,191]. Using asynchronous communication can enhance the global model performance by allowing the aggregation of the received model without waiting for all clients [146,171,211]. Select model update is a technique that uploads the trained model that can help model coverage and ignore irrelevant updates [157,189].…”
Section: Discussion
confidence: 99%
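The client-selection scheme quoted above can be sketched in a few lines. This is a minimal, hypothetical illustration based only on the survey's description — the ranking criterion (a per-client "contribution score") and the function name `select_clients` are assumptions, not any specific cited method:

```python
def select_clients(scores, k):
    """Pick the k clients with the highest contribution scores.

    scores: list of per-client contribution estimates (e.g., recent
            local loss reduction) — the metric itself is an assumption.
    Returns the indices of the selected clients, highest score first.
    """
    return sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)[:k]

# With 4 candidate clients, only the 2 most promising upload this round,
# which is how selection reduces communication rounds overall.
scores = [0.12, 0.40, 0.05, 0.33]
print(select_clients(scores, 2))  # → [1, 3]
```

In practice the score would be refreshed each round, and the unselected clients simply skip the upload, trading a little per-round coverage for fewer total communication rounds.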
“…Quantization [142][143][144][145]148,152,156,158,168,169,174,176,183,184,188,[191][192][193]198,199] Sparsification [140,141,151,153,155,165,174,186,200,202,204] Client Selection [147,166,172,185,191,198,207] Asynchronous [146,171,190,203,211] Two-Level Aggregation [164,175,180,182,185] Select Model Updates [149,157,…”
Section: Techniques Studies Referenced
confidence: 99%
“…FedHe [33]: This method does not require a public dataset. It also uses the average logit information of each category as a loss to assist client model training.…”
Section: Methods
confidence: 99%
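The "average logit information of each category as a loss" described in this citation can be sketched as follows. This is a hedged sketch of the general idea only: the aggregation of per-class logits and a squared-error distance are assumptions for illustration — FedHe's exact aggregation and loss formulation are in the paper itself:

```python
import numpy as np

def average_logits_per_class(logits, labels, num_classes):
    """Average the logit vectors belonging to each class label.

    In a FedHe-style scheme, clients upload such per-class averages
    instead of model weights, so heterogeneous architectures can share
    knowledge without a public dataset.
    """
    avg = np.zeros((num_classes, logits.shape[1]))
    for c in range(num_classes):
        mask = labels == c
        if mask.any():
            avg[c] = logits[mask].mean(axis=0)
    return avg

def distill_loss(client_logits, labels, global_avg_logits):
    """Squared error between a client's logits and the shared per-class
    average logits for the matching labels (distance metric assumed)."""
    targets = global_avg_logits[labels]
    return float(((client_logits - targets) ** 2).mean())
```

Because only class-count-many logit vectors are exchanged rather than full model weights, this style of knowledge sharing is also communication-efficient, which matches the title's claim.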
“…To address this, the transfer of single per-class representations is discussed. Hin and Edith present FedHe [39], which, similar to previous approaches, allows for different model architectures per device. Contrary to FedMD or Cronus, FedHe does not use a public dataset for distillation.…”
Section: Mixture of Distillation and FedAvg
confidence: 99%