2022
DOI: 10.1016/j.neucom.2021.08.146

A two-phase half-async method for heterogeneity-aware federated learning

Cited by 4 publications (2 citation statements)
References 9 publications
“…Some techniques can be implemented on either the client side or the server side. A personalized approach, for instance, can be implemented on the server side by keeping a record for each client so that a personalized model can be provided [83], or on the client side, where each client keeps a personalized trained model locally and shares a general model [87]. An adaptive approach can also be performed on the server side: the work in [48] shows that a fixed batch size can degrade model performance because data distribution and size differ between clients, and it therefore proposes a batch-adaptation technique that determines a suitable batch size for each client; [99] proposes an adaptive local-epoch technique that avoids overfitting by decreasing the number of local epochs after a certain iteration, based on the global model's performance. The adaptive approach can also be implemented on the client side, as in [49,95], where clients adapt their learning rate based on the received global model.…”
Section: Discussion
confidence: 99%
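
To make the adaptive strategies described above concrete, here is a minimal Python sketch. The scaling rule, thresholds, and function names are illustrative assumptions of our own, not the algorithms of [48] or [99]:

```python
# Illustrative sketch (not the cited papers' exact methods) of two
# server-side adaptive strategies: a per-client batch size scaled to
# the client's local dataset size, and a local-epoch count that is
# decayed when the global model's accuracy stops improving.
# All heuristics and constants here are assumptions for illustration.

def adaptive_batch_size(num_samples: int, base_batch: int = 32,
                        reference_size: int = 1000) -> int:
    """Scale the batch size with the client's dataset size, clipped to [8, 256]."""
    scaled = int(base_batch * num_samples / reference_size)
    return max(8, min(256, scaled))

def adaptive_local_epochs(round_idx: int, current_acc: float,
                          previous_acc: float, epochs: int) -> int:
    """Decrease local epochs once global accuracy plateaus, reducing the
    risk of clients overfitting their local data."""
    if round_idx > 0 and current_acc - previous_acc < 1e-3:
        return max(1, epochs - 1)
    return epochs

# Example: a client with 5,000 samples trains with larger batches, and
# local epochs shrink from 5 toward 1 as the global model converges.
print(adaptive_batch_size(5000))                       # -> 160
print(adaptive_local_epochs(3, 0.8501, 0.8500, 5))     # -> 4
```
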
“…Based on federated learning: federated learning, a distributed machine learning paradigm, allows models to be trained on scattered data across large-scale edge or mobile devices without collecting the raw data [30], which effectively mitigates unnecessary bandwidth consumption and enhances data privacy and legitimacy [31]. Palihawadana et al. [28] performed local clustering of clients with similar gradients and then conducted further global aggregation.…”
Section: Intelligent Optimization Model Of Cross-border E-commerce Op...
confidence: 99%
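
A minimal sketch of the cluster-then-aggregate idea attributed to Palihawadana et al. [28] follows, assuming cosine similarity on client update vectors and a greedy threshold-based grouping; both choices are illustrative assumptions, not the paper's exact method:

```python
# Illustrative sketch: group clients whose update vectors point in
# similar directions (cosine similarity), average within each cluster,
# then average the cluster models globally. Threshold tau and the
# greedy grouping are assumptions for illustration.
import numpy as np

def cosine(u: np.ndarray, v: np.ndarray) -> float:
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-12))

def cluster_updates(updates: list[np.ndarray], tau: float = 0.8) -> list[list[int]]:
    """Greedily assign each client to the first cluster whose centroid is
    cosine-similar above tau; otherwise start a new cluster."""
    clusters: list[list[int]] = []
    for i, u in enumerate(updates):
        for members in clusters:
            centroid = np.mean([updates[j] for j in members], axis=0)
            if cosine(u, centroid) >= tau:
                members.append(i)
                break
        else:
            clusters.append([i])
    return clusters

def aggregate(updates: list[np.ndarray], clusters: list[list[int]]) -> np.ndarray:
    """Average within each cluster first, then average cluster means globally."""
    cluster_means = [np.mean([updates[j] for j in c], axis=0) for c in clusters]
    return np.mean(cluster_means, axis=0)

# Example with toy 3-dimensional "gradients" from four clients.
ups = [np.array([1.0, 0.0, 0.0]), np.array([0.9, 0.1, 0.0]),
       np.array([0.0, 1.0, 0.0]), np.array([0.0, 0.9, 0.2])]
clusters = cluster_updates(ups)
print(clusters)                  # -> [[0, 1], [2, 3]]
print(aggregate(ups, clusters))  # blended global update
```
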