The 5th International Conference on Advances in Artificial Intelligence (ICAAI), 2021
DOI: 10.1145/3505711.3505717
AdaFed: Performance-based Adaptive Federated Learning

Abstract: Federated Learning is a distributed and privacy-preserving machine learning technique that allows local clients to learn a model without sharing their own data by coordinating with a global server. In this work, we present the Adaptive Federated Learning (AdaFed) algorithm, which aims at improving the training performance of deep neural networks in Federated Learning settings by: (i) dynamically weighting the local models in the model averaging procedure; and (ii) adapting the loss function used by the federati…
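The abstract's item (i), performance-based weighting in the averaging step, can be sketched as follows. This is a minimal illustrative sketch, not the paper's exact formulation: the function name `adaptive_aggregate`, the softmax-over-losses rule, and the `temperature` parameter are all assumptions standing in for AdaFed's heuristic, which the abstract only describes as dynamic and performance-based.

```python
# Hypothetical sketch of performance-based weighted averaging in the
# spirit of AdaFed: clients whose local models perform worse (higher
# loss) receive larger aggregation weights, so the global model pays
# more attention to them. The weighting rule below is an assumption.
import numpy as np

def adaptive_aggregate(client_models, client_losses, temperature=1.0):
    """Average client parameter vectors with loss-derived weights."""
    losses = np.asarray(client_losses, dtype=float)
    # Softmax over (scaled) losses: higher local loss -> larger weight.
    scaled = losses / temperature
    weights = np.exp(scaled - scaled.max())   # subtract max for stability
    weights /= weights.sum()                  # weights sum to 1
    stacked = np.stack(client_models)         # shape: (n_clients, n_params)
    return weights @ stacked                  # convex combination of models

# Usage: three clients, each with a toy 2-parameter model.
models = [np.array([1.0, 0.0]), np.array([0.0, 1.0]), np.array([0.5, 0.5])]
losses = [0.2, 0.9, 0.4]
global_model = adaptive_aggregate(models, losses)
```

With equal losses this reduces to plain FedAvg-style uniform averaging, which matches the observation (quoted further below) that AdaFed behaves like FedAvg on balanced IID data.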

Cited by 6 publications (5 citation statements) · References 31 publications
“…In addition to FL solutions developed ad hoc for the particular application under consideration, such as the previous ones, theoretical solutions aimed at enhancing FedAvg have been proposed recently. Among them, FedProx [85] allows devices whose optimization characteristics differ strongly from the others to be included in the optimization process; AdaFed [86] replaces the client averaging mechanism with a dynamic, adaptive heuristic weighting mechanism based on client performance; FOCUS [87] replaces the averaging mechanism with a credibility mechanism based on Shapley values [88]; DecFedAvg [89] and CFA-GE [90] represent the first attempts to decentralize FedAvg by allowing each client to communicate with its neighbors; and FedLCon [91] is the first consensus-based FL algorithm.…”
Section: Federated Learning
confidence: 99%
“…
    end for
19: end for
20: set ŵ_i(t) = w_i(t − 1)
21: return ŵ_i(t) to the server
We report the pseudo-code for AdaFed (see Algorithm 2) and refer the reader to [2] for a more detailed discussion of the algorithm.…”
Section: B. Background On Federated Learning
confidence: 99%
“…Figures 4 and 5 show that the two proposed algorithms exhibit similar performance across all the communication rounds, with AdaFed converging slightly faster to the final value. We remark that, in federations with balanced and IID data, AdaFed is expected to perform very similarly to FedAvg, as its dynamic weight update has a greater effect on uneven data distributions [2]. For benchmarking purposes, Figure 4 also includes a dashed line representing the performance attained after 60 epochs by a single, centralized server that trains the same neural network on the entirety of the data.…”
Section: Simulation 1 - Original Data
confidence: 99%
“…With the rapid spread of privatized smart devices [1][2][3][4] , huge amounts of personal privatized data are being generated and stored. In order to protect the clients' privacy, Federated Learning (FL) [5][6][7] is proposed, which can train a global model on the premise of protecting clients' privacy. Recently, FL has been Jingyi He, Biyao Gong, Jiadi Yang, Hai Wang, Pengfei Xu, and Tianzhang Xing are with the School of Information Science and Technology, Northwest University, Xi'an 710100, China.…”
Section: Introduction
confidence: 99%