2023
DOI: 10.1360/ssi-2021-0190

Adaptive federated learning algorithm based on evolution strategies

Abstract: Federated learning can be viewed as a special form of distributed machine learning and a direction for its future development [4]; both rely on decentralized datasets and distributed model training. Compared with distributed machine learning systems [5], federated learning aims to protect participants' data privacy and relaxes the constraints on the type and distribution of private data. Research on federated learning began with Federated SGD (FedSGD) [6], which applies stochastic gradient descent (SGD) directly to the federated setting: each client computes gradients once per round, and those gradients are used to update the global model on the server. Inspired by FedSGD, Federated Averaging (FedAvg) [7] allows each client to perform multiple gradient-descent steps before uploading, and sends the trained model parameters rather than the gradients to the server. This significantly improves communication efficiency, and FedAvg is regarded as the representative federated learning algorithm.
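The FedAvg scheme described above can be sketched in a few lines. This is a minimal illustration, not the paper's algorithm: it assumes a linear model with squared loss and full-batch gradient steps on each client, and the names `local_update` and `fedavg_round` are made up for this example. The key FedAvg ingredients are visible: each client runs several local gradient steps, and the server averages the returned parameters, weighted by client dataset size.

```python
import numpy as np

def local_update(weights, data, targets, lr=0.1, epochs=5):
    """Client side: several gradient-descent steps on local data
    (linear model, mean squared error) starting from the global weights."""
    w = weights.copy()
    for _ in range(epochs):
        grad = 2 * data.T @ (data @ w - targets) / len(targets)
        w -= lr * grad
    return w

def fedavg_round(global_w, clients):
    """Server side: one FedAvg round. Each client trains locally and
    uploads model parameters (not gradients); the server averages them,
    weighted by the number of samples on each client."""
    updates, sizes = [], []
    for X, y in clients:
        updates.append(local_update(global_w, X, y))
        sizes.append(len(y))
    return np.average(updates, axis=0, weights=np.array(sizes, dtype=float))
```

With several rounds on clients whose data come from the same underlying model, the averaged parameters converge toward that model; under FedSGD, by contrast, each round would upload a single gradient instead of locally trained parameters, requiring many more communication rounds for the same progress.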

Cited by 2 publications (1 citation statement)
References 22 publications (18 reference statements)
“…Clients fine-tune their models with the global model to resolve client heterogeneity issues. Gong [45] introduced an adaptive FL algorithm based on an evolutionary strategy. Each client is treated as an individual in the evolutionary strategy, adapting to generate different personalized sub-models through global optimization.…”
Section: B. Federated Optimization for Communication Cost and Data Het…
confidence: 99%