2020
DOI: 10.48550/arxiv.2007.13518
Preprint

FedML: A Research Library and Benchmark for Federated Machine Learning

Chaoyang He,
Songze Li,
Jinhyun So
et al.

Abstract: Federated learning is a rapidly growing research field in the machine learning domain. Although considerable research efforts have been made, existing libraries cannot adequately support diverse algorithmic development (e.g., diverse topology and flexible message exchange), and inconsistent dataset and model usage in experiments make fair comparisons difficult. In this work, we introduce FedML, an open research library and benchmark that facilitates the development of new federated learning algorithms and fair…
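As a rough illustration of the kind of federated training loop a library like FedML orchestrates, the sketch below runs plain federated averaging (FedAvg) over a toy linear model in NumPy. The function names and data layout are illustrative assumptions for this page, not FedML's actual API.

import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=1):
    # one client's local gradient steps on a linear least-squares model (illustrative)
    w = weights.copy()
    for _ in range(epochs):
        preds = X @ w
        grad = X.T @ (preds - y) / len(y)
        w -= lr * grad
    return w

def fedavg_round(global_w, client_data):
    # each client trains locally; the server averages updates weighted by sample count
    updates, sizes = [], []
    for X, y in client_data:
        updates.append(local_update(global_w, X, y))
        sizes.append(len(y))
    return np.average(np.stack(updates), axis=0, weights=np.array(sizes, dtype=float))

# toy run: 4 clients, 5 features, 10 communication rounds
rng = np.random.default_rng(0)
clients = [(rng.normal(size=(32, 5)), rng.normal(size=32)) for _ in range(4)]
w = np.zeros(5)
for _ in range(10):
    w = fedavg_round(w, clients)

A real FL library additionally handles the communication topology, message serialization, and device scheduling that this single-process sketch folds into one loop.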

Cited by 107 publications (160 citation statements)
References 108 publications
“…In this section, we conduct extensive experiments to verify the efficacy of the proposed FedSpa. Our implementation of FedSpa is based on an open-source FL simulator FedML [27]. We fix the dense ratio of FedSpa (DST), FedSpa (RSM), and the final dense ratio of Fed-SubAvg both to 0.5 (i.e., 50% of parameters are pruned) in our main evaluation.…”
Section: Methods
confidence: 99%
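The quoted setup fixes a dense ratio of 0.5, i.e. 50% of parameters are pruned. A minimal sketch of producing such a mask, assuming simple magnitude-based pruning (an illustrative stand-in, not FedSpa's actual sparsification scheme):

import numpy as np

def magnitude_mask(w, dense_ratio=0.5):
    # keep the top `dense_ratio` fraction of entries by absolute value, zero the rest
    k = max(1, int(round(dense_ratio * w.size)))
    threshold = np.sort(np.abs(w).ravel())[-k]
    return (np.abs(w) >= threshold).astype(w.dtype)

w = np.random.default_rng(1).normal(size=(8, 8))
mask = magnitude_mask(w, dense_ratio=0.5)
sparse_w = w * mask
print(mask.mean())  # roughly 0.5 of the entries remain non-zero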
“…All our experiments are based on a non-IID data distribution among FL clients. We have used latent Dirichlet Distribution (LDA), which is a common data distribution in FL to generate non-IID data across clients [10], [39].…”
Section: Methods
confidence: 99%
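The quoted experiments use an LDA-style split, the common way to generate non-IID label distributions across FL clients: per-class sample proportions are drawn from a Dirichlet distribution, and smaller concentration values yield more skewed clients. A minimal sketch, assuming the standard Dirichlet(alpha) label partition with illustrative helper names (not the exact FedML utility):

import numpy as np

def dirichlet_partition(labels, num_clients=10, alpha=0.5, seed=0):
    # split sample indices so each class is spread across clients
    # according to proportions drawn from Dirichlet(alpha)
    rng = np.random.default_rng(seed)
    client_idx = [[] for _ in range(num_clients)]
    for c in np.unique(labels):
        idx = np.where(labels == c)[0]
        rng.shuffle(idx)
        proportions = rng.dirichlet(alpha * np.ones(num_clients))
        cuts = (np.cumsum(proportions)[:-1] * len(idx)).astype(int)
        for client, part in enumerate(np.split(idx, cuts)):
            client_idx[client].extend(part.tolist())
    return client_idx

# toy example: 1000 samples, 10 classes, 10 clients
labels = np.random.default_rng(2).integers(0, 10, size=1000)
parts = dirichlet_partition(labels, num_clients=10, alpha=0.5)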
“…As federated learning has raised lots of attention over the recent years, researchers also start to investigate building efficient libraries (He et al, 2020;Beutel et al, 2020) and systems (Bonawitz et al, 2019b;a;Hiessl et al, 2020). Several seminal work has also been done on designing better communication protocol (Konečnỳ et al, 2016), optimization algorithms (Li et al, 2018), and improving model robustness (Konstantinidis & Ramamoorthy, 2021).…”
Section: Related Work
confidence: 99%