Federated Optimization in Heterogeneous Networks

Preprint, 2018. DOI: 10.48550/arxiv.1812.06127

Abstract: Federated Learning is a distributed learning paradigm with two key challenges that differentiate it from traditional distributed optimization: (1) significant variability in terms of the systems characteristics on each device in the network (systems heterogeneity), and (2) non-identically distributed data across the network (statistical heterogeneity). In this work, we introduce a framework, FedProx, to tackle heterogeneity in federated networks. FedProx can be viewed as a generalization and re-parametrization…
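At each round, FedProx has every selected client approximately minimize its local loss plus a proximal term that penalizes deviation from the current global model. A minimal sketch in Python/NumPy, assuming a least-squares local loss purely for illustration (the client data `X`, `y`, the learning rate, and the step count are placeholders, not values from the paper):

```python
import numpy as np

def fedprox_local_update(w_global, X, y, mu=0.1, lr=0.05, steps=100):
    """Approximately solve the FedProx local subproblem for one client:
        min_w  F_k(w) + (mu/2) * ||w - w_global||^2
    where F_k is here a least-squares loss on the client's local data
    (an illustrative choice; the framework allows any local objective)."""
    w = w_global.copy()
    n = len(y)
    for _ in range(steps):
        grad_loss = X.T @ (X @ w - y) / n     # gradient of the local loss F_k
        grad_prox = mu * (w - w_global)       # proximal term pulls w toward the global model
        w -= lr * (grad_loss + grad_prox)
    return w
```

Larger values of `mu` keep the local solution closer to the global model, which is how FedProx limits client drift under statistical heterogeneity; `mu=0` recovers plain local minimization as in FedAvg.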

Cited by 280 publications (498 citation statements). References 17 publications.
“…4, forcing the same model to process all images may be a suboptimal approach. Using advanced strategies like FedProx [17] to combine the updates coming from different nodes does improve the performance compared to FedAvg, due to FedProx's ability to deal with non-i.i.d. data, but does not close the performance gap.…”
Section: Performance Evaluation and Discussion
confidence: 99%
“…Dealing with low-quality, non-i.i.d. datasets is the focus of several FL studies, including [5], [15]-[17]; the main strategy they use is assigning learning nodes weights that reflect their data quality and, thus, the contribution they can give to the learning process. FPL achieves the same objective through the junction layer: the values of the parameters therein (hence, the importance to assign to different data sources) are found as part of the DNN training process.…”
Section: The Flexible Parallel Learning Paradigm
confidence: 99%
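The weighting strategy described in the snippet above, where each node's contribution is scored by its data quality, can be sketched as a server-side aggregation step. The quality scores below are hypothetical placeholders, not the scheme of any specific cited paper:

```python
import numpy as np

def weighted_aggregate(client_updates, quality_weights):
    """Server-side aggregation in which each client's model update is
    weighted by a (hypothetical) data-quality score, so that clients
    with cleaner or better-distributed data contribute more."""
    w = np.asarray(quality_weights, dtype=float)
    w = w / w.sum()  # normalize so the result is a convex combination
    return sum(wi * ui for wi, ui in zip(w, client_updates))
```

Plain FedAvg corresponds to setting each weight proportional to the client's sample count; quality-based schemes replace or modulate that proportionality.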
“…As federated learning has attracted considerable attention over recent years, researchers have also begun to investigate building efficient libraries (He et al., 2020; Beutel et al., 2020) and systems (Bonawitz et al., 2019b;a; Hiessl et al., 2020). Seminal work has also been done on designing better communication protocols (Konečnỳ et al., 2016), optimization algorithms (Li et al., 2018), and improving model robustness (Konstantinidis & Ramamoorthy, 2021). However, most existing research targets horizontal federated learning systems, and vertical federated learning systems have yet to be tailored for industrial production.…”
Section: Related Work
confidence: 99%
“…The policies were tested on the Synthetic(1,1) [15], [16] and Synthetic-IID [15], [16] datasets with logistic regression as the classification model. Synthetic(α, β) [15] is a dataset of labelled feature vectors that can be non-identically or identically distributed across the clients. The variation in the underlying local models is controlled by α, while β introduces non-i.i.d.…
Section: A. Synthetic Datasets
confidence: 99%
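The Synthetic(α, β) construction referenced above can be sketched roughly as follows, following the generative recipe from the FedProx paper: each client draws its own softmax model around a client-specific mean governed by α, and its own feature distribution governed by β. The exact dimensions and the covariance decay rate here are illustrative assumptions:

```python
import numpy as np

def synthetic_client(alpha, beta, n_samples=100, dim=10, n_classes=5, rng=None):
    """Generate one client's data for a Synthetic(alpha, beta)-style dataset:
    alpha controls how much the local model (W_k, b_k) varies across clients,
    beta how much the local feature distribution varies. alpha = beta = 0
    yields (near-)identically distributed clients."""
    rng = rng or np.random.default_rng()
    u_k = rng.normal(0.0, np.sqrt(alpha))            # client-specific model mean
    W = rng.normal(u_k, 1.0, size=(n_classes, dim))  # local softmax weights
    b = rng.normal(u_k, 1.0, size=n_classes)         # local softmax bias
    B_k = rng.normal(0.0, np.sqrt(beta))             # client-specific feature mean
    v = rng.normal(B_k, 1.0, size=dim)
    cov = np.diag([j ** -1.2 for j in range(1, dim + 1)])  # decaying covariance (assumed)
    X = rng.multivariate_normal(v, cov, size=n_samples)
    y = (X @ W.T + b).argmax(axis=1)                 # labels from the client's own model
    return X, y
```

Because labels come from each client's own (W_k, b_k), larger α makes clients disagree on the labeling function itself, which is the harder form of statistical heterogeneity these policies are evaluated against.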