2017
DOI: 10.48550/arxiv.1705.10467
Preprint

Federated Multi-Task Learning

Abstract: Federated learning poses new statistical and systems challenges in training machine learning models over distributed networks of devices. In this work, we show that multi-task learning is naturally suited to handle the statistical challenges of this setting, and propose a novel systems-aware optimization method, MOCHA, that is robust to practical systems issues. Our method and theory for the first time consider issues of high communication cost, stragglers, and fault tolerance for distributed multi-task learning. The resulting method achieves significant speedups compared to alternatives in the federated setting, as we demonstrate through simulations on real-world federated datasets.
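
For context, the abstract casts federated learning as multi-task learning: each device learns its own model, and the models are coupled through a task-relationship term. A common way to write such an objective, in standard MTL notation (the regularizer choice below is one conventional option, not quoted from this record):

$$\min_{\mathbf{W},\, \boldsymbol{\Omega}} \;\; \sum_{t=1}^{m} \sum_{i=1}^{n_t} \ell_t\!\left(\mathbf{w}_t^\top \mathbf{x}_t^i,\; y_t^i\right) \;+\; \mathcal{R}(\mathbf{W}, \boldsymbol{\Omega})$$

where $\mathbf{W} = [\mathbf{w}_1, \dots, \mathbf{w}_m]$ stacks one model per device $t$ with $n_t$ local samples, and $\mathcal{R}$ couples the models through a relationship matrix $\boldsymbol{\Omega}$, e.g. $\mathcal{R}(\mathbf{W}, \boldsymbol{\Omega}) = \lambda_1 \operatorname{tr}\!\left(\mathbf{W} \boldsymbol{\Omega} \mathbf{W}^\top\right) + \lambda_2 \lVert \mathbf{W} \rVert_F^2$.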

Cited by 53 publications (74 citation statements)
References 30 publications
“…Heterogeneous neural architecture is one way to personalize the model in FL. For personalization, the primal-dual framework [29], clustering [26], fine-tuning with transfer learning [37], meta-learning [4], and regularization-based methods [7,16] are among the popular methods explored in the FL literature. Although these techniques achieve improved personalized performance, all of them use a pre-defined architecture for each client.…”
Section: Related Work (citation type: mentioning; confidence: 99%)
“…Despite fruitful research results on FL regarding advanced learning algorithms [7,8], data heterogeneity [9,10], personalization [11][12][13][14], fairness [15,16], system design [17][18][19], and privacy-preserving frameworks [20][21][22], there are only a few focusing on hyper-parameter tuning for FL [23,4,[24][25][26]3]. In [23], Dai et al.…”
Section: Related Work (citation type: mentioning; confidence: 99%)
“…More recently, researchers have tried to remove the dependence on a global model for personalization by following a multi-task learning philosophy (Smith et al., 2017). Yet, such models can only handle simple linear formulations.…”
Section: Related Work (citation type: mentioning; confidence: 99%)
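
To make the "simple linear formulations" remark in the last excerpt concrete, below is a minimal, self-contained sketch (not code from the paper or any cited work; the toy data and all names are hypothetical) of per-client linear models trained jointly under a task-relationship penalty of the form lam * tr(W Omega W^T):

```python
import numpy as np

# Hypothetical toy setup: m clients, each owning its own linear model w_t.
# Columns of W stack the per-client weights; Omega (fixed here for brevity)
# couples the clients, as in common multi-task formulations.
rng = np.random.default_rng(0)
m, d, n = 4, 5, 20                              # clients, features, samples/client
X = [rng.normal(size=(n, d)) for _ in range(m)]
y = [rng.normal(size=n) for _ in range(m)]

W = np.zeros((d, m))                            # column t = model of client t
Omega = np.eye(m) - np.full((m, m), 1.0 / m)    # penalizes deviation from the mean model
lam, lr = 0.1, 0.01

for _ in range(200):
    grad = np.zeros_like(W)
    for t in range(m):                          # per-client squared-loss gradient
        resid = X[t] @ W[:, t] - y[t]
        grad[:, t] = X[t].T @ resid / n
    grad += 2 * lam * W @ Omega                 # gradient of lam * tr(W @ Omega @ W.T)
    W -= lr * grad                              # joint gradient step
```

With this choice of Omega the coupling term equals lam times the summed squared deviation of each w_t from the average model, so clients are pulled toward one another; the excerpt's point is that once the per-client model is nonlinear (e.g., a neural network), this clean structure no longer applies directly.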