2021
DOI: 10.1109/tnnls.2020.3015958
Clustered Federated Learning: Model-Agnostic Distributed Multitask Optimization Under Privacy Constraints

Abstract: Federated learning (FL) is currently the most widely adopted framework for collaborative training of (deep) machine learning models under privacy constraints. Despite its popularity, it has been observed that FL yields suboptimal results if the local clients' data distributions diverge. To address this issue, we present clustered FL (CFL), a novel federated multitask learning (FMTL) framework, which exploits geometric properties of the FL loss surface to group the client population into clusters with jointly tr…

Cited by 487 publications (285 citation statements)
References 18 publications
“…Federated learning can address statistical and system heterogeneity issues since models are trained locally [26]. However, challenges still exist in dealing with non-IID data. Many researchers have worked on training data clustering [27], multi-stage local training [28], and multi-task learning [26]. Also, some works [29], [30] focus on incentive mechanism design to motivate clients to participate in machine learning jobs.…”
Section: Related Work
confidence: 99%
“…Our proposed approach builds upon the concepts introduced by the aforementioned works and focuses on adding a privacy-preserving layer using a variation of FL [10,11], namely CFL [9]. Spectral features, which retain privacy-sensitive information even when aggregated [17], are replaced with more privacy-aware, locally learned neural network parameter updates.…”
Section: Relation To Prior Work
confidence: 99%
“…It is shown in [9,19] that for the cases where clients' data comes from different (incongruent) distributions, there is no single θ* that can optimally minimize the loss of all clients at the same time. For this reason, the authors suggest clustering the clients that have similar (congruent) distributions and training separate server models for each resulting cluster.…”
Section: Clustered Federated Learning
confidence: 99%