2024
DOI: 10.1109/TNNLS.2022.3224252

A New Look and Convergence Rate of Federated Multitask Learning With Laplacian Regularization

Abstract: Non-Independent and Identically Distributed (non-IID) data across clients is considered the key factor that degrades the performance of federated learning (FL). Several approaches to handling non-IID data, such as personalized FL and federated multi-task learning (FMTL), are of great interest to research communities. In this work, first, we formulate the FMTL problem using Laplacian regularization to explicitly leverage the relationships among the clients' models for multi-task learning. Then, we …
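A hedged sketch of the Laplacian-regularized FMTL objective the abstract refers to (the symbols F_i, w_i, a_{ij}, and \eta are illustrative; the paper's exact notation and weighting may differ):

    \min_{W=(w_1,\dots,w_N)} \; \sum_{i=1}^{N} F_i(w_i) \;+\; \frac{\eta}{2} \sum_{i=1}^{N} \sum_{j=1}^{N} a_{ij} \, \|w_i - w_j\|^2

Here F_i is client i's local loss, the weight a_{ij} >= 0 encodes how strongly the models of clients i and j are assumed to be related, and the penalty can be written compactly as \eta \, \mathrm{tr}(W L W^\top) for W = [w_1, \dots, w_N] and graph Laplacian L, which is what couples the clients' otherwise independent tasks.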

Cited by 38 publications (39 citation statements)
References 35 publications
“…A numerical comparison of test accuracy over five independent runs is shown in Table I. FLACC is compared to NoFed, FedAvg, two state-of-the-art personalized FL methods, pFedMe [17] and Per-FedAvg [20], and three state-of-the-art clustered FL methods, IFCA [24], FL+HC [21], and CFL [26]. For NoFed, each client learns only from its local data, and model averaging is not performed.…”
Section: Results
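The excerpt above contrasts FedAvg with a NoFed baseline. A minimal runnable sketch of both, assuming a least-squares local loss; the function names (local_update, fedavg_round, run_nofed) are illustrative, not from the cited papers:

import numpy as np

def local_update(w, X, y, lr=0.1, epochs=1):
    # A few gradient-descent steps on the client's local least-squares loss.
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)
        w = w - lr * grad
    return w

def fedavg_round(global_w, client_data):
    # One FedAvg round: broadcast the global model, train locally on each
    # client, then average the results weighted by local sample counts.
    updates = [local_update(global_w.copy(), X, y) for X, y in client_data]
    sizes = np.array([len(y) for _, y in client_data], dtype=float)
    return np.average(np.stack(updates), axis=0, weights=sizes)

def run_nofed(client_data, dim, rounds=10):
    # NoFed baseline: each client trains only on its own data;
    # model averaging is never performed.
    models = [np.zeros(dim) for _ in client_data]
    for _ in range(rounds):
        models = [local_update(w, X, y) for w, (X, y) in zip(models, client_data)]
    return models

# Tiny demo on synthetic clients (illustrative data only).
rng = np.random.default_rng(0)
clients = [(rng.normal(size=(20, 3)), rng.normal(size=20)) for _ in range(2)]
w = np.zeros(3)
for _ in range(10):
    w = fedavg_round(w, clients)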
“…The second line of research is pluralistic and learns an individually personalized model for each client. This approach is termed personalized federated learning and can be achieved via multi-task learning [16], model regularization [17], contextualization [18], local fine-tuning [19], and meta-learning [20]. The third line of research is clustered FL, which assumes that groups of clients share one data distribution.…”
Section: Related Work
“…Full model personalization usually requires each client to train a personalized model and a global model, where the global model is used to prevent the personalized model from overfitting to its local data. It includes methods based on meta-learning (Fallah et al., 2020), model mixture (Deng et al., 2020; Mansour et al., 2020), global regularization (Li et al., 2021a), mean regularization (T. Dinh et al., 2020), and clustering (Sattler et al., 2020; Ghosh et al., 2020). However, these methods incur high costs by training two full models on each client and communicating the full model.…”
Section: Related Work
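The "global regularization" and "mean regularization" families mentioned above typically couple each client's personalized model v_i to a shared model w through a proximal penalty. A hedged generic form (notation illustrative, not taken from any single cited method):

    \min_{v_i} \; F_i(v_i) + \frac{\lambda}{2} \, \|v_i - w\|^2

where F_i is client i's local empirical loss and w is a FedAvg-style aggregate of the clients' models. A larger \lambda pulls the personalized models toward the shared one; solving this requires each client to hold both v_i and w, which is exactly the two-full-models cost these excerpts point out.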
“…Existing works in pFL commonly use full model personalization, where each client trains a personalized model as well as a copy of the global model from the server for regularization (Li et al., 2021a; T. Dinh et al., 2020). However, these methods are parameter-expensive, leading to high computational and communication costs, which is impractical for clients with limited computation resources and network bandwidth (Kairouz et al., 2021).…”
Section: Introduction