2019
DOI: 10.48550/arxiv.1910.10252
Preprint

Federated Evaluation of On-device Personalization

Cited by 60 publications (93 citation statements)
References 0 publications
“…The benefit of this personalization has been shown for the language model of a virtual smartphone keyboard [65]. One personalized federated averaging algorithm, called Per-FedAvg [16], uses model-agnostic meta-learning (MAML) [17] methods to find an initial shared model that users can adapt to their local data with comparably little training. Mansour et al. [44] propose three different approaches for the personalization of models: user clustering, data interpolation, and model interpolation.…”
Section: Specialization in Federated Learning
confidence: 99%
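To make the Per-FedAvg idea above concrete, the following is a minimal sketch of a MAML-style federated meta-update on toy quadratic client objectives. The losses, step sizes, and variable names are illustrative assumptions, not the cited paper's implementation.

```python
# Sketch of a Per-FedAvg-style (MAML) meta-update, assuming toy quadratic
# client objectives F_i(w) = 0.5 * ||w - c_i||^2 (illustrative, not the
# authors' code).
import numpy as np

rng = np.random.default_rng(0)
dim, num_clients = 5, 10
centers = rng.normal(size=(num_clients, dim))    # per-client optima c_i

def grad(w, c):
    """Gradient of the toy loss F_i(w) = 0.5 * ||w - c||^2."""
    return w - c

alpha, beta = 0.1, 0.05                          # inner / outer step sizes
w = np.zeros(dim)                                # shared meta-initialization

for _ in range(200):                             # communication rounds
    outer_grads = []
    for c in centers:
        w_adapted = w - alpha * grad(w, c)       # one-step local adaptation
        # MAML outer gradient: (I - alpha * Hessian) @ grad(w_adapted);
        # the Hessian of the toy loss is the identity matrix.
        outer_grads.append((1.0 - alpha) * grad(w_adapted, c))
    w -= beta * np.mean(outer_grads, axis=0)     # server meta-update

# Each client then personalizes the meta-initial model with a few local steps.
w_personalized = w - alpha * grad(w, centers[0])
```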
“…Recently, various approaches have been proposed to realize personalized FL with a homogeneous local model structure, which can be categorized into three types according to the number of global models held at the server, i.e., a single global model, multiple global models, or no global model. The single-global-model type is a close variant of conventional FL, e.g., FedAvg [1], that combines the global model optimization process with additional local model customization, and consists of four kinds of approaches: local fine-tuning [20][21][22][23], regularization (e.g., pFedMe [6], L2SGD [7,24], Ditto [25]), hybrid local and global models [11,26,27], and meta-learning [9,28]. All of these pFL methods rely on a single global model, which limits how far the local model can be customized at the client side.…”
Section: Related Work
confidence: 99%
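To illustrate the regularization-style approaches mentioned in the passage above (e.g., pFedMe, Ditto), here is a toy sketch in which each client keeps a personal model that is pulled toward the global model by a proximal penalty. The quadratic losses, step sizes, and penalty weight are assumptions made for illustration and do not reproduce the cited algorithms.

```python
# Sketch of regularization-based personalization: personal model v_i minimizes
# F_i(v) + lam/2 * ||v - w||^2, while the global model w is trained as in
# plain FedAvg (all objectives and hyperparameters are illustrative).
import numpy as np

rng = np.random.default_rng(1)
dim, num_clients = 5, 10
centers = rng.normal(size=(num_clients, dim))    # per-client optima

lam, lr, local_steps = 1.0, 0.1, 5
w = np.zeros(dim)                                # shared global model
v = np.zeros((num_clients, dim))                 # personal models

for _ in range(200):                             # communication rounds
    client_models = []
    for i, c in enumerate(centers):
        for _ in range(local_steps):
            # Gradient of F_i(v_i) plus the proximal pull toward w.
            v[i] -= lr * ((v[i] - c) + lam * (v[i] - w))
        # Global model update: one local SGD step on F_i alone.
        client_models.append(w - lr * (w - c))
    w = np.mean(client_models, axis=0)           # FedAvg-style aggregation
```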
“…The first type of solution includes proposals such as Transfer Learning techniques that adapt a model pre-trained on a public dataset to a set of devices [36], although this can also be applied without a public dataset [37]. Another idea, pursued in [38,39], is to adapt algorithms from the Model-Agnostic Meta-Learning (MAML) setting, such as Reptile, to a federated setting.…”
Section: Client-level Personalization
confidence: 99%
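For the Reptile-in-a-federated-setting idea referenced in [38,39], a minimal sketch of the outer loop might look as follows: clients run a few local SGD steps, and the server interpolates the global model toward the mean of their end points. The toy objectives and the server step size are assumptions, not the cited papers' code.

```python
# Sketch of a Reptile-style federated meta-update (illustrative only):
# the server moves the global model toward the average of the clients'
# locally adapted models.
import numpy as np

rng = np.random.default_rng(2)
dim, num_clients = 5, 10
centers = rng.normal(size=(num_clients, dim))    # per-client optima

lr, server_step, local_steps = 0.1, 0.5, 5
w = np.zeros(dim)                                # global model

for _ in range(100):                             # communication rounds
    endpoints = []
    for c in centers:
        w_i = w.copy()
        for _ in range(local_steps):             # local SGD on the toy loss
            w_i -= lr * (w_i - c)
        endpoints.append(w_i)
    # Reptile outer update: interpolate toward the mean client end point.
    w += server_step * (np.mean(endpoints, axis=0) - w)
```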
“…To the best of our knowledge, this strategy has only been performed in centralized settings. Nonetheless, in a decentralized setting each participant could … [Figure 1: Classification of the different approaches that are able to solve the problem of spatial heterogeneity in the input spaces.]…”
Section: Changes in the Input Space Throughout Clients
confidence: 99%