2021
DOI: 10.48550/arxiv.2106.02969
Preprint

FedNL: Making Newton-Type Methods Applicable to Federated Learning

Abstract: Inspired by recent work of Islamov et al. (2021), we propose a family of Federated Newton Learn (FedNL) methods, which we believe is a marked step in the direction of making second-order methods applicable to FL. In contrast to the aforementioned work, FedNL employs a different Hessian learning technique which i) enhances privacy as it does not rely on the training data being revealed to the coordinating server, ii) makes it applicable beyond generalized linear models, and iii) provably works with general contr…

Cited by 8 publications (35 citation statements)
References 3 publications
“…We allow both unbiased compressors, such as random sparsification (Rand-K) or random dithering, and contractive compressors, such as greedy sparsification (Top-K) or low-rank approximations (Rank-R). In the special case of choosing the standard basis, our method recovers FedNL [Safaryan et al, 2021]. Thus, basis learn can be viewed as a generalization of FedNL.…”
Section: Contributions
confidence: 98%
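To make the two compressor families named in the statement above concrete, here is a minimal NumPy sketch (illustrative only; the function names and the checks at the end are not taken from any of the cited papers) of an unbiased Rand-K sparsifier and a contractive Top-K sparsifier:

```python
import numpy as np

def rand_k(x: np.ndarray, k: int, rng: np.random.Generator) -> np.ndarray:
    """Unbiased sparsifier: keep k random coordinates, rescale by d/k so E[C(x)] = x."""
    d = x.size
    out = np.zeros_like(x)
    idx = rng.choice(d, size=k, replace=False)
    out[idx] = x[idx] * (d / k)
    return out

def top_k(x: np.ndarray, k: int) -> np.ndarray:
    """Contractive (biased) sparsifier: keep the k largest-magnitude coordinates."""
    out = np.zeros_like(x)
    idx = np.argsort(np.abs(x))[-k:]
    out[idx] = x[idx]
    return out

# Quick check of the defining properties on a random vector.
rng = np.random.default_rng(0)
x = rng.standard_normal(10)
print(np.mean([rand_k(x, 3, rng) for _ in range(20000)], axis=0))  # close to x (unbiasedness)
print(np.linalg.norm(top_k(x, 3) - x) ** 2 <= (1 - 3 / 10) * np.linalg.norm(x) ** 2)  # True (contraction)
```

Rand-K rescales the surviving coordinates by d/k so the compressor is correct in expectation, whereas Top-K keeps the largest entries exactly but is biased; this is precisely the unbiased vs. contractive split the citing paper refers to.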
“…Motivated by these recent developments on distributed second-order methods with communication compression, we extend the results of FedNL [Safaryan et al, 2021] allowing even more aggressive compression for some applications.…”
confidence: 98%
“…The operator is allowed to be randomized, and typically operates on models Khaled & Richtárik (2019) or on gradients Alistarh et al (2017); Beznosikov et al (2020), both of which can be described as vectors in R^d. Besides sparsification (Alistarh et al, 2018), typical examples of useful compression mechanisms include quantization (Alistarh et al, 2017; Horváth et al, 2019a) and low-rank approximation (Vogels et al, 2019; Safaryan et al, 2021).…”
Section: EF21 With Bells and Whistles
confidence: 99%
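The low-rank approximation mentioned in this statement is the natural matrix analogue of sparsification: instead of keeping a few coordinates of a vector in R^d, one keeps a few leading singular directions of a d x d matrix, such as a local Hessian estimate. A minimal sketch under that assumption, using a truncated SVD (the interface and names are illustrative, not taken from FedNL):

```python
import numpy as np

def rank_r(H: np.ndarray, r: int) -> np.ndarray:
    """Contractive low-rank compressor: best rank-r approximation via truncated SVD.

    Illustrative sketch only; the function name and interface are not from the cited papers.
    """
    U, s, Vt = np.linalg.svd(H, full_matrices=False)
    return (U[:, :r] * s[:r]) @ Vt[:r, :]

# A symmetric "Hessian-like" matrix compressed to rank 2; only the r leading factors
# (O(d * r) numbers) would need to be communicated instead of the full d x d matrix.
rng = np.random.default_rng(1)
A = rng.standard_normal((6, 6))
H = (A + A.T) / 2
print(np.linalg.norm(rank_r(H, 2) - H) <= np.linalg.norm(H))  # True: contraction in Frobenius norm
```

Communicating the r leading factors costs O(d r) floats instead of O(d^2), which is the point of Rank-R style compression.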
“…We emphasize that we do not assume C to be unbiased. Hence, our theory works with the Top-k (Alistarh et al, 2018) and the Rank-r (Safaryan et al, 2021) compressors, for example.…”
Section: General Assumptions
confidence: 99%
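For reference, the reason Top-k qualifies as a contractive compressor even though it is biased is the standard bound below (a well-known fact stated here for context, not a result of the cited paper):

```latex
% Top_k(x) keeps the k largest-magnitude coordinates of x \in \mathbb{R}^d and zeroes the rest,
% so the compression error collects the d-k smallest squared coordinates, whose average is
% at most the overall average:
\|\mathrm{Top}_k(x) - x\|_2^2
  = \sum_{i \,\in\, \text{smallest } d-k \text{ coordinates}} x_i^2
  \le \frac{d-k}{d}\,\|x\|_2^2
  = \Bigl(1 - \frac{k}{d}\Bigr)\|x\|_2^2 ,
% i.e. Top_k is contractive with parameter \alpha = k/d, even though in general
% \mathbb{E}[\mathrm{Top}_k(x)] \neq x.
```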