2021 19th International Symposium on Modeling and Optimization in Mobile, Ad Hoc, and Wireless Networks (WiOpt)
DOI: 10.23919/wiopt52861.2021.9589061

CFedAvg: Achieving Efficient Communication and Fast Convergence in Non-IID Federated Learning

Cited by 29 publications (34 citation statements: 2 supporting, 32 mentioning, 0 contrasting) · References 4 publications

“…Consequently, we do not observe communication savings by performing multiple local updates at the clients, except in the special case when σ_G = 0 (see Table 1). Similar observations have been made for minimization Yang et al (2021); Jhunjhunwala et al (2022), and very recently for minimax problems Yang et al (2022a). • In the absence of multiple local updates (i.e., τ_i = 1 for all i) and with full participation (P = n), the resulting error O…”
Section: None (supporting)
confidence: 76%
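The quoted analysis concerns rounds in which each client i runs τ_i local SGD steps before the server averages the models. A minimal sketch of that structure, assuming a plain NumPy setup; the names (local_update, fedavg_round, grad_fn) are illustrative, not taken from the cited papers:

```python
# Minimal sketch of one FL round with multiple local SGD steps per client
# (full participation). Illustrative only, not the cited papers' code.
import numpy as np

def local_update(x_global, grad_fn, local_steps, lr):
    """Run tau_i local SGD steps starting from the global model."""
    x = x_global.copy()
    for _ in range(local_steps):
        x -= lr * grad_fn(x)  # one local (stochastic) gradient step
    return x

def fedavg_round(x_global, client_grad_fns, local_steps, lr):
    """Every client runs local_steps updates; the server averages."""
    local_models = [local_update(x_global, g, local_steps, lr)
                    for g in client_grad_fns]
    return np.mean(local_models, axis=0)

# Toy example: two clients with heterogeneous quadratics
# f_i(x) = 0.5 * ||x - c_i||^2, so grad f_i(x) = x - c_i.
grads = [lambda x: x - np.array([1.0, 0.0]),
         lambda x: x - np.array([0.0, 1.0])]
x = np.zeros(2)
for _ in range(50):
    x = fedavg_round(x, grads, local_steps=5, lr=0.1)
print(x)  # approaches the average of the client optima, ~[0.5, 0.5]
```

In this toy example the two clients' optima differ, which is exactly the kind of gradient dissimilarity (σ_G > 0) that, per the quote, prevents multiple local steps from yielding communication savings.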
“…Since its introduction, FL has been an active area of research, with some remarkable successes Li et al (2020). Research has shown the practical benefits of, and provided theoretical justification for, commonly used techniques such as multiple local updates at the clients Stich (2018); Khaled et al (2020); Koloskova et al (2020), partial client participation Yang et al (2021), and communication compression Hamer et al (2020); Chen et al (2021). Further, the impact of heterogeneity in the clients' local data Zhao et al (2018); Sattler et al (2019), as well as in their system capabilities Mitra et al (2021), has been studied.…”
Section: Introduction (mentioning)
confidence: 99%
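One of the techniques this quote names, communication compression, is the focus of the CFedAvg paper itself. A minimal sketch of the generic idea, using a top-k sparsifier on the model update; this is an illustrative compressor, not necessarily the one CFedAvg actually uses:

```python
# Sketch of client-to-server communication compression via top-k
# sparsification of the model update. Generic illustration only;
# CFedAvg's specific compressor may differ.
import numpy as np

def topk_compress(delta, k):
    """Keep the k largest-magnitude entries of the update; zero the rest.
    In practice only the (index, value) pairs would be transmitted."""
    idx = np.argpartition(np.abs(delta), -k)[-k:]
    sparse = np.zeros_like(delta)
    sparse[idx] = delta[idx]
    return sparse

# A client sends the compressed difference between its local model and
# the last global model instead of the full dense update.
x_global = np.zeros(8)
x_local = np.array([0.9, -0.05, 0.0, 1.2, 0.01, -0.7, 0.02, 0.1])
delta = x_local - x_global
print(topk_compress(delta, k=3))  # only the 3 largest entries survive
```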
“…Much effort has been made in optimizing FL, covering a variety of perspectives including communication [23], [24], update rules [25], [26], [27], [28], flexible aggregation [4], [29], and personalization [30], [31]. The control of device participation is imperative in cross-device FL scenarios [32], [33], where the quality of local data is uncontrollable and the clients show varied value for the training task [6]. However, little attention has been paid to the problems caused by low-quality data and their impact on FL's efficiency and effectiveness.…”
Section: Related Work (mentioning)
confidence: 99%
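The device-participation control this quote discusses builds on the basic partial-participation mechanism: each round the server samples a subset of P out of n clients. A minimal sketch, with illustrative names (sample_clients, num_clients, P) that are assumptions of this example:

```python
# Sketch of partial client participation: each round the server samples
# P of the num_clients devices uniformly at random.
import random

rng = random.Random(0)  # fixed seed so the sketch is reproducible

def sample_clients(num_clients, P):
    """Uniformly sample P distinct client indices for this round."""
    return rng.sample(range(num_clients), P)

for rnd in range(3):
    print(f"round {rnd}: clients {sample_clients(num_clients=100, P=5)}")
```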
“…Federated learning (FL) is a powerful distributed training paradigm for modern large-scale machine learning [1, 2, 10, 11, 13, 16, 22, 33, 35–37]. FL leverages a large number of workers to collaboratively learn a global model.…”
Section: Introduction (mentioning)
confidence: 99%
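The "global model" this quote refers to is produced by the server-side aggregation step of FedAvg-style training: a weighted average of the client models, typically weighted by local dataset size. A minimal sketch; the function and variable names are illustrative assumptions:

```python
# Sketch of FedAvg-style server aggregation: the global model is a
# weighted average of client models, weighted by local dataset sizes.
import numpy as np

def aggregate(client_models, client_sizes):
    """Weighted average: x_global = sum_i (n_i / n) * x_i."""
    weights = np.asarray(client_sizes, dtype=float)
    weights /= weights.sum()
    return sum(w * m for w, m in zip(weights, client_models))

models = [np.array([1.0, 2.0]), np.array([3.0, 0.0]), np.array([0.0, 1.0])]
sizes = [100, 50, 50]  # local dataset sizes n_i
print(aggregate(models, sizes))  # -> [1.25, 1.25]
```

Weighting by dataset size makes the aggregate an unbiased estimate of the update on the pooled data, which matters precisely in the non-IID setting the CFedAvg title highlights.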