2020 16th International Conference on Mobility, Sensing and Networking (MSN)
DOI: 10.1109/msn50589.2020.00038

Dynamic Resource Allocation for Hierarchical Federated Learning

Cited by 86 publications (180 citation statements)
References 11 publications
“…Significant attention has been paid towards developing federated optimization techniques. Such work has focused on various aspects, including communication-efficiency (Konečný et al, 2016; McMahan et al, 2017; Basu et al, 2019; Laguel et al, 2021), data and systems heterogeneity (Li et al, 2020a; Karimireddy et al, 2020b; Hsu et al, 2019; Karimireddy et al, 2020a; Li et al, 2020b; Li and Wang, 2019), and fairness (Li et al, 2020c; Hu et al, 2020). We provide a description of some relevant methods in Section 2, and defer readers to recent surveys such as (Kairouz et al, 2021) and (Li et al, 2020a) for additional background.…”
Section: Related Work
confidence: 99%
“…One critical issue in FL is fairness across clients, as minimizing (1) may disadvantage some clients (Mohri et al, 2019; Li et al, 2020c). Intuitively, large-cohort training methods may be better suited for ensuring fairness, since a greater fraction of the population is allowed to contribute to the model at each round.…”
Section: Fairness Concerns
confidence: 99%
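The "(1)" cited in the excerpt above is not reproduced on this page; in the federated learning literature it typically denotes the standard global objective, which weights each client's local loss by its share of the data. A minimal sketch of that objective in LaTeX, with all notation assumed rather than taken from the cited paper:

% Standard federated learning objective (assumed form of the "(1)" mentioned above).
% N clients; client k holds n_k samples; F_k(w) is its local empirical loss.
\min_{w} \; F(w) = \sum_{k=1}^{N} p_k \, F_k(w),
\qquad p_k = \frac{n_k}{\sum_{j=1}^{N} n_j},
\qquad F_k(w) = \frac{1}{n_k} \sum_{i=1}^{n_k} \ell(w; x_{k,i}, y_{k,i}).

Because the weights p_k favor clients with more data, a minimizer of F(w) can perform poorly on under-represented clients, which is the fairness concern the excerpt raises.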
“…doing federated NAS, and tuning initialization-based meta-learning algorithms such as Reptile and MAML. Lastly, any work on FL comes with privacy and fairness risks due to its frequent use of sensitive data; thus any application of our work must consider tools being developed by the community for mitigating such issues [32,37].…”
Section: Discussion
confidence: 99%
“…First, our model connects to models incentivizing parity in test accuracy in federated learning. (Li et al, 2019) proposes a variant of a global objective criterion to induce parity in test accuracy between clients in heterogeneous settings. While this literature also studies the supply side, i.e., who receives models, the paper implicitly assumes that the preference for a high-quality model is homogeneous across clients and tries to induce parity without any monetary transfers.…”
Section: Literature Review
confidence: 99%
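The "variant of a global objective criterion" attributed to (Li et al, 2019) in the excerpt above appears to be the q-fair federated learning (q-FFL) objective; the sketch below reuses the notation from the earlier sketch and is an assumption based on that line of work, not a formula quoted on this page:

% q-FFL objective (sketch): q >= 0 is a fairness parameter chosen by the designer.
\min_{w} \; f_q(w) = \sum_{k=1}^{N} \frac{p_k}{q+1} \, F_k(w)^{\,q+1}

Setting q = 0 recovers the standard weighted objective, while larger q up-weights clients with higher loss, pushing training toward more uniform test accuracy across clients without any monetary transfers.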
“…In contrast to several contributions on incentive design for federated learning (Chen et al, 2020; Jiao et al, 2019; Li et al, 2019), we assume that clients submit models trained on their entire dataset, with no need to incentivize them to do so. Under this assumption, we design a system to collect resources from clients.…”
Section: Introduction
confidence: 99%