IEEE INFOCOM 2021 - IEEE Conference on Computer Communications 2021
DOI: 10.1109/infocom42981.2021.9488906
Device Sampling for Heterogeneous Federated Learning: Theory, Algorithms, and Implementation

Cited by 86 publications (23 citation statements)
References 23 publications
“…For example, researchers have studied model training performance under noisy channels [14], limited-energy devices [12], limited bandwidth [13], quantization and sparsification [15], [16], and over-the-air wireless aggregation of signals [17]. Device sampling [18] and data sampling [19] have also been topics of research. Furthermore, part of the literature focuses on adapting FedL to a variety of new technologies, such as unmanned aerial vehicles [20], [21], intelligent reflecting surfaces [22], and massive MIMO [23].…”
Section: B Related Work
confidence: 99%
“…We impose no constraint on client selection [40,43,45,52,76,81,88] or training data sampling [44,76] strategies, making our approach compatible with much of the recent FL systems literature.…”
Section: Federated Learning
confidence: 99%
“…Integration with the existing FL framework: AutoFedNLP's trial groups are compatible with how existing FL frameworks manage clients for training efficiency, a key system component that has received substantial research attention [40,44,45,52,76,81,88]. This is because the adapters and their configuration scheduler are intentionally designed to be decoupled from which devices or data are involved in per-round training.…”
Section: Configurator Algorithm In Detail
confidence: 99%
“…In this paper, we investigate the problem of machine unlearning in a more practical scenario, where data holders are collaboratively performing training and unlearning without sharing raw data. In particular, we target Federated Learning (FL) [11]- [17], a widely adopted privacy-aware collaborative learning framework. In FL, data holders train a model from their local data samples, and the server only aggregates data holders' local model updates for data privacy considerations [18].…”
Section: Introduction
confidence: 99%
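The excerpt above describes the standard FL workflow: clients train on local data and the server only aggregates their model updates. A minimal sketch of that aggregation step, assuming a FedAvg-style weighted average (hypothetical function and variable names, not the exact method of any cited paper):

```python
# Hypothetical sketch of the server-side FL aggregation step: a FedAvg-style
# average of clients' local model updates, weighted by local dataset size.

def fedavg_aggregate(local_updates, sample_counts):
    """Aggregate clients' parameter vectors without seeing raw data.

    local_updates: one parameter vector (list of floats) per client
    sample_counts: number of local training samples per client
    """
    total = sum(sample_counts)
    dim = len(local_updates[0])
    global_update = [0.0] * dim
    for update, n in zip(local_updates, sample_counts):
        weight = n / total  # clients with more data contribute more
        for i, w in enumerate(update):
            global_update[i] += weight * w
    return global_update

# Example: two clients; the second holds three times as much data.
print(fedavg_aggregate([[1.0, 0.0], [0.0, 1.0]], [1, 3]))  # [0.25, 0.75]
```

Only the update vectors and sample counts cross the network, which is the privacy consideration the excerpt refers to.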