GLOBECOM 2020 - 2020 IEEE Global Communications Conference
DOI: 10.1109/globecom42002.2020.9322199

Differentially Private AirComp Federated Learning with Power Adaptation Harnessing Receiver Noise

Cited by 46 publications (35 citation statements). References 11 publications.
“…Recent studies have investigated different training aspects including personalization (i.e., multi-task learning) [53], robustness guarantees [54], [55], and training over dynamic topologies [56]. To further improve the data privacy against the attacks inverting model parameters into raw data [57], [58], various privacy-preserving methods have been investigated, such as injecting fine-tuned noise into model parameters via a differential privacy mechanism [31], [59]- [61] and mixing model parameters over the air via analog transmissions [62], [63]. Still, one critical issue of FL is that its communication overhead is proportional to the number of model parameters.…”
Section: Federated Learning (FL) (mentioning; confidence: 99%)
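The noise-injection approach mentioned in the statement above can be illustrated with a minimal sketch of a Gaussian mechanism applied to a client's model update before it is shared. This is only a generic sketch, not the mechanism of the cited works or of this paper; the clipping norm, noise multiplier, and function name are illustrative assumptions.

```python
import numpy as np

def privatize_update(model_update, clip_norm=1.0, noise_multiplier=1.1, rng=None):
    """Clip a client's model update and add Gaussian noise (Gaussian mechanism sketch).

    clip_norm (C) bounds the L2 sensitivity of the update; the noise standard
    deviation is noise_multiplier * C, in the style of DP-SGD-like mechanisms.
    All parameter values here are illustrative assumptions.
    """
    rng = rng or np.random.default_rng()
    norm = np.linalg.norm(model_update)
    clipped = model_update * min(1.0, clip_norm / (norm + 1e-12))
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=clipped.shape)
    return clipped + noise
```

Stronger privacy (larger noise multiplier or smaller clip norm) generally degrades the aggregated model's accuracy, which is the trade-off the cited privacy-preserving FL works navigate.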
“…In this regard, a joint design of SL (Sec. 5.1) that is robust against non-IID data distributions [82], [194] and feature interpolation and averaging via the Mixup data augmentation (Sec. 5.3) with heterogeneous FoVs and frame rates improving energy efficiency is considered.…”
Section: Heteromodal SL for mmWave Channel Prediction (mentioning; confidence: 99%)
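As a rough illustration of the Mixup data augmentation referenced above, the sketch below interpolates two samples and their one-hot labels with a Beta-distributed mixing coefficient. The alpha value and function name are illustrative assumptions, not taken from the cited work.

```python
import numpy as np

def mixup(x1, y1, x2, y2, alpha=0.2, rng=None):
    """Mixup sketch: convex combination of two samples and their one-hot labels.

    lam ~ Beta(alpha, alpha); the mixed pair (x, y) is used as a training example.
    alpha=0.2 is an illustrative default, not a value from the cited work.
    """
    rng = rng or np.random.default_rng()
    lam = rng.beta(alpha, alpha)
    x = lam * x1 + (1.0 - lam) * x2
    y = lam * y1 + (1.0 - lam) * y2
    return x, y
```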
“…While FL is designed for training over homogeneous agents with a common objective, recent studies have extended the focus towards personalization (i.e., multitask learning) [25], training over dynamic topologies [26] and robustness guarantees [27], [28]. In terms of improving data privacy against malicious attackers, various privacy-preserving methods including injecting fine-tuned noise into model parameters via a differential privacy mechanism [29]- [32] and mixing model parameters over the air via analog transmissions [33], [34] have been recently investigated. Despite the advancements in FL design, one main drawback in the design of FL is that its communication overhead is proportional to the number of model parameters, calling for the design of communication-efficient FL.…”
Section: B. Distributed Learning Over Wireless Network (mentioning; confidence: 99%)
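The over-the-air mixing idea mentioned above, together with the power-adaptation theme in the paper's title, can be sketched as follows: simultaneous analog transmissions superpose at the receiver, so the server observes only a noisy sum of updates, and lowering transmit power makes the receiver noise relatively larger, strengthening privacy at the cost of a noisier global update. This is a simplified sketch under an ideal channel-inversion assumption; the function name and parameters are illustrative, not the cited works' actual scheme.

```python
import numpy as np

def aircomp_aggregate(client_updates, power_scale=0.5, noise_std=0.1, rng=None):
    """Analog over-the-air aggregation sketch: simultaneous transmissions superpose
    at the receiver, so the server only observes the (noisy) sum of updates.

    power_scale < 1 lowers every client's transmit power, so the receiver noise is
    larger relative to the aggregated signal, which can strengthen the differential
    privacy guarantee at the cost of a noisier global update. Perfect channel
    inversion (ideal signal alignment) is assumed for simplicity.
    """
    rng = rng or np.random.default_rng()
    superposed = power_scale * np.sum(client_updates, axis=0)  # waveform superposition
    received = superposed + rng.normal(0.0, noise_std, size=superposed.shape)
    # Undo the power scaling to recover an unbiased estimate of the average update;
    # the effective noise grows as power_scale shrinks.
    return received / (power_scale * len(client_updates))
```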
“…and also in the medical domain (Rajendran et al., 2021; Kerkouche et al., 2021; Choudhury et al., 2019; Ge et al., 2020). Researchers also investigated the scope of the differentially private algorithm in several applications (Zhao et al., 2020; Koda et al., 2020; Hu et al., 2020; Chen et al., 2018). However, for sequence tagging tasks, the applicability of the FL framework along with differential privacy (DP) is yet to be explored.…”
Section: Introduction (mentioning; confidence: 99%)