2019
DOI: 10.48550/arxiv.1911.00222
Preprint

Federated Learning with Differential Privacy: Algorithms and Performance Analysis

Abstract: Federated learning (FL), as a manner of distributed machine learning, is capable of significantly preserving clients' private data from being exposed to external adversaries. Nevertheless, private information can still be divulged by analyzing the differences of uploaded parameters from clients, e.g., weights trained in deep neural networks. In this paper, to effectively prevent information leakage, we propose a novel framework based on the concept of differential privacy (DP), in which artificial noises ar…
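The mechanism the abstract describes can be sketched as follows. This is a minimal illustration, not the paper's exact algorithm: the function name, `clip_norm`, and `sigma` are assumed parameters, and the clipping/averaging details are a generic DP-FL recipe.

```python
import numpy as np

def dp_sanitize(update, clip_norm=1.0, sigma=1.0, rng=None):
    """Clip a client's parameter update and add Gaussian noise before upload."""
    rng = rng or np.random.default_rng()
    norm = np.linalg.norm(update)
    clipped = update / max(1.0, norm / clip_norm)  # bound the L2 sensitivity
    noise = rng.normal(0.0, sigma * clip_norm, size=update.shape)
    return clipped + noise

# Server side: aggregate the sanitized (noisy) updates, never the raw ones.
updates = [np.ones(4) * i for i in range(1, 4)]  # toy client updates
sanitized = [dp_sanitize(u, clip_norm=1.0, sigma=0.1) for u in updates]
global_update = np.mean(sanitized, axis=0)
```

Because noise is added on each client before transmission, even a server (or eavesdropper) that sees individual uploads only observes perturbed parameters.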


Cited by 8 publications (9 citation statements)
References 28 publications
“…where the second equality is obtained from (26). By using the expression in (36), computing the estimated transmitted signal x(i) is not required at each iteration.…”
Section: Detection Methods
confidence: 99%
“…There also exist recent studies that seek to optimize federated learning over wireless communication systems [14]-[26]. In these studies, the physical characteristics of wireless networks are considered to improve the performance of federated learning under a practical scenario.…”
Section: Introduction
confidence: 99%
“…Precautions like adding noise or quantization can somewhat reduce the risk. However, some experiments [318] have shown that there exist trade-offs between privacy and performance. Therefore, it is essential to understand this trade-off better and propose methods that can improve both simultaneously.…”
Section: ) Statistical Heterogeneity and Personalization
confidence: 99%
“…Therefore, it is essential to understand this trade-off better and propose methods that can improve both simultaneously. Differential privacy may be a valuable tool along this line [318]. iv.…”
Section: ) Statistical Heterogeneity and Personalization
confidence: 99%
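The privacy–performance trade-off these two citations mention can be made concrete with the classic Gaussian-mechanism bound (a standard DP result, not taken from the cited works): the noise scale grows as the privacy budget ε shrinks, so stricter privacy directly inflates the perturbation the model must absorb.

```python
import math

def gaussian_sigma(epsilon, delta, sensitivity=1.0):
    """Classic Gaussian-mechanism noise scale (valid for 0 < epsilon < 1)."""
    return sensitivity * math.sqrt(2.0 * math.log(1.25 / delta)) / epsilon

# Tighter privacy (smaller epsilon) demands more noise, hurting accuracy.
scales = {eps: gaussian_sigma(eps, delta=1e-5) for eps in (0.1, 0.5, 0.9)}
```

Halving ε roughly doubles the required noise standard deviation, which is why methods that improve both privacy and utility simultaneously are sought.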
“…Following McMahan et al [23], many researchers are working on improving federated learning with more efficient parameter transmission and stronger privacy preservation. To improve privacy security, Wei et al [36] propose a federated learning framework based on differential privacy, in which artificial noises are added to the local parameters of participants before the model aggregation. To improve communication efficiency, Reisizadeh et al [27] propose a communication-efficient federated learning method with periodic averaging and quantization.…”
Section: Introduction
confidence: 99%
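The quantization idea credited to [27] above can be illustrated with a generic unbiased stochastic quantizer; the interface and grid size here are illustrative assumptions, not the cited method's exact scheme.

```python
import numpy as np

def stochastic_quantize(x, levels=8, rng=None):
    """Randomly round each entry onto a uniform grid between min and max;
    rounding up with probability equal to the fractional part keeps the
    quantizer unbiased in expectation."""
    rng = rng or np.random.default_rng(0)
    lo, hi = float(x.min()), float(x.max())
    if hi == lo:                       # constant vector: nothing to quantize
        return x.astype(float)
    scaled = (x - lo) / (hi - lo) * (levels - 1)
    floor = np.floor(scaled)
    up = rng.random(x.shape) < (scaled - floor)  # round up w.p. = frac part
    return lo + (floor + up) / (levels - 1) * (hi - lo)

# A client would upload the compressed update instead of the raw one,
# cutting the bits sent per coordinate to log2(levels).
update = np.linspace(-1.0, 1.0, 5)
compressed = stochastic_quantize(update, levels=4)
```

Combining such a quantizer with the noise-before-aggregation step from [36] is exactly the kind of joint privacy/communication design the surveyed works explore.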