ICC 2019 - 2019 IEEE International Conference on Communications (ICC)
DOI: 10.1109/icc.2019.8761267

Towards Efficient and Privacy-Preserving Federated Deep Learning

Cited by 138 publications (71 citation statements)
References 21 publications
“…• Hao et al. [57] reported that symmetric additively homomorphic encryption has excellent efficiency and could address the computational and communication overhead faced by other public-key encryption schemes.…”
Section: Discussion and Learned Lessons
confidence: 99%
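To make the idea in the statement above concrete, the snippet below sketches a symmetric additively homomorphic scheme in which ciphertexts add coordinate-wise and the aggregate decrypts with the sum of the per-user keys. This is a minimal illustration of the general technique, not the concrete construction of Hao et al. [57]; the modulus, fixed-point scaling, and one-time key usage are assumptions made for the sketch.

```python
# Minimal sketch of symmetric additively homomorphic encryption over Z_n.
# NOT the scheme of Hao et al. [57]; modulus and scaling are illustrative.
import secrets

MODULUS = 2 ** 64          # assumed plaintext/ciphertext space Z_n
SCALE = 10 ** 6            # assumed fixed-point scale for float gradients

def keygen() -> int:
    """Per-round symmetric key, one per user."""
    return secrets.randbelow(MODULUS)

def encrypt(value: float, key: int) -> int:
    """Enc(m, k) = (m + k) mod n -- additive, key used once."""
    m = int(round(value * SCALE)) % MODULUS
    return (m + key) % MODULUS

def aggregate(ciphertexts: list[int]) -> int:
    """Homomorphic addition: the sum of ciphertexts encrypts the sum of plaintexts."""
    total = 0
    for c in ciphertexts:
        total = (total + c) % MODULUS
    return total

def decrypt_sum(ciphertext_sum: int, keys: list[int]) -> float:
    """Decrypting with the sum of keys recovers the sum of gradients."""
    m = (ciphertext_sum - sum(keys)) % MODULUS
    if m > MODULUS // 2:       # map back to signed fixed point (assumes |sum| < n/2)
        m -= MODULUS
    return m / SCALE

# Toy usage: three users encrypt one gradient coordinate each.
grads = [0.12, -0.05, 0.30]
keys = [keygen() for _ in grads]
cts = [encrypt(g, k) for g, k in zip(grads, keys)]
assert abs(decrypt_sum(aggregate(cts), keys) - sum(grads)) < 1e-6
```

Because only additions over integers are involved, such symmetric constructions avoid the expensive exponentiations of public-key additively homomorphic schemes, which is the efficiency argument the quoted statement refers to.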
“…A recent solution based on encryption was proposed by Hao et al. [57]. The process is simple: users perform local training over their private datasets, perturb and encrypt the resulting local gradients, then upload them to the cloud, which computes the global encrypted gradients.…”
Section: A2) Server-Assisted Collaborative PP Model Learning
confidence: 99%
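The workflow described above can be sketched end to end as follows: each user computes a local gradient, perturbs it, encrypts it, and uploads the ciphertext; the cloud only ever adds ciphertexts. The additive masking used as a stand-in cipher, the noise scale, and all sizes below are illustrative assumptions, not the construction used in [57].

```python
# End-to-end sketch: local training -> perturb -> encrypt -> upload -> cloud aggregates.
import numpy as np

rng = np.random.default_rng(0)
MOD, SCALE = 2 ** 32, 10 ** 4          # assumed modulus / fixed-point scale
DIM, NOISE_STD = 8, 0.01               # toy gradient size and perturbation scale

def local_gradient(user_seed: int) -> np.ndarray:
    """Stand-in for a local training step over the user's private dataset."""
    return np.random.default_rng(user_seed).normal(size=DIM)

def perturb(grad: np.ndarray) -> np.ndarray:
    """Add noise before encryption (differential-privacy-style perturbation)."""
    return grad + rng.normal(scale=NOISE_STD, size=grad.shape)

def encrypt(grad: np.ndarray, key: np.ndarray) -> np.ndarray:
    """Additively homomorphic masking over Z_MOD (illustrative only)."""
    fixed = np.round(grad * SCALE).astype(np.int64) % MOD
    return (fixed + key) % MOD

# Users: train locally, perturb, encrypt, upload.
keys = [rng.integers(0, MOD, size=DIM, dtype=np.int64) for _ in range(3)]
uploads = [encrypt(perturb(local_gradient(u)), keys[u]) for u in range(3)]

# Cloud: computes the global encrypted gradient by addition only.
encrypted_sum = np.zeros(DIM, dtype=np.int64)
for c in uploads:
    encrypted_sum = (encrypted_sum + c) % MOD

# Decryption with the combined key recovers the aggregated (noisy) gradient.
plain = (encrypted_sum - sum(keys)) % MOD
plain = np.where(plain > MOD // 2, plain - MOD, plain) / SCALE
print("aggregated (noisy) gradient:", plain)
```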
“…A neural network is an excellent, less computationally sophisticated alternative to many systems that suffer from high computational overhead. Developments in neural network research enable the use of more than one hidden layer; learning by a network with more than one hidden layer is called deep learning, which can process bulk amounts of data and is suitable for big data (Hao et al., 2019). Recently, deep learning has been extensively applied in all areas of science and engineering for efficient handling of big data and its substantial computational overhead.…”
Section: Privacy-Preserving Methods for Big Data
confidence: 99%
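As a toy illustration of the definition used in the quoted passage, the sketch below builds a network with two hidden layers, i.e., a "deep" network in that sense. Layer widths and the activation are arbitrary choices for the sketch, not taken from the cited work.

```python
# A network with more than one hidden layer (here, two) is "deep" in the sense above.
import numpy as np

rng = np.random.default_rng(42)

def relu(x: np.ndarray) -> np.ndarray:
    return np.maximum(x, 0.0)

# input -> hidden 1 -> hidden 2 -> output
sizes = [16, 32, 32, 1]
weights = [rng.normal(scale=0.1, size=(m, n)) for m, n in zip(sizes[:-1], sizes[1:])]
biases = [np.zeros(n) for n in sizes[1:]]

def forward(x: np.ndarray) -> np.ndarray:
    """Forward pass: every layer except the last uses a ReLU nonlinearity."""
    for w, b in zip(weights[:-1], biases[:-1]):
        x = relu(x @ w + b)
    return x @ weights[-1] + biases[-1]      # linear output layer

print(forward(rng.normal(size=(4, 16))).shape)   # (4, 1)
```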
“…Several communication-efficient and privacy-preserving distributed approaches have been proposed recently, including Practical Secure Aggregation (PSA) [41], Federated Extreme Boosting (XGB) [42], Efficient and Privacy-Preserving Federated Deep Learning (EPFDL) [43], and privacy-preserving collaborative learning (PPCL) [44], as listed in Table 3. Specifically, PSA and XGB utilise collaborative training in order to resist collusion among adversaries, but neither approach guarantees communication efficiency.…”
Section: Functional Comparison
confidence: 99%
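To illustrate why schemes in the PSA family resist collusion at the cost of extra communication, the sketch below shows the pairwise-masking idea behind secure aggregation: every pair of users shares a random mask that cancels in the server's sum, so individual updates stay hidden. Key agreement, dropout handling, and quantisation are omitted, and the values are toy floats; this is a sketch of the general technique, not the full protocol of [41].

```python
# Pairwise-masking sketch: masks cancel in the aggregate, hiding individual updates.
import numpy as np

rng = np.random.default_rng(1)
NUM_USERS, DIM = 4, 5
updates = [rng.normal(size=DIM) for _ in range(NUM_USERS)]

# Each unordered pair (i, j) agrees on a random mask; i adds it, j subtracts it.
pair_masks = {(i, j): rng.normal(size=DIM)
              for i in range(NUM_USERS) for j in range(i + 1, NUM_USERS)}

def masked_upload(i: int) -> np.ndarray:
    x = updates[i].copy()
    for (a, b), m in pair_masks.items():
        if a == i:
            x += m          # the "+" side of the pair
        elif b == i:
            x -= m          # the "-" side cancels it out at the server
    return x

server_sum = sum(masked_upload(i) for i in range(NUM_USERS))
assert np.allclose(server_sum, sum(updates))   # masks cancel in the aggregate
```

The quadratic number of pairwise masks, and the key agreement needed to establish them, is the main source of the communication cost the quoted comparison points out.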