2019 IEEE Global Communications Conference (GLOBECOM)
DOI: 10.1109/globecom38437.2019.9014272
PEFL: A Privacy-Enhanced Federated Learning Scheme for Big Data Analytics

Cited by 59 publications (43 citation statements)
References 15 publications

“…In order to protect the privacy of the gradients from an untrusted server, Zhang et al [29] propose a privacy-enhanced FL scheme based on the additively homomorphic cryptosystem which enables applying functions on encrypted data without revealing the values of the data. A distributed selective SGD method is employed to achieve distributed encryptions and reduce the communication costs.…”
Section: B. Privacy and Security of FL
Mentioning; confidence: 99%
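
As a rough illustration of the additive homomorphism these statements describe, the sketch below encrypts two clients' gradient values so that an aggregator can combine them without seeing either plaintext. The third-party python-paillier (phe) package and the toy gradient values are assumptions for illustration; the cited paper does not prescribe this library.

# Minimal sketch of additively homomorphic gradient aggregation with Paillier,
# using the third-party `phe` (python-paillier) package as a stand-in.
from phe import paillier

# Key generation (in PEFL this role belongs to a trusted key authority).
public_key, private_key = paillier.generate_paillier_keypair(n_length=2048)

# Two clients encrypt their local gradient value for one model parameter.
grad_client_a = 0.125
grad_client_b = -0.040
enc_a = public_key.encrypt(grad_client_a)
enc_b = public_key.encrypt(grad_client_b)

# The aggregator adds ciphertexts without ever decrypting them; in Paillier,
# combining ciphertexts corresponds to adding plaintexts, exposed here as `+`.
enc_sum = enc_a + enc_b

# Only a holder of the secret key can recover the aggregated gradient.
assert abs(private_key.decrypt(enc_sum) - (grad_client_a + grad_client_b)) < 1e-9
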
“…Therefore, it is widely applied to parameter aggregation on servers. Zhang et al [11] proposed PEFL, a distributed machine learning privacy-preserving scheme based on homomorphic encryption. This scheme can perform computation directly using ciphertext to achieve secure aggregation.…”
Section: Cryptography-Based Secure Aggregation Algorithm in Federated Learning
Mentioning; confidence: 99%
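
The server-side secure aggregation step described above, computing directly on ciphertexts to produce an encrypted aggregate, could look roughly like the following sketch. The phe library, the element-wise vector layout, and the aggregate helper are illustrative assumptions rather than the paper's exact construction.

# Sketch of server-side aggregation over element-wise encrypted gradient vectors.
from phe import paillier

public_key, private_key = paillier.generate_paillier_keypair()

# Each client uploads its gradient vector encrypted element by element.
client_updates = [
    [public_key.encrypt(g) for g in [0.10, -0.02, 0.37]],
    [public_key.encrypt(g) for g in [0.05, 0.11, -0.20]],
]

def aggregate(encrypted_vectors):
    # Sum the encrypted vectors element-wise; the server never decrypts.
    total = encrypted_vectors[0]
    for vec in encrypted_vectors[1:]:
        total = [a + b for a, b in zip(total, vec)]
    return total

encrypted_aggregate = aggregate(client_updates)
# Decryption happens only at a party holding the secret key.
print([round(private_key.decrypt(c), 4) for c in encrypted_aggregate])
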
“…To prevent gradient leakage, users can rely on homomorphic encryption, function encryption, and other cryptographic methods to encrypt local models, and service providers do not have direct access to the plaintext of individual gradients. Zhang et al [11] proposed a distributed selective stochastic gradient descent algorithm combined with Paillier homomorphic encryption. In this scheme, a trusted third party (TTP) assigns keys to users and the server, and the server uses Paillier additive homomorphism to achieve secure gradient aggregation.…”
Section: Introduction
Mentioning; confidence: 99%
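
The "selective" part of distributed selective SGD means each client encrypts and uploads only a subset of its gradient coordinates, which reduces both encryption and communication cost. The sketch below shows one plausible selection rule (top-k by magnitude, in NumPy); the 10% upload rate and the select_gradients helper are hypothetical choices, not taken from the paper.

# Sketch of client-side gradient selection before encryption and upload.
import numpy as np

def select_gradients(gradient, upload_rate=0.1):
    # Return indices and values of the largest-magnitude fraction of coordinates.
    k = max(1, int(upload_rate * gradient.size))
    idx = np.argsort(np.abs(gradient))[-k:]
    return idx, gradient[idx]

rng = np.random.default_rng(0)
local_gradient = rng.normal(size=1000)

indices, values = select_gradients(local_gradient, upload_rate=0.1)
# Only these 100 of 1000 coordinates (plus their indices) would be
# Paillier-encrypted and sent to the server.
print(indices.shape, values.shape)
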
“…Yang et al [19] depicted the necessary abstraction, architecture, and various applications of federated learning from an overview perspective. The FL technique has earned a remarkable reputation in many pragmatic fields such as visual object detection [20], health records [21], big data analysis [22], control policies [23], and medical text extraction [24].…”
Section: Related Work
Mentioning; confidence: 99%