2020
DOI: 10.1007/978-3-030-58951-6_20
PrivColl: Practical Privacy-Preserving Collaborative Machine Learning

Cited by 28 publications (32 citation statements)
References 36 publications
“…Secure aggregation typically employs cryptographic mechanisms such as homomorphic encryption (HE) [7,31,40] and/or secure multiparty computation (MPC) [4,10,11,30,34,47] to securely evaluate the gradient ∇F(w_k, ξ_{l,k}) without revealing local data. With ∇F(w_k, ξ_{l,k}), all the participants can thus take a gradient descent step by Equation 2.…”
Section: Secure Aggregation (mentioning)
confidence: 99%
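To make the idea in the excerpt above concrete, here is a minimal Python sketch of secure aggregation via pairwise additive masking, one simple MPC-style mechanism: each participant uploads its gradient plus masks that cancel in the sum, so the aggregator learns only the aggregate gradient and participants can then take the gradient descent step. This is an illustration under assumptions, not PrivColl's actual protocol; all function names and parameters are hypothetical.

```python
import numpy as np

# Sketch of secure aggregation via pairwise additive masking (illustrative only,
# not PrivColl's protocol). Each participant l holds grad_l = ∇F(w_k, ξ_{l,k});
# pairwise masks m[i][j] = -m[j][i] cancel in the sum, so only Σ_l grad_l is revealed.

rng = np.random.default_rng(0)

def pairwise_masks(num_parties, dim, seed=42):
    """Generate masks with m[i][j] = -m[j][i], so they sum to zero overall."""
    mask_rng = np.random.default_rng(seed)
    masks = [[np.zeros(dim) for _ in range(num_parties)] for _ in range(num_parties)]
    for i in range(num_parties):
        for j in range(i + 1, num_parties):
            m = mask_rng.normal(size=dim)
            masks[i][j] = m
            masks[j][i] = -m
    return masks

def secure_aggregate(local_grads):
    num_parties, dim = len(local_grads), local_grads[0].shape[0]
    masks = pairwise_masks(num_parties, dim)
    # Each party uploads its gradient plus its masks; a single upload looks random.
    masked = [g + sum(masks[i]) for i, g in enumerate(local_grads)]
    return np.sum(masked, axis=0)            # masks cancel: equals Σ_l grad_l

# Toy example: 3 participants, 5-dimensional model.
w_k = np.zeros(5)
local_grads = [rng.normal(size=5) for _ in range(3)]
agg = secure_aggregate(local_grads)
assert np.allclose(agg, np.sum(local_grads, axis=0))
eta = 0.1
w_next = w_k - eta * agg / len(local_grads)  # gradient descent step (cf. Equation 2 in the citing paper)
```

In practice the masks would be derived from pairwise-agreed secrets or replaced by HE/MPC primitives; the point of the sketch is only that the aggregator never sees an individual ∇F(w_k, ξ_{l,k}).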
“…In the traditional FL, every participant owns its local training dataset, and updates the same global model w_k via a parameter server using its local model/gradients. The local gradients ∇F(w_k, ξ_{l,k}) can be protected via either secure aggregation [4,30,31,34,40,47] or differential privacy mechanisms [2,8,21,42,44,48]. This process can be decentralized by replacing the parameter server with a peer-to-peer communication mechanism [22,39].…”
Section: CGD Optimization (mentioning)
confidence: 99%
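The excerpt above describes one round of traditional federated learning with a parameter server, optionally protecting gradients with noise. The following minimal Python sketch shows that round under simple assumptions (least-squares loss, plain Gaussian perturbation as a stand-in for a differential privacy mechanism); names such as fl_round and noise_std are illustrative, not from any cited scheme.

```python
import numpy as np

# One round of parameter-server federated learning (illustrative sketch).
# Each participant computes a gradient on its local data, optionally adds
# Gaussian noise (a simple DP-style protection), and the server averages
# the updates before the global gradient descent step.

rng = np.random.default_rng(1)

def local_gradient(w, X, y):
    """Gradient of the mean squared error 0.5*||Xw - y||^2 / n on one participant's data."""
    n = X.shape[0]
    return X.T @ (X @ w - y) / n

def fl_round(w_k, datasets, eta=0.1, noise_std=0.0):
    grads = []
    for X, y in datasets:                        # each participant, locally
        g = local_gradient(w_k, X, y)
        if noise_std > 0:                        # optional noise before upload
            g = g + rng.normal(scale=noise_std, size=g.shape)
        grads.append(g)
    avg_grad = np.mean(grads, axis=0)            # parameter server aggregates
    return w_k - eta * avg_grad                  # updated global model w_{k+1}

# Toy run: 3 participants, 4 features, 50 rounds.
datasets = [(rng.normal(size=(20, 4)), rng.normal(size=20)) for _ in range(3)]
w = np.zeros(4)
for _ in range(50):
    w = fl_round(w, datasets, eta=0.1, noise_std=0.01)
```

Replacing the averaging step with the secure aggregation sketch above, or replacing the server with peer-to-peer exchange, yields the protected and decentralized variants the excerpt refers to.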