2021
DOI: 10.1109/jsait.2021.3053220
CodedPrivateML: A Fast and Privacy-Preserving Framework for Distributed Machine Learning

Abstract: How to train a machine learning model while keeping the data private and secure? We present CodedPrivateML, a fast and scalable approach to this critical problem. CodedPrivateML keeps both the data and the model information-theoretically private, while allowing efficient parallelization of training across distributed workers. We characterize CodedPrivateML's privacy threshold and prove its convergence for logistic (and linear) regression. Furthermore, via experiments over Amazon EC2, we demonstrate that CodedP…
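As background for the abstract: CodedPrivateML's information-theoretic privacy comes from secret-sharing the (quantized) dataset among workers with Lagrange coding, so that any group of up to T colluding workers observes only uniformly random matrices. The Python sketch below illustrates that encoding step in its simplest form; the field size, interpolation points, and function names are illustrative assumptions, not the paper's exact construction or parameters.

```python
# Illustrative sketch only: Lagrange-coded secret sharing of a dataset over a
# prime field (the style of encoding CodedPrivateML builds on). Field size,
# evaluation points, and names are assumptions, not the paper's parameters.
import numpy as np

FIELD = 2**13 - 1  # small prime for the sketch; a real deployment uses a much larger field

def lagrange_basis(betas, alpha, p=FIELD):
    """Evaluate every Lagrange basis polynomial ell_j at the point alpha over GF(p)."""
    out = []
    for j, bj in enumerate(betas):
        num, den = 1, 1
        for k, bk in enumerate(betas):
            if k != j:
                num = num * (alpha - bk) % p
                den = den * (bj - bk) % p
        out.append(num * pow(den, -1, p) % p)
    return out

def encode_dataset(X_parts, T, alphas, p=FIELD, rng=np.random.default_rng(0)):
    """Encode K dataset partitions plus T uniformly random masks into one coded
    matrix per worker.  Any T colluding workers jointly see matrices that are
    statistically independent of the data (information-theoretic privacy)."""
    blocks = list(X_parts) + [rng.integers(0, p, size=X_parts[0].shape) for _ in range(T)]
    betas = list(range(1, len(blocks) + 1))   # interpolation points for the K+T blocks
    shares = []
    for a in alphas:                          # one distinct evaluation point per worker
        ell = lagrange_basis(betas, a, p)
        shares.append(sum(int(c) * B for c, B in zip(ell, blocks)) % p)
    return shares

# Toy usage: a 6x4 integer dataset split into K=3 partitions, shared to 6 workers,
# private against any T=2 colluding workers.
X = np.arange(24).reshape(6, 4) % FIELD
shares = encode_dataset(np.split(X, 3), T=2, alphas=[101, 102, 103, 104, 105, 106])
```

In the paper's scheme, workers then compute (polynomially approximated) gradient updates directly on such coded matrices, and the master recovers the true result by polynomial interpolation once enough workers respond.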


Citations: Cited by 53 publications (23 citation statements)
References: 35 publications
“…Secure Machine Learning. There has been a flurry of recent results ([61], [75], [57], [25], [44]) in the area of secure machine learning, both in the 2-party [17], [67], [34], [55], [49], [58], [84], as well as in the 3-party setting [72], [60], [79], [12]. The most relevant to our work are ABY3 [60] and SecureNN [79], which both provide 3-party semi-honest secure computation protocols for a variety of neural network inference and training algorithms, with somewhat similar performance guarantees.…”
Section: Related Work
confidence: 99%
“…This paper is one of the first attempts that propose a privacy-preserving machine learning solution using encryption, and it covers various architectural issues and problems such as polynomial approximations or the fixed depth of circuits. In CodedPrivateML [105], the authors train machine learning models (i.e., linear and logistic regression) using Shamir's secret sharing and speed it up with Lagrange coding. This paper proposes a solution based on cloud computing by distributing the workload to train the algorithms.…”
Section: Preceding Approaches
confidence: 99%
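The statement above credits Shamir-style secret sharing, accelerated by Lagrange coding, for CodedPrivateML's privacy. For readers unfamiliar with the underlying primitive, here is a minimal, self-contained sketch of Shamir's (t, n) secret sharing over a prime field; the prime and the function names are illustrative choices rather than anything taken from [105].

```python
# Minimal sketch of Shamir's (t, n) secret sharing over a prime field.
# The prime P and the function names are illustrative assumptions.
import random

P = 2**61 - 1  # prime modulus for the sketch

def share(secret, t, n, p=P):
    """Split `secret` into n shares; any t shares reconstruct it, fewer reveal nothing."""
    coeffs = [secret] + [random.randrange(p) for _ in range(t - 1)]  # degree t-1 polynomial
    return [(x, sum(c * pow(x, i, p) for i, c in enumerate(coeffs)) % p)
            for x in range(1, n + 1)]

def reconstruct(shares, p=P):
    """Lagrange-interpolate the shared polynomial at 0 to recover the secret."""
    secret = 0
    for j, (xj, yj) in enumerate(shares):
        num, den = 1, 1
        for k, (xk, _) in enumerate(shares):
            if k != j:
                num = num * (-xk) % p
                den = den * (xj - xk) % p
        secret = (secret + yj * num * pow(den, -1, p)) % p
    return secret

# Any 3 of the 5 shares suffice to recover the secret.
assert reconstruct(share(1234, t=3, n=5)[:3]) == 1234
```

Lagrange coding generalizes this idea from a single scalar secret to entire dataset partitions, which is what lets CodedPrivateML parallelize training over the coded shares.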
“…It uses data from different places to effectively train different neural network models, which has attracted a lot of attention. Previous studies [47][48][49][50][51][52][53][54][55][56][57] have improved the performance of distributed learning systems from different perspectives. In order to maximize the overall performance of tasks in the cluster, Zhang et al. [47] design a scheduling algorithm to approximate the training performance of deep learning jobs.…”
Section: Distributed Learning
confidence: 99%