Proceedings of the 2017 ACM International Conference on Management of Data
DOI: 10.1145/3035918.3064047
Bolt-on Differential Privacy for Scalable Stochastic Gradient Descent-based Analytics

Abstract: While significant progress has been made separately on analytics systems for scalable stochastic gradient descent (SGD) and private SGD, none of the major scalable analytics frameworks have incorporated differentially private SGD. There are two inter-related issues for this disconnect between research and practice: (1) low model accuracy due to added noise to guarantee privacy, and (2) high development and runtime overhead of the private algorithms. This paper takes a first step to remedy this disconnect and p…
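The "bolt-on" idea named in the title treats privacy as a post-processing step: train with ordinary SGD, then perturb only the final model, so the analytics framework's training loop needs no changes. A minimal sketch of this output-perturbation pattern, with illustrative function names and an assumed sensitivity constant (not the paper's exact calibration):

```python
import numpy as np

def sgd_logreg(X, y, lr=0.1, epochs=5):
    """Plain SGD for logistic regression with small L2 regularization.
    Labels y are in {-1, +1}. Nothing privacy-specific happens here."""
    w = np.zeros(X.shape[1])
    n = len(y)
    for _ in range(epochs):
        for i in np.random.permutation(n):
            margin = y[i] * (X[i] @ w)
            grad = -y[i] * X[i] / (1.0 + np.exp(margin)) + 0.01 * w
            w -= lr * grad
    return w

def bolt_on_private(w, epsilon, sensitivity):
    """Output perturbation: add Laplace noise scaled to an assumed
    L2-sensitivity bound of the trained model, divided by epsilon.
    The training code above is left untouched ("bolt-on")."""
    noise = np.random.laplace(scale=sensitivity / epsilon, size=w.shape)
    return w + noise
```

Because the noise is added once at the end rather than at every step, runtime overhead stays near zero, which speaks directly to issue (2) in the abstract.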

Cited by 161 publications (184 citation statements)
References 27 publications
“…Many differentially private ERM algorithms cannot be applied to L1-regularized classification, such as ObjPert [1], [2], OutPert [40], PVP and DVP [5], PSGD [16], and RSGD [7]. Therefore, we compare our proposed algorithms with these baselines: DP-SGD [4], DP-ADMM [25], ADMM-objP [28], and Non-Private approach.…”
Section: Baselines (mentioning)
confidence: 99%
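Of the baselines the excerpt lists, DP-SGD is the most widely used; unlike the bolt-on approach, it enforces privacy inside the loop by clipping each example's gradient and adding Gaussian noise at every step. A minimal sketch of one such step, with assumed hyperparameter values:

```python
import numpy as np

def dp_sgd_step(w, per_example_grads, lr=0.1, clip=1.0, noise_mult=1.1):
    """One DP-SGD update: clip every per-example gradient to L2 norm
    `clip`, average the clipped gradients, add Gaussian noise whose
    scale follows the clip bound, then take a gradient step."""
    clipped = []
    for g in per_example_grads:
        norm = np.linalg.norm(g)
        clipped.append(g / max(1.0, norm / clip))  # scale down if too long
    avg = np.mean(clipped, axis=0)
    noise = np.random.normal(
        0.0, noise_mult * clip / len(per_example_grads), size=w.shape)
    return w - lr * (avg + noise)
```

The per-step noise is why such in-loop methods carry the accuracy and runtime costs that the abstract identifies as issues (1) and (2).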
“…EASGD helps reduce the communication cost by allowing sites to perform multiple iterations (each iteration is one pass of the local data) before sending the updated factor matrices. We further extend the local optimization updates using permutation-based SGD (P-SGD), a practical form of SGD [29]. In P-SGD, instead of randomly sampling one instance from the tensor at a time, the nonzero elements are first shuffled within the tensor.…”
Section: Local Factors Update (mentioning)
confidence: 99%
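The permutation-based SGD this excerpt describes replaces per-step random sampling with one shuffle per epoch followed by a full sequential pass, so every example is visited exactly once per epoch. A sketch with illustrative names:

```python
import numpy as np

def psgd_epoch(w, examples, grad_fn, lr=0.05, rng=None):
    """One epoch of permutation-based SGD: shuffle the index order
    once, then visit each example exactly once in that order."""
    if rng is None:
        rng = np.random.default_rng()
    order = rng.permutation(len(examples))
    for i in order:
        w = w - lr * grad_fn(w, examples[i])
    return w
```

By contrast, sampling with replacement can repeat some examples and skip others within an epoch; the single-shuffle pass is also what makes each epoch exactly one sweep of the data, as the excerpt notes for EASGD.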
“…Finally, we implement several differentially private machine learning algorithms in Duet which have never before been automatically verified by a language-based tool, and we present experimental results which demonstrate the benefits of Duet's language design in terms of accuracy of trained machine learning models. Differential privacy provides a robust solution to this problem, and as a result, a number of differentially private algorithms have been developed for machine learning [3,13,17,28,42,50,51,54]. Few practical approaches exist, however, for automatically proving that a general-purpose program satisfies differential privacy, an increasingly desirable goal, since many machine learning pipelines are expressed as programs that combine existing algorithms with custom code. Enforcing differential privacy for a new program currently requires a new, manually written privacy proof.…”
mentioning
confidence: 99%