2021
DOI: 10.48550/arxiv.2104.10949
Preprint
CryptGPU: Fast Privacy-Preserving Machine Learning on the GPU

Abstract: We introduce CRYPTGPU, a system for privacy-preserving machine learning that implements all operations on the GPU (graphics processing unit). Just as GPUs played a pivotal role in the success of modern deep learning, they are also essential for realizing scalable privacy-preserving deep learning. In this work, we start by introducing a new interface to losslessly embed cryptographic operations over secret-shared values (in a discrete domain) into floating-point operations that can be processed by highly-optimized…
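The "lossless embedding" idea in the abstract can be sketched as follows: values secret-shared in the discrete ring Z_{2^64} are split into small limbs so that every partial product, computed with float64 arithmetic (the native type of GPU linear-algebra kernels), stays within float64's 53-bit exact-integer range. This is a minimal NumPy sketch of that general technique, not CryptGPU's actual implementation; the function names and the 4x16-bit limb split are illustrative assumptions.

```python
import numpy as np

LIMB_BITS = 16
NUM_LIMBS = 4  # a 64-bit value is split into four 16-bit limbs


def to_limbs(x):
    """uint64 array -> list of float64 arrays, one per 16-bit limb."""
    return [((x >> np.uint64(LIMB_BITS * i)) & np.uint64(0xFFFF)).astype(np.float64)
            for i in range(NUM_LIMBS)]


def matmul_embedded(a, b):
    """Exact uint64 matrix product (mod 2^64) computed via float64 matmuls.

    Each limb-by-limb partial product is at most 2^32 per term; summing over
    the inner dimension n stays below 2^53 for n up to ~2^20, so the float64
    computation loses no precision -- the "lossless embedding" of discrete
    ring arithmetic into floating-point operations.
    """
    a_limbs, b_limbs = to_limbs(a), to_limbs(b)
    out = np.zeros((a.shape[0], b.shape[1]), dtype=np.uint64)
    for i in range(NUM_LIMBS):
        for j in range(NUM_LIMBS):
            if i + j >= NUM_LIMBS:
                continue  # shifts of >= 64 bits vanish mod 2^64
            p = a_limbs[i] @ b_limbs[j]            # exact in float64
            shift = np.uint64(LIMB_BITS * (i + j))
            out += p.astype(np.uint64) << shift    # wraps mod 2^64
    return out
```

Because each limb product is exact, reconstructing the shifted sum modulo 2^64 recovers precisely the integer matrix product a server would compute on its secret shares.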

Cited by 4 publications (5 citation statements); references 40 publications.
“…Therefore, as one of the major components, we assume the existence of an underlying secure outsourced database scheme. Typically, secure outsourced databases can be implemented in different architectural settings, such as models built on server-aided MPC [6,7,47,64,79], homomorphic encryption [22], symmetric searchable encryption [3,9,20,26,35,48,78], or trusted hardware [32,70,81,89]. For ease of demonstration, we focus exclusively on outsourced databases built upon the server-aided MPC model, where a set of data owners secret-share their data across two untrusted but non-colluding servers S0 and S1.…”
Section: Framework Components
confidence: 99%
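The two-server model quoted above rests on additive secret sharing: each data owner splits a value into two random-looking shares, one per server, so that neither S0 nor S1 alone learns anything. A minimal sketch of that standard primitive (hypothetical helper names; shares live in the ring Z_{2^64}):

```python
import secrets

MOD = 2**64  # shares live in the ring Z_{2^64}


def share(x):
    """Split x into two additive shares, one for server S0 and one for S1."""
    r = secrets.randbelow(MOD)      # uniformly random mask
    return r, (x - r) % MOD         # (share for S0, share for S1)


def reconstruct(s0, s1):
    """Recombine the shares; each share alone is uniformly random."""
    return (s0 + s1) % MOD


# Addition is "free": each server adds its shares locally, and the
# reconstructed result equals x + y without either server seeing x or y.
x0, x1 = share(41)
y0, y1 = share(1)
assert reconstruct((x0 + y0) % MOD, (x1 + y1) % MOD) == 42
```

Multiplication of shared values requires interaction (e.g., Beaver triples), which is where the server-aided MPC protocols cited above come in.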
“…There has been a series of efforts in the literature on secure outsourced databases. Existing solutions utilize bucketization [40-42], predicate encryption [59,75], property- and order-preserving encryption [2,9,12,13,66,68,69], symmetric searchable encryption (SSE) [3,11,20,26,35,45,46,48,50,67,78], functional encryption [15,74], oblivious RAM [6,24,28,43,65,89], secure multi-party computation (MPC) [6,7,14,79], trusted execution environments (TEEs) [32,70,81,87], and homomorphic encryption [16,22,34,72]. These designs differ in the types of queries they support and the security guarantees they provide.…”
Section: Related Work
confidence: 99%
“…As a follow-up work to PrivPy, Fan et al. [171] focus on privacy-preserving principal component analysis (PCA) by demonstrating an end-to-end optimization of a data mining algorithm running on a mixed-protocol MPC framework. To further improve efficiency, CRYPTGPU [172] accelerates mixed-protocol MPC computation on the GPU. Specifically, CRYPTGPU introduces a new interface to losslessly embed cryptographic operations over secret-shared values into floating-point operations that highly-optimized CUDA kernels can process for linear algebra.…”
Section: Mixed-protocol Approach
confidence: 99%
“…A series of recent works has made encouraging progress toward improving the system efficiency of privacy-preserving MLaaS (Demmler, Schneider, and Zohner 2015; Liu et al. 2017; Mohassel and Zhang 2017; Juvekar, Vaikuntanathan, and Chandrakasan 2018; Riazi et al. 2018; Rouhani, Riazi, and Koushanfar 2018; Mohassel and Rindal 2018; Riazi et al. 2019; Mishra et al. 2020; Rathee et al. 2020; Boemer et al. 2020; Zhang, Xin, and Wu 2021; Patra et al. 2021; Tan et al. 2021; Hussain et al. 2021; Ng et al. 2021; Huang et al. 2022). Among them, mixed-primitive frameworks that use HE to compute linear functions (e.g., convolution and fully-connected layers) while adopting MPC for nonlinear functions (e.g., ReLU) have demonstrated additional efficiency advantages (Liu et al. 2017; Juvekar, Vaikuntanathan, and Chandrakasan 2018; Rathee et al. 2020; Huang et al. 2022), and it is noteworthy that the inference speed has been improved…”
Section: Introduction
confidence: 99%