Proceedings of the Twenty-Seventh International Joint Conference on Artificial Intelligence 2018
DOI: 10.24963/ijcai.2018/547

GELU-Net: A Globally Encrypted, Locally Unencrypted Deep Neural Network for Privacy-Preserved Learning

Abstract: Privacy is a fundamental challenge for a variety of smart applications that depend on data aggregation and collaborative learning across different entities. In this paper, we propose a novel privacy-preserved architecture where clients can collaboratively train a deep model while preserving the privacy of each client’s data. Our main strategy is to carefully partition a deep neural network between two non-colluding parties. One party performs linear computations on encrypted data utilizing a less complex homomorphic…
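The partition strategy hinges on an additively homomorphic cryptosystem: ciphertexts can be added together, and multiplied by plaintext constants, without ever decrypting — which is exactly what a linear layer needs. The sketch below is a deliberately insecure, textbook Paillier implementation with tiny fixed primes, purely to illustrate the homomorphism; all names and parameters are illustrative and not taken from the paper's implementation.

```python
# Toy Paillier cryptosystem (tiny fixed primes, NOT secure) illustrating the
# additive homomorphism that lets a server compute linear functions on
# encrypted data without seeing the plaintext.
import math
import random

def lcm(a, b):
    return a * b // math.gcd(a, b)

def keygen(p=2357, q=2551):
    # p, q are small fixed primes for illustration only
    n = p * q
    g = n + 1                              # standard simplified generator
    lam = lcm(p - 1, q - 1)
    mu = pow((pow(g, lam, n * n) - 1) // n, -1, n)  # L(g^lam mod n^2)^-1 mod n
    return (n, g), (lam, mu)

def encrypt(pub, m):
    n, g = pub
    r = random.randrange(2, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(2, n)
    return (pow(g, m, n * n) * pow(r, n, n * n)) % (n * n)

def decrypt(pub, priv, c):
    n, _ = pub
    lam, mu = priv
    return ((pow(c, lam, n * n) - 1) // n * mu) % n

def h_add(pub, c1, c2):
    # Multiplying ciphertexts adds the underlying plaintexts:
    # Enc(m1) * Enc(m2) = Enc(m1 + m2)
    n, _ = pub
    return (c1 * c2) % (n * n)

def h_scale(pub, c, k):
    # Raising a ciphertext to a plaintext power scales the plaintext:
    # Enc(m) ** k = Enc(k * m)
    n, _ = pub
    return pow(c, k, n * n)

pub, priv = keygen()
c1, c2 = encrypt(pub, 15), encrypt(pub, 27)
assert decrypt(pub, priv, h_add(pub, c1, c2)) == 42   # 15 + 27
assert decrypt(pub, priv, h_scale(pub, c1, 3)) == 45  # 3 * 15
```

Because ciphertext multiplication adds plaintexts and ciphertext exponentiation scales them, a server holding only Enc(x) can assemble Enc(Wx + b) for integer weights — the "linear computations on encrypted data" the abstract refers to.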


Cited by 60 publications (58 citation statements) · References 12 publications
“…They consider collaborative training, i.e., training performed collaboratively between different participants, or individual training, where training is performed by a single participant, such as a client that wants to take advantage of a cloud to train its own model. In the latter case, some techniques ([8], [52], and [31]) might allow a cloud model to be trained by successive participants. Note that some collaborative learning techniques ([20] and [23]) can also be applied to individual training.…”
Section: Classification
confidence: 99%
“…A more recent solution based on encryption was described by Zhang et al. [52]. A client willing to contribute to the training of the model sends its data, encrypted using the Paillier scheme, to the server, which performs all possible neural network computations except for the non-linear activation functions.…”
Section: B1) Server-Based Individual PP Model Learning
confidence: 99%
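The split this citing paper describes — the server evaluates each layer's affine part on ciphertexts, while the client decrypts and applies the activation in plaintext — can be sketched as the following round-trip. Encryption is mocked by identity functions here so the client/server data flow is visible without the crypto; every function name and the tiny network are illustrative, not from the paper's code.

```python
# Sketch of the partitioned forward pass: linear steps on the "server",
# non-linear activations on the "client". enc/dec are identity placeholders
# standing in for Paillier encryption/decryption.
def enc(v):   # client side: encrypt one value (placeholder)
    return v

def dec(c):   # client side: decrypt one value (placeholder)
    return c

def server_linear(w, b, ct_x):
    # Server side: affine step z = Wx + b. Under Paillier this is a product
    # of ciphertext powers, i.e., computable without decrypting.
    return [sum(wi * xi for wi, xi in zip(row, ct_x)) + bi
            for row, bi in zip(w, b)]

def forward(layers, x):
    ct = [enc(v) for v in x]                    # client encrypts its input
    for w, b in layers:
        ct_z = server_linear(w, b, ct)          # server: encrypted affine step
        a = [max(0.0, dec(c)) for c in ct_z]    # client: decrypt + ReLU
        ct = [enc(v) for v in a]                # re-encrypt for the next layer
    return [dec(c) for c in ct]

layers = [([[1.0, -2.0], [0.5, 1.0]], [0.0, -1.0]),  # 2 inputs -> 2 units
          ([[1.0, 1.0]], [0.0])]                     # 2 inputs -> 1 unit
print(forward(layers, [2.0, 1.0]))  # -> [1.0]
```

The server never holds a decryption key, and the client never evaluates the weights, which is the non-colluding two-party division of labor the abstract describes.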
“…However, FHE is extremely costly in computation and thus unsuitable for large-scale neural networks. To resolve this problem of existing FHE-based approaches, Zhang et al. proposed a new network named GELU-Net, whose intrinsic strategy is to split each neuron into linear and non-linear components and implement them separately on non-colluding parties.…”
Section: Related Work
confidence: 99%
“…For example, Cheon et al. [5] proposed an ensemble method for logistic regression based on HE, which resulted in a substantial improvement in the performance of logistic regression. Zhang et al. [6] proposed GELU-Net, a novel privacy-preserving architecture where clients can collaboratively train a deep neural network model. Their experiments demonstrated stability in training and a speed-up without accuracy loss.…”
Section: Introduction
confidence: 99%