2022
DOI: 10.1109/mci.2022.3180883

Privacy-Preserving Deep Learning With Homomorphic Encryption: An Introduction

Abstract: Privacy-preserving deep learning with homomorphic encryption (HE) is a novel and promising research area aimed at designing deep learning solutions that operate while guaranteeing the privacy of user data. Designing privacy-preserving deep learning solutions requires one to completely rethink and redesign deep learning models and algorithms to match the severe technological and algorithmic constraints of HE. This paper provides an introduction to this complex research area as well as a methodology for designing…

Cited by 30 publications (14 citation statements)
References 22 publications (28 reference statements)
“…In particular, in future we intend to continue to work applying FHE to existing networks, such as transformers, which are SotA in sequence tasks. However, much work remains in mimicking certain functions of transformers in an FHE-compatible manner [12]. We also believe while we could compute privately, we can significantly improve the performance of the predictions themselves with better performing network architectures such as transformers.…”
Section: Discussion and Limitations
confidence: 99%
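The FHE-compatibility problem the citing authors raise is usually handled by polynomial approximation, since leveled HE schemes evaluate only additions and multiplications. The sketch below is illustrative rather than drawn from the cited works: the choice of GELU, the fitting interval, and the polynomial degree are all assumptions. It fits a degree-3 polynomial to the activation over a bounded input range, the kind of substitution that keeps multiplicative depth low.

```python
import numpy as np

# Illustrative sketch: HE schemes such as CKKS evaluate only additions and
# multiplications, so a non-polynomial transformer activation (here the tanh
# approximation of GELU) is replaced by a low-degree polynomial fitted over
# the input range expected after normalisation. All choices below are assumptions.
def gelu(x):
    return 0.5 * x * (1.0 + np.tanh(np.sqrt(2.0 / np.pi) * (x + 0.044715 * x**3)))

# Fit a degree-3 polynomial on [-4, 4]; higher degrees cost more multiplicative depth.
xs = np.linspace(-4.0, 4.0, 2001)
coeffs = np.polyfit(xs, gelu(xs), deg=3)
poly_gelu = np.poly1d(coeffs)

# Report the worst-case approximation error on the fitting interval.
max_err = np.max(np.abs(poly_gelu(xs) - gelu(xs)))
print(f"degree-3 coefficients: {coeffs}")
print(f"max abs error on [-4, 4]: {max_err:.4f}")
```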
“…Here, FHE graph parameterisation means deriving the FHE parameters from a graph, such as the computational depth and thus the parameters like the modulus size. There have been a few works that define FHE graph parameterisation, the most notable and similar of which is Microsoft Encrypted Vector Arithmetic (MS-EVA) [11, 12]. MS-EVA uses Directed Acyclic Graphs (DAGs) to represent simple operations applied to some input constant.…”
Section: Literature and Related Work
confidence: 99%
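To make the idea of graph parameterisation concrete, the sketch below builds a tiny DAG of ciphertext operations, computes its multiplicative depth, and sizes a CKKS-style modulus chain from it. This is a toy reconstruction under stated assumptions: the Node class, the mult_depth helper, and the one-rescaling-prime-per-multiplication sizing rule are illustrative, not the MS-EVA implementation or API.

```python
from dataclasses import dataclass, field

# Hypothetical sketch in the spirit of MS-EVA: represent the computation as a DAG,
# compute its multiplicative depth, and size the modulus chain accordingly.
@dataclass
class Node:
    op: str                              # "input", "add", or "mul"
    inputs: list = field(default_factory=list)

def mult_depth(node, memo=None):
    """Longest chain of ciphertext multiplications feeding this node."""
    memo = {} if memo is None else memo
    if id(node) in memo:
        return memo[id(node)]
    child = max((mult_depth(n, memo) for n in node.inputs), default=0)
    depth = child + (1 if node.op == "mul" else 0)
    memo[id(node)] = depth
    return depth

# Example circuit: (x * x + x) * w  ->  multiplicative depth 2.
x = Node("input")
w = Node("input")
sq = Node("mul", [x, x])
s = Node("add", [sq, x])
out = Node("mul", [s, w])

depth = mult_depth(out)
# Toy sizing rule (an assumption): one rescaling prime per multiplication,
# plus first and last special primes.
scale_bits, special_bits = 40, 60
coeff_modulus_bits = [special_bits] + [scale_bits] * depth + [special_bits]
print(f"multiplicative depth: {depth}")
print(f"assumed coeff_modulus bit sizes: {coeff_modulus_bits}")
```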
“…In order to address privacy leakage concerns in federated learning, researchers have proposed various solutions leveraging privacy protection technologies such as differential privacy [13], homomorphic encryption [14], and secure multi-party computation [15]. Haokun Fang et al. [16] proposed a federated learning scheme based on the Paillier additive homomorphic algorithm.…”
Section: Related Work
confidence: 99%
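As a concrete illustration of additively homomorphic gradient aggregation of the kind this excerpt describes, the sketch below uses the python-paillier (phe) package: clients encrypt local gradients, the server sums the ciphertexts without seeing plaintexts, and only the secret-key holder decrypts the aggregate. The key placement and the example gradient values are assumptions for illustration; this is not the scheme of [16].

```python
from phe import paillier  # python-paillier: an additively homomorphic cryptosystem

# Minimal sketch (assumptions noted in the lead-in): the key holder issues a Paillier
# public key, clients encrypt their local gradients, and the server aggregates
# ciphertexts. Enc(a) + Enc(b) decrypts to a + b under Paillier.
public_key, private_key = paillier.generate_paillier_keypair(n_length=2048)

client_gradients = [
    [0.12, -0.40, 0.05],   # client 1 (toy values)
    [0.08, -0.35, 0.10],   # client 2
    [0.15, -0.50, 0.00],   # client 3
]

# Clients encrypt their updates element-wise.
encrypted_updates = [[public_key.encrypt(g) for g in grad] for grad in client_gradients]

# Server sums ciphertexts without access to the plaintext gradients.
aggregated = encrypted_updates[0]
for update in encrypted_updates[1:]:
    aggregated = [acc + enc for acc, enc in zip(aggregated, update)]

# Decryption and averaging by the key holder (a simplification of real protocols).
averaged = [private_key.decrypt(c) / len(client_gradients) for c in aggregated]
print("averaged gradient:", [round(v, 4) for v in averaged])
```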