2016
DOI: 10.1609/aaai.v30i1.10165

Differential Privacy Preservation for Deep Auto-Encoders: an Application of Human Behavior Prediction

Abstract: In recent years, deep learning has spread across both academia and industry, with many exciting real-world applications. This development has raised obvious privacy issues, yet there has been a lack of scientific study of privacy preservation in deep learning. In this paper, we concentrate on the auto-encoder, a fundamental component of deep learning, and propose the deep private auto-encoder (dPA). Our main idea is to enforce ε-differential privacy by perturbing the objective functions…
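The abstract's central idea, enforcing ε-differential privacy by perturbing the objective function rather than the outputs, is in the style of the functional mechanism. The sketch below is an illustration of that general technique for private linear regression, not the paper's dPA implementation; the function name and the sensitivity bound `delta` are assumptions chosen for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def functional_mechanism_linreg(X, y, epsilon):
    """Illustrative objective perturbation (functional-mechanism style):
    the quadratic loss sum_i (y_i - x_i @ w)**2 expands into polynomial
    coefficients; Laplace noise of scale delta/epsilon is added to each
    coefficient, then the perturbed objective is minimized."""
    n, d = X.shape
    # Polynomial coefficients of the loss: w' A w - 2 b' w + const
    A = X.T @ X            # degree-2 coefficients
    b = X.T @ y            # degree-1 coefficients
    # Assumed sensitivity bound for rows with ||x|| <= 1, |y| <= 1
    # (loose, for illustration only).
    delta = 2.0 * (d ** 2 + 2 * d)
    A_noisy = A + rng.laplace(scale=delta / epsilon, size=(d, d))
    b_noisy = b + rng.laplace(scale=delta / epsilon, size=d)
    # Symmetrize and add a small ridge; a full implementation would also
    # project onto positive-definite matrices to guarantee convexity.
    A_noisy = (A_noisy + A_noisy.T) / 2 + 1e-3 * np.eye(d)
    return np.linalg.solve(A_noisy, b_noisy)
```

Because the noise depends only on ε and the data-independent sensitivity bound, the released weights satisfy ε-DP under those boundedness assumptions; at small ε the perturbed coefficients dominate and accuracy degrades, which is the usual privacy/utility trade-off.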

Cited by 107 publications (22 citation statements) · References 29 publications
“…Finally, we used these 2000-size DP representations of PD 1 patients to build our final dpClassM model. For the TCGA dataset, Figures 3A, B show the comparison (mean accuracy and AUC of 10-fold cross-validation) of dpClassM against the baselines (25, 31, 32) under the same privacy budget (ε = 1.0). In Figures 3A, B, the x-axis represents the ER status (ER+/-) and the eight pairs of cancer types (case numbers in Supplementary Table S1) that we chose for our experiments.…”
Section: Differential Private Classifiers (mentioning)
confidence: 99%
“…They used FM (24) to perturb the objective function's coefficients to build a DP-DNN. However, FM (24) satisfies ε-DP, which may limit the performance of the Phan et al. (32) framework in many real-life applications (29).…”
Section: Introduction (mentioning)
confidence: 99%
“…Using differential privacy technology can effectively protect users' privacy when they provide information to ISPs or service providers (SPs). The effect of differential privacy can also be enhanced with DL technologies [152], [153]. In [154], the authors proposed a new learning method and a refined analysis of privacy costs within the framework of differential privacy.…”
Section: Privacy Preservation in Slice Resource Management (mentioning)
confidence: 99%
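The approach cited as [154] above, learning with a refined privacy-cost analysis, is in the style of noisy gradient descent with per-example clipping. A minimal sketch of one such update step follows; the function name, constants, and noise calibration are illustrative assumptions, not the cited paper's implementation, and a real training loop would also track the cumulative privacy cost across steps.

```python
import numpy as np

rng = np.random.default_rng(1)

def noisy_clipped_step(w, per_example_grads, lr=0.1, clip=1.0,
                       noise_multiplier=1.1):
    """One noisy-SGD step: clip each per-example gradient to L2 norm
    `clip`, sum the clipped gradients, add Gaussian noise scaled to the
    clipping bound, and apply the averaged update."""
    clipped = []
    for g in per_example_grads:
        norm = np.linalg.norm(g)
        clipped.append(g / max(1.0, norm / clip))  # L2 clipping
    total = np.sum(clipped, axis=0)
    noise = rng.normal(scale=noise_multiplier * clip, size=w.shape)
    return w - lr * (total + noise) / len(per_example_grads)
```

Clipping bounds each example's influence on the update, so the Gaussian noise can be calibrated to a fixed sensitivity regardless of the raw gradient magnitudes.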