2020
DOI: 10.1016/j.neunet.2020.02.001
Preserving differential privacy in deep neural networks with relevance-based adaptive noise imposition

Cited by 47 publications (25 citation statements)
References 44 publications
“…One of the most common privacy attacks is an inference attack where the adversary maliciously infers sensitive features and background information about a target individual from a trained model [60]. Typically, there are two types of inference attacks in deep learning.…”
Section: Privacy Attacks in the Deep Neural Network
confidence: 99%
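Membership inference (guessing whether a specific record was in the training set) is one commonly cited type of such attack. As a minimal illustration of the idea behind it, the Python sketch below implements a simple confidence-thresholding heuristic: an overfit model tends to be unusually confident on its training records, so high top-class confidence is taken as a membership signal. The function name, threshold, and example confidences are illustrative assumptions, not taken from the cited papers.

```python
import numpy as np

def threshold_membership_attack(top_class_confidence, threshold=0.9):
    """Guess that a record was a training-set member when the model's
    top-class confidence on it meets a threshold; overfit models are
    typically more confident on members than on non-members."""
    return top_class_confidence >= threshold

# Hypothetical top-class confidences returned by a queried model.
conf = np.array([0.99, 0.55, 0.93, 0.40])
print(threshold_membership_attack(conf))  # [ True False  True False]
```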
“…Recently, there have been further works on differentially private deep learning [14,16,20,21,29,30,37,41,43,44]. Most of them are variants of the private SGD algorithm based on gradient perturbation.…”
Section: Related Work
confidence: 99%
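For context, the gradient-perturbation recipe these variants share is: clip each per-example gradient to bound sensitivity, average, and add Gaussian noise calibrated to the clipping norm. The NumPy sketch below is a minimal illustration in the style of DP-SGD; the parameter names and values (clip_norm, noise_multiplier) are illustrative assumptions, not the exact algorithm of any one cited paper.

```python
import numpy as np

def dp_sgd_step(params, per_example_grads, lr=0.1,
                clip_norm=1.0, noise_multiplier=1.1, rng=None):
    """One gradient-perturbation update in the style of private SGD:
    clip every per-example gradient to norm <= clip_norm (bounding
    sensitivity), average, then perturb with calibrated Gaussian noise."""
    rng = rng or np.random.default_rng(0)
    clipped = [g * min(1.0, clip_norm / (np.linalg.norm(g) + 1e-12))
               for g in per_example_grads]
    grad = np.mean(clipped, axis=0)
    # Noise std on the *averaged* gradient: sigma * C / batch_size.
    sigma = noise_multiplier * clip_norm / len(clipped)
    return params - lr * (grad + rng.normal(0.0, sigma, size=grad.shape))
```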
“…Following the work of Reference [41], Xu et al. [43] proposed an adaptive, fast-convergent differentially private algorithm that adjusts the learning rate adaptively and adds sensitivity-dependent noise to obtain more appropriate privacy protection. Other work, such as References [14,30], examined the relevance of different features to the model output and presented adaptive noise-addition mechanisms based on that relevance analysis. In addition, some researchers have worked on privacy loss accounting.…”
Section: Related Work
confidence: 99%
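The relevance-based mechanisms described above share one idea: spend more of the privacy budget on (i.e., add less noise to) the features most relevant to the model output. The sketch below splits a total Laplace budget across features in proportion to per-feature relevance scores; the proportional splitting rule, the names, and the unit sensitivity bound are illustrative assumptions, not the exact mechanism of [14] or [30].

```python
import numpy as np

def relevance_adaptive_laplace(features, relevance, epsilon=1.0,
                               sensitivity=1.0, rng=None):
    """Adaptive Laplace noise: allocate the total budget epsilon across
    features in proportion to |relevance|, so highly relevant features
    receive a larger budget share and hence *less* noise (Laplace scale
    b_i = sensitivity / epsilon_i)."""
    rng = rng or np.random.default_rng(0)
    rel = np.abs(np.asarray(relevance, dtype=float)) + 1e-6  # avoid zero budgets
    eps_i = epsilon * rel / rel.sum()
    return np.asarray(features, dtype=float) + rng.laplace(0.0, sensitivity / eps_i)
```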
“…The second facet of the research is to assess the sensitivity level of the information that is disseminated from the database in response to the analysts' queries. The issue of utility-based privacy-preserving data mining was reviewed in [11]. In [12], a technique for the suppression-based anonymization of data was presented.…”
Section: Related Work
confidence: 99%
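Suppression is the simplest of the anonymization operations of the kind referenced above: quasi-identifier values are blanked outright rather than generalized, removing them as linkage handles. The toy Python sketch below illustrates this under assumed inputs; the field names and records are hypothetical and not drawn from [12].

```python
def suppress(records, quasi_identifiers):
    """Replace the values of the given quasi-identifier fields with '*',
    keeping the rest of each record intact."""
    return [{k: ('*' if k in quasi_identifiers else v) for k, v in r.items()}
            for r in records]

rows = [{"name": "Alice", "zip": "10115", "diagnosis": "flu"}]
print(suppress(rows, {"name", "zip"}))
# -> [{'name': '*', 'zip': '*', 'diagnosis': 'flu'}]
```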