2022
DOI: 10.1002/int.22944

An improved stochastic gradient descent algorithm based on Rényi differential privacy

Abstract: Deep learning techniques based on neural networks have made significant achievements in many fields of artificial intelligence. However, model training requires large‐scale data sets; these data sets are often crowd‐sourced, and the trained model parameters can encode private information, creating a risk of privacy leakage. With the trend toward sharing pretrained models, the risk of recovering training data through membership inference attacks and model inversion attacks is further heightened. To ta…
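The paper's title points to a differentially private variant of stochastic gradient descent whose privacy cost is tracked with Rényi differential privacy. As a rough orientation only, the sketch below shows the general DP-SGD recipe (per-example gradient clipping plus Gaussian noise); it does not reproduce the paper's specific improvements, and all names and constants are illustrative assumptions.

```python
# Minimal sketch of one DP-SGD step, assuming per-example gradients are available.
# The noise multiplier and clipping norm are illustrative; Renyi differential
# privacy is what would be used to account for the cumulative privacy cost.
import numpy as np

def dp_sgd_step(params, per_example_grads, lr=0.1, clip_norm=1.0, noise_mult=1.1):
    """One privatized gradient step over a batch of per-example gradients."""
    clipped = []
    for g in per_example_grads:
        norm = np.linalg.norm(g)
        clipped.append(g * min(1.0, clip_norm / (norm + 1e-12)))  # bound each L2 norm by clip_norm
    mean_grad = np.mean(clipped, axis=0)
    # Gaussian noise scaled to the clipping bound; its magnitude determines the privacy cost.
    noise = np.random.normal(0.0, noise_mult * clip_norm / len(per_example_grads),
                             size=mean_grad.shape)
    return params - lr * (mean_grad + noise)

# Example usage with random arrays standing in for real per-example gradients.
params = np.zeros(4)
grads = np.random.randn(32, 4)
params = dp_sgd_step(params, grads)
```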

Cited by 1 publication (1 citation statement)
References 22 publications
“…To further improve the protection level of personal data privacy, differential privacy has become a commonly used privacy protection technology. Differential privacy protects sensitive personal information by adding noise to the original data to alter the dataset [5,6]. In private data, data imbalance refers to a situation where the sample size varies greatly between different categories of a dataset.…”
Section: Introduction
confidence: 99%
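The citing passage describes differential privacy in its classic form: noise calibrated to a query's sensitivity is added so that any single record's contribution is hidden. A minimal sketch of that idea, using the Laplace mechanism with illustrative function and parameter names (not taken from either paper):

```python
# Minimal sketch of the Laplace mechanism: add noise scaled to sensitivity/epsilon
# so the released value is differentially private. Names are illustrative.
import numpy as np

def laplace_mechanism(true_value, sensitivity, epsilon):
    """Return a differentially private answer to a numeric query."""
    scale = sensitivity / epsilon  # larger scale => more noise => stronger privacy
    return true_value + np.random.laplace(0.0, scale)

# Example: privately release a count over a dataset (the sensitivity of a count is 1).
ages = np.array([23, 35, 41, 29, 52])
private_count = laplace_mechanism(len(ages), sensitivity=1.0, epsilon=0.5)
```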