2021
DOI: 10.48550/arxiv.2102.05855
Preprint

Differential Privacy Dynamics of Langevin Diffusion and Noisy Gradient Descent

Abstract: We model the dynamics of privacy loss in Langevin diffusion and extend it to the noisy gradient descent algorithm: we compute a tight bound on Rényi differential privacy and the rate of its change throughout the learning process. We prove that the privacy loss converges exponentially fast. This significantly improves the prior privacy analysis of differentially private (stochastic) gradient descent algorithms, where (Rényi) privacy loss constantly increases over the training iterations. Unlike composition-base…
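The noisy gradient descent update analyzed in the abstract can be sketched as a discretized Langevin step: a plain gradient step plus Gaussian noise. The following is a minimal illustrative sketch, not the paper's exact algorithm; the step size `eta`, noise scale `sigma`, and the quadratic example loss are hypothetical choices, and the paper's privacy calibration of the noise is not reproduced here.

```python
import numpy as np

def noisy_gradient_descent(grad, theta0, eta, sigma, steps, rng):
    """Langevin-style noisy GD sketch:
        theta_{t+1} = theta_t - eta * grad(theta_t) + sqrt(2*eta) * sigma * N(0, I)
    The sqrt(2*eta) scaling matches the standard discretization of Langevin
    diffusion; parameter values here are illustrative only."""
    theta = np.array(theta0, dtype=float)
    for _ in range(steps):
        noise = rng.normal(size=theta.shape)
        theta = theta - eta * grad(theta) + np.sqrt(2 * eta) * sigma * noise
    return theta

# Hypothetical example: minimize f(theta) = ||theta - c||^2 / 2,
# whose gradient is theta - c, so iterates hover around c.
c = np.array([1.0, -2.0])
rng = np.random.default_rng(0)
theta_hat = noisy_gradient_descent(lambda th: th - c, np.zeros(2),
                                   eta=0.1, sigma=0.05, steps=500, rng=rng)
print(theta_hat)  # near c, up to the injected noise
```

The paper's result concerns how the Rényi privacy loss of iterates like `theta_hat` evolves with `steps`: rather than growing without bound as in composition-based analyses, it converges.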

Cited by 0 publications. References 19 publications (51 reference statements).