2022 | DOI: 10.48550/arxiv.2202.05089 | Preprint

Backpropagation Clipping for Deep Learning with Differential Privacy

Cited by 1 publication (2 citation statements) | References 0 publications
“…Tempered sigmoids activations (TSA) control the gradient norm to avoid introducing too much noise. Stevens et al. [11] propose the backpropagation clipping (BC) algorithm to replace the gradient clipping operation; it limits the sensitivity of the gradient by clipping each trainable layer's inputs and its upstream gradients. To alleviate clipping bias, Liu et al. [12] propose a differentially private learning with grouped gradient clipping (DPL-GGC) method.…”
Section: Related Work
confidence: 99%
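The excerpt above describes backpropagation clipping as bounding each trainable layer's input activations and upstream gradients rather than clipping whole per-example gradients. Below is a minimal PyTorch sketch of that idea; the bounds C_IN and C_GRAD, the clip_rows helper, and the ClippedLinear layer are illustrative names and choices, not Stevens et al.'s exact implementation.

```python
# Minimal sketch of backpropagation-style clipping; hyperparameters
# below are hypothetical, not the values from Stevens et al. [11].
import torch
import torch.nn as nn

C_IN, C_GRAD = 1.0, 1.0  # assumed per-layer clipping bounds

def clip_rows(x, bound):
    # Rescale each per-example row so its L2 norm is at most `bound`.
    norms = x.norm(dim=-1, keepdim=True).clamp(min=1e-12)
    return x * (bound / norms).clamp(max=1.0)

class ClippedLinear(nn.Linear):
    def forward(self, x):
        x = clip_rows(x, C_IN)  # clip the layer's inputs (forward pass)
        out = super().forward(x)
        if out.requires_grad:
            # Clip the upstream gradients flowing into this layer (backward pass).
            out.register_hook(lambda g: clip_rows(g, C_GRAD))
        return out
```

Because each per-example weight gradient of a linear layer is the outer product of the upstream gradient and the input, clipping both factors bounds its norm by C_IN * C_GRAD, which is the per-layer sensitivity bound the excerpt refers to.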
“…The cause lies in the fact that a larger privacy budget means a weaker privacy guarantee, so less noise is injected into the model.

Method          Accuracy (%)   ε (privacy budget)
[10]            66.20          7.53
BC [11]         74.00          3.64
DPL-GGC [12]    67.11          3.19
ADPSGD [15]     69.63          6.40
DPNASNet [16]   68.33          3.00
AFRRS           76.00          2.00
…”
Section: Table
confidence: 99%
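To make the inverse relation between the privacy budget ε and the injected noise concrete, here is a sketch using the classic Gaussian-mechanism calibration; it is not taken from the cited papers, the δ and sensitivity values are placeholders, and the closed-form bound is only strictly valid for ε < 1, so it serves purely to show the monotone trend.

```python
# Classic Gaussian-mechanism noise scale: larger epsilon (more privacy
# budget) means a weaker guarantee and less injected noise. delta and
# sensitivity are placeholder values, not from the cited papers.
import math

def gaussian_sigma(epsilon, delta=1e-5, sensitivity=1.0):
    # sigma = sensitivity * sqrt(2 * ln(1.25 / delta)) / epsilon
    return sensitivity * math.sqrt(2 * math.log(1.25 / delta)) / epsilon

for eps in (2.00, 3.64, 7.53):  # epsilon values appearing in the table above
    print(f"epsilon = {eps:5.2f}  ->  noise scale sigma = {gaussian_sigma(eps):.3f}")
```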