2022
DOI: 10.48550/arxiv.2202.03335
Preprint

Membership Inference Attacks and Defenses in Neural Network Pruning

Abstract: Neural network pruning has been an essential technique for reducing the computation and memory requirements of deep neural networks on resource-constrained devices. Most existing research focuses primarily on balancing the sparsity and accuracy of a pruned neural network by strategically removing insignificant parameters and retraining the pruned model. Such reuse of training samples poses serious privacy risks due to increased memorization, which, however, has not been investigated yet. In this p…
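The prune-then-retrain pipeline the abstract describes is commonly instantiated as magnitude pruning: parameters with the smallest absolute values are zeroed out, then the remaining weights are fine-tuned on the training data. A minimal sketch of the pruning step (the paper's exact pruning criterion is not shown here; this is one standard choice):

```python
import numpy as np

def magnitude_prune(weights, sparsity):
    """Zero out the smallest-magnitude fraction `sparsity` of weights."""
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)
    if k == 0:
        return weights.copy()
    # k-th smallest magnitude becomes the pruning threshold
    threshold = np.partition(flat, k - 1)[k - 1]
    mask = np.abs(weights) > threshold
    return weights * mask

# Toy weight matrix: half the entries are "insignificant".
w = np.array([[0.9, -0.05],
              [0.02, -1.2]])
pruned = magnitude_prune(w, 0.5)  # keeps 0.9 and -1.2, zeros the rest
```

It is precisely the retraining of the surviving weights on the same samples that the abstract flags as a memorization (and hence membership inference) risk.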

Cited by 2 publications (1 citation statement) | References 29 publications
“…In this paper, we are interested in understanding both empirically and theoretically how overparameterization affects MI in classification. [25,34] show that pruning a network can improve MI robustness. [27] show empirically that MI tends to be easier on more challenging learning tasks.…”
Section: Related Work
confidence: 99%