ICASSP 2019 - 2019 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
DOI: 10.1109/icassp.2019.8682829

Differentially Private Compressive K-means

Abstract: This work addresses the problem of learning from large collections of data with privacy guarantees. The sketched learning framework proposes to deal with the large scale of datasets by compressing them into a single vector of generalized random moments, from which the learning task is then performed. We modify the standard sketching mechanism to provide differential privacy, using addition of Laplace noise combined with a subsampling mechanism (each moment is computed from a subset of the dataset). The data ca…
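To make the mechanism described in the abstract concrete, here is a minimal, hedged Python sketch of the idea: generalized random moments (here, random Fourier features) averaged over a data subsample, with Laplace noise added before release. The frequency distribution, the per-moment subsampling scheme, and the sensitivity bound below are simplifying assumptions for illustration, not the paper's exact construction.

```python
import numpy as np

def private_sketch(X, n_freqs=64, epsilon=1.0, subsample=0.5, seed=0):
    """Illustrative differentially private sketch: average random Fourier
    moments of a data subsample, then add Laplace noise calibrated to a
    simplified sensitivity bound. Not the paper's exact mechanism."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    # Random frequencies defining the generalized moments (random Fourier features).
    Omega = rng.normal(size=(d, n_freqs))
    # Each moment is computed from a random subset of the dataset (subsampling).
    mask = rng.random((n, n_freqs)) < subsample
    Z = np.exp(1j * X @ Omega)                    # per-sample complex moments
    counts = np.maximum(mask.sum(axis=0), 1)
    sketch = (Z * mask).sum(axis=0) / counts      # subsampled empirical average
    # Since |exp(i x.w)| = 1, changing one sample moves an averaged moment by at
    # most 2/count in modulus; we use this as a simplified sensitivity proxy.
    scale = 2.0 / (counts * epsilon)
    noise = rng.laplace(scale=scale, size=n_freqs) \
        + 1j * rng.laplace(scale=scale, size=n_freqs)
    return sketch + noise
```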

Citations: cited by 14 publications (8 citation statements)
References: 24 publications (28 reference statements)
“…An interesting perspective is to characterize such schemes in terms of tradeoffs between differential privacy and the ability to learn from the resulting sketch. Preliminary results in this direction have recently been achieved by Schellekens et al. [2019].…”
Section: Discussion (mentioning)
confidence: 98%
“…The NSR was indeed shown empirically [45] to be a good proxy to estimate the utility of a sketching mechanism for the task of clustering, where performance is measured with the SSE (sum of squared errors) defined in (2). Figures 4 and 5 give an overview of this correlation.…”
Section: Noise to Signal Ratio as a Proxy for Utility (mentioning)
confidence: 99%
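For readers unfamiliar with the quantities in the statement above, the following sketch assumes the standard k-means definitions: SSE as the sum of squared distances from each point to its nearest centroid, and NSR as the ratio of noise energy to signal energy in the released sketch. The function names and the exact NSR definition are illustrative assumptions, not taken from the cited paper.

```python
import numpy as np

def sse(X, centroids):
    """Standard k-means sum of squared errors: each point contributes its
    squared distance to the nearest centroid (assumed to match the SSE
    referenced in the statement above)."""
    d2 = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=-1)
    return d2.min(axis=1).sum()

def noise_to_signal_ratio(clean_sketch, noisy_sketch):
    """Generic NSR of a privatized sketch: energy of the added noise relative
    to the energy of the clean sketch (an illustrative proxy for utility)."""
    noise = noisy_sketch - clean_sketch
    return np.linalg.norm(noise) ** 2 / np.linalg.norm(clean_sketch) ** 2
```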
“…A first, reduced version of this work, with privacy upper bounds and without the subsampling mechanism, was previously published [45].…”
(mentioning)
confidence: 99%
“…Another common strategy for private clustering is based on differential privacy (DP) [6,13,18,22]. DP consists of adding “statistical noise” that is significant enough to protect clients' privacy, but small enough not to affect model performance.…”
Section: Privacy Preserving Distributed K-means (mentioning)
confidence: 99%
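As an illustration of the “statistical noise” strategy mentioned in the statement above, the snippet below shows the textbook Laplace mechanism applied to per-cluster counts; the released statistic, sensitivity value, and epsilon are placeholder assumptions, not part of any specific cited method.

```python
import numpy as np

def laplace_mechanism(value, sensitivity, epsilon, rng=None):
    """Release a statistic with epsilon-DP by adding Laplace noise of scale
    sensitivity / epsilon (textbook mechanism; 'value' and 'sensitivity' are
    placeholders for whatever statistic a clustering algorithm releases)."""
    rng = rng or np.random.default_rng()
    return value + rng.laplace(scale=sensitivity / epsilon, size=np.shape(value))

# Example: privatize per-cluster counts (sensitivity 1: one client changes one count).
counts = np.array([120., 75., 33.])
private_counts = laplace_mechanism(counts, sensitivity=1.0, epsilon=0.5)
```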