2016 Annual Conference on Information Science and Systems (CISS)
DOI: 10.1109/ciss.2016.7460488
Privacy-preserving source separation for distributed data using independent component analysis

Abstract: Building good feature representations and learning hidden source models typically requires large sample sizes. In many applications, however, the size of the sample at an individual data holder may not be sufficient. One such application is neuroimaging analysis for mental health disorders: there are many individual research groups, each with a moderate number of subjects. Pooling such data can enable efficient feature learning, but privacy concerns prevent sharing the underlying data. We propose a model for p…

Cited by 8 publications (13 citation statements) · References 17 publications
“…This paper proposes new privacy-preserving algorithms for distributed PCA and OTD and builds upon our earlier work on distributed differentially private eigenvector calculations [17] and centralized differentially private OTD [21]. It improves on our preliminary works on distributed private PCA [17,22] in terms of efficiency and fault-tolerance. Wang and Anandkumar [23] recently proposed an algorithm for differentially private tensor decomposition using a noisy version of the tensor power iteration [3,8].…”
Section: Introduction
confidence: 89%
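The noisy power iteration mentioned in this citation can be illustrated on a symmetric matrix, the order-2 analogue of an orthogonally decomposable tensor. This is a minimal toy sketch, not the cited algorithm: the noise scale `sigma` is an arbitrary stand-in for a scale calibrated to a differential-privacy budget, and the data are random.

```python
import numpy as np

rng = np.random.default_rng(1)
d = 6
M = rng.normal(size=(d, d))
A = M @ M.T  # symmetric PSD matrix standing in for the decomposable tensor

# Start from a random unit vector.
v = rng.normal(size=d)
v /= np.linalg.norm(v)

sigma = 0.01  # toy noise scale; a DP variant would calibrate this to (eps, delta)
for _ in range(50):
    v = A @ v + rng.normal(scale=sigma, size=d)  # noisy power step
    v /= np.linalg.norm(v)                       # renormalize

# Compare against the exact top eigenvector.
top = np.linalg.eigh(A)[1][:, -1]
alignment = abs(v @ top)  # close to 1 when the noise is small vs. the eigengap
```

With small noise relative to the eigengap, the iterate stays closely aligned with the top eigenvector; larger (more private) noise degrades this alignment, which is the utility/privacy trade-off the cited works analyze.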
“…Experimentally, we compare our proposed algorithm with the existing state-of-the-art algorithm and a non-private algorithm. We show that the proposed algorithm outperforms the conventional privacy-preserving algorithm [1] and can provide utility very close to that of the non-private algorithm [11] for some parameter choices. We analyze the variation of utility with different privacy levels, number of samples and some other key parameters.…”
[Figure 1 in the citing work: the structure of the network — left: conventional; right: CAPE]
Section: Introduction
confidence: 90%
“…In this paper, we propose a new algorithm, capeDJICA, for (ε, δ)-DP decentralized joint ICA. The algorithm significantly improves upon our earlier work [1] by taking advantage of a recently proposed correlation assisted private estimation (CAPE) protocol [19]. Our method adds correlated noise to the output of each site to guarantee privacy locally and a central aggregator combines these noisy outputs to produce an improved estimate.…”
Section: Introduction
confidence: 92%
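The correlated-noise idea described in this citation can be sketched numerically: each site adds a share of zero-sum correlated noise plus a small independent term, so the correlated part cancels exactly when the aggregator averages. This is a toy illustration under assumed parameters (site count `S`, dimension `d`, scale `sigma` are arbitrary), not the authors' CAPE protocol, which generates the correlated noise without a trusted party.

```python
import numpy as np

rng = np.random.default_rng(0)
S, d = 4, 5                               # number of sites, output dimension
outputs = rng.normal(size=(S, d))         # each site's local estimate (toy data)

sigma = 1.0
# Correlated noise: i.i.d. Gaussians recentred so they sum to zero across
# sites; each share looks like noise locally but cancels at the aggregator.
e = rng.normal(scale=sigma, size=(S, d))
e -= e.mean(axis=0)
# Small independent noise each site keeps for its local guarantee.
g = rng.normal(scale=sigma / S, size=(S, d))

noisy = outputs + e + g                   # what each site releases
estimate = noisy.mean(axis=0)             # aggregator's combined estimate

# Only the small independent noise survives the averaging.
residual = estimate - outputs.mean(axis=0)
```

The residual equals the mean of the small `g` terms alone, which is why the aggregate estimate is far more accurate than any single site's release while each site's output still carries full-scale noise.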
“…Additional extensions to the analysis provided here include reducing the bandwidth of the method and designing privacy-preserving variants that guarantee differential privacy, which we have previously investigated for simulated cases [51]. In such cases, reducing the iteration complexity will help guarantee more privacy and hence incentivize larger research collaborations.…”
Section: Conclusion and Future Work
confidence: 99%