2014
DOI: 10.1186/1755-8794-7-s1-s14
Differentially private distributed logistic regression using private and public data

Abstract: Background: Privacy protection is an important issue in medical informatics, and differential privacy is a state-of-the-art framework for data privacy research. Differential privacy offers provable privacy against attackers who have auxiliary information, and can be applied to data-mining models (for example, logistic regression). However, differentially private methods sometimes introduce too much noise and make outputs less useful. Given available public data in medical research (e.g. from patients who sign ope…
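The abstract describes applying differential privacy to logistic regression. As a general illustration of that idea (not necessarily the paper's specific method), the sketch below implements output perturbation in the style of Chaudhuri and Monteleoni: train an L2-regularized logistic regression, then add random noise whose magnitude scales as 2/(nλε). The function names, hyperparameters, and training loop are illustrative assumptions.

```python
import numpy as np

def train_logreg(X, y, lam=0.1, lr=0.5, iters=500):
    """L2-regularized logistic regression by gradient descent.
    Assumes labels y in {-1, +1} and feature rows scaled to norm <= 1."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(iters):
        margins = y * (X @ w)
        # Gradient of (1/n) * sum log(1 + exp(-margin)) + (lam/2) * ||w||^2
        grad = -(X.T @ (y / (1.0 + np.exp(margins)))) / n + lam * w
        w -= lr * grad
    return w

def dp_logreg_output_perturbation(X, y, epsilon, lam=0.1, seed=None):
    """Output perturbation: train normally, then add noise whose L2 norm
    is drawn from Gamma(d, 2/(n*lam*epsilon)) in a uniform random direction."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = train_logreg(X, y, lam=lam)
    scale = 2.0 / (n * lam * epsilon)
    norm = rng.gamma(shape=d, scale=scale)
    direction = rng.normal(size=d)
    direction /= np.linalg.norm(direction)
    return w + norm * direction
```

Larger n or larger ε shrinks the noise scale 2/(nλε), so the private weights stay closer to the non-private ones; this is the "too much noise" trade-off the abstract mentions.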

Cited by 67 publications (33 citation statements). References 21 publications.
“…In section 3.1, the supervised learning module and the feature extraction module of the proposed framework were evaluated using the subspace (ɑ3, ɑ4, ɑ11) of the NSL-KDD data set with binary classes (9,10) only. In a new experiment, the same subspace is again considered; however, the other classes (0,1), (0,5), (1,2), (1,9), (3,5), (3,9), and (6,8) are also studied. In addition, three other subspaces, (ɑ3, ɑ4, ɑ5), (ɑ3, ɑ4, ɑ7), and (ɑ4, ɑ7, ɑ10), are also included in the experiment to study the performance of the proposed analytical framework with DPLR.…”
Section: Multiple Subspace Analysis (citation type: mentioning)
confidence: 99%
“…[6]. Since its introduction, a significant number of studies have been conducted using this model to achieve both privacy guarantees and prediction/classification accuracy [8,9,12,16]. This is a parametric approach, and the selection of its privacy parameter ϵ is a challenging problem.…”
Section: Introduction (citation type: mentioning)
confidence: 99%
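The cited statement notes that choosing the privacy parameter ϵ is challenging. A minimal sketch of why ϵ matters: in the standard Laplace mechanism, the noise scale is sensitivity/ϵ, so smaller ϵ (stronger privacy) means proportionally larger noise. The function name and query below are illustrative, not taken from the cited work.

```python
import numpy as np

def laplace_mechanism(true_value, sensitivity, epsilon, seed=None):
    """Release true_value plus Laplace noise of scale sensitivity / epsilon.
    Smaller epsilon gives stronger privacy and a larger noise scale."""
    rng = np.random.default_rng(seed)
    return true_value + rng.laplace(loc=0.0, scale=sensitivity / epsilon)
```

For a counting query (sensitivity 1), ϵ = 0.1 gives a noise standard deviation of about 14.1, while ϵ = 10 gives about 0.14 — a 100x difference in utility from the choice of ϵ alone.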
“…This assumption allows modeling uncertainty in predictions. The loss function L(ŷ, y) measures the difference between the prediction ŷ of a hypothesis and the expected or true outcome y [13]. The risk associated with the hypothesis h(x) is the expectation of the loss function:…”
Section: Empirical Risk Minimization (citation type: mentioning)
confidence: 99%
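The quoted passage defines risk as the expectation of a loss function; in practice this expectation is approximated by the empirical risk, the average loss over the sample. A small sketch, assuming the logistic loss and labels in {-1, +1} (illustrative choices, not necessarily those of the cited paper):

```python
import numpy as np

def logistic_loss(y_hat, y):
    """Loss L(y_hat, y) for a real-valued score y_hat and a label y in {-1, +1}."""
    return np.log1p(np.exp(-y * y_hat))

def empirical_risk(w, X, y):
    """Empirical risk: the sample average of the loss of h(x) = w . x,
    an estimate of the expected loss under the data distribution."""
    return float(np.mean(logistic_loss(X @ w, y)))
```

At w = 0 every score is 0 and the logistic loss equals log 2 for either label, which is a handy sanity check.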
“…privacy concerns [18]. Recently, privacy-preserving analysis on distributed datasets has been studied in [19,20,21,22,23,24]. Some methods ensure differential privacy only on private data, while several methods handle both public and private data.…”
Section: Introduction (citation type: mentioning)
confidence: 99%
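The last statement distinguishes methods that privatize only the private data from methods that also exploit public data. One hypothetical way to combine the two sources, purely as an illustration (not the paper's algorithm), is a sample-size-weighted average of a non-private estimate from the public data and a differentially private estimate from the private data:

```python
import numpy as np

def combine_public_private(w_public, w_private_dp, n_public, n_private):
    """Hypothetical sketch: weight each parameter estimate by its sample size.
    w_public needs no noise (the data are public); w_private_dp is assumed
    to already satisfy differential privacy, so the combination does too
    (post-processing preserves differential privacy)."""
    total = n_public + n_private
    return (n_public * np.asarray(w_public)
            + n_private * np.asarray(w_private_dp)) / total
```

Because differential privacy is closed under post-processing, mixing in public data this way costs no additional privacy budget.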