Published: 2016
DOI: 10.1587/transinf.2015inp0020

Privacy-Preserving Logistic Regression with Distributed Data Sources via Homomorphic Encryption

Abstract: Logistic regression is a powerful machine learning tool for classifying data. When dealing with sensitive or private data, care is necessary. In this paper, we propose a secure system that protects the privacy of both the training and prediction data in logistic regression via homomorphic encryption. Perhaps surprisingly, despite the non-polynomial nature of training and prediction in logistic regression, we show that only additively homomorphic encryption is needed to build our system. Indeed, we instantiate ou…
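
The abstract's key point is that the non-linear parts of logistic regression can be kept out of the encrypted domain, so an additively homomorphic scheme suffices. The sketch below illustrates that idea for prediction only, using the open-source python-paillier (phe) package; it is a minimal illustration with assumed keys, features, and weights, not the paper's actual protocol (which also covers training and distributed data sources).

```python
# Minimal sketch (not the paper's exact protocol): Paillier is additively
# homomorphic, so a model holder can evaluate the linear score w.x + b on
# encrypted features; the non-polynomial sigmoid is applied after decryption.
# Requires the third-party "phe" package (python-paillier).
from math import exp
from phe import paillier

# Client: owns the private feature vector x and the key pair.
public_key, private_key = paillier.generate_paillier_keypair(n_length=2048)
x = [0.5, -1.2, 3.3]                         # private features (toy values)
enc_x = [public_key.encrypt(v) for v in x]   # sent to the server

# Server: holds the logistic-regression model (w, b) but never sees x.
w, b = [0.8, -0.4, 0.1], 0.25
# EncryptedNumber supports addition and multiplication by plaintext scalars,
# which is all the linear part of the prediction needs.
enc_score = sum(ci * wi for ci, wi in zip(enc_x, w)) + b

# Client: decrypts the score and applies the sigmoid locally.
score = private_key.decrypt(enc_score)
prob = 1.0 / (1.0 + exp(-score))
print(f"P(y=1|x) = {prob:.3f}, predicted class = {int(prob > 0.5)}")
```

Applying the sigmoid only after decryption is what keeps the server-side computation purely additive (plus plaintext scalar multiplication), which is exactly what an additively homomorphic scheme such as Paillier supports.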

Cited by 76 publications (46 citation statements) · References 25 publications (44 reference statements)

Citation statements, ordered by relevance:

“…In section 3.1, the supervised learning module and the feature extraction module of the proposed framework were evaluated using the subspace (ɑ3, ɑ4, ɑ11) of the NSL-KDD data set with binary classes (9,10) only. In a new experiment, the same subspace is again considered; however, the other classes (0,1), (0,5), (1,2), (1,9), (3,5), (3,9), and (6,8) are also studied. In addition, three other subspaces, (ɑ3, ɑ4, ɑ5), (ɑ3, ɑ4, ɑ7), and (ɑ4, ɑ7, ɑ10), are also included in the experiment to study the performance of the proposed analytical framework with DPLR.…”
Section: Multiple Subspace Analysis
confidence: 99%
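
To make the quoted setup concrete, a hedged sketch of what selecting a feature subspace and a binary class pair could look like is given below; the ASCII column names (a3, a4, a11 standing in for ɑ3, ɑ4, ɑ11), the integer label encoding, and the toy data are assumptions for illustration, not the citing paper's actual code.

```python
# Hypothetical illustration of "subspace (a3, a4, a11) with binary classes (9, 10)":
# keep only the chosen feature columns and the rows whose label is in the pair.
import pandas as pd

def binary_subspace(df: pd.DataFrame, features, class_pair, label_col="label"):
    """Restrict df to the given feature subspace and binary class pair."""
    subset = df[df[label_col].isin(class_pair)]
    X = subset[list(features)]
    y = (subset[label_col] == class_pair[1]).astype(int)  # map the pair to {0, 1}
    return X, y

# Toy stand-in for the NSL-KDD frame; real column names and labels differ.
df = pd.DataFrame({
    "a3":  [0.1, 0.4, 0.9, 0.2],
    "a4":  [1.0, 0.0, 1.0, 0.0],
    "a11": [3, 5, 2, 7],
    "label": [9, 10, 9, 3],
})

X, y = binary_subspace(df, features=("a3", "a4", "a11"), class_pair=(9, 10))
print(X.shape, y.tolist())
```
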
“…The second column lists the maximum classification accuracies recorded for each classification task. For example, DPLR classifies the classes (1,9) with about 99% accuracy, whereas it classifies the classes (3,5) with about 53% accuracy. Since DPLR's classification performance is poor for the binary class (3,5), classification accuracies are also obtained for the other subspaces, (ɑ3, ɑ4, ɑ5), (ɑ3, ɑ4, ɑ7), and (ɑ4, ɑ7, ɑ10), and listed in the third, fourth, and fifth columns of the table.…”
Section: Supervised Learning
confidence: 99%
“…Linear regression (whose outputs are continuous) is related to, but different from, logistic regression (whose outputs are discrete). Various privacy-preserving systems for logistic regression have been built in [5]–[7], [19].…”
Section: More Related Work
confidence: 99%
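
As a concrete reading of the distinction quoted above, the toy snippet below (hypothetical weights and features) contrasts the continuous output of linear regression with the thresholded, discrete output of logistic regression; it is illustrative only and not tied to any specific system cited in [5]–[7], [19].

```python
# Linear regression returns a continuous score; logistic regression squashes
# that score through a sigmoid and thresholds it into a discrete class label.
from math import exp

w, b = [0.8, -0.4], 0.25   # assumed model parameters
x = [1.5, 2.0]             # assumed input

linear_output = sum(wi * xi for wi, xi in zip(w, x)) + b   # continuous value
prob = 1.0 / (1.0 + exp(-linear_output))                   # probability in (0, 1)
logistic_output = int(prob > 0.5)                          # discrete class {0, 1}

print(linear_output, prob, logistic_output)
```
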
“…For securing data, full encryption with provable security (like RSA, AES, etc.) is the most secure option. However, many multimedia applications have sought a trade-off in security to enable other requirements, e.g., low processing demands, retention of bitstream compliance, and flexible processing in the encrypted domain, so many perceptual encryption schemes have been studied as ways of achieving such a trade-off [4]–[13]. In recent years, considerable efforts have been made in the fields of fully homomorphic encryption and multi-party computation [14]–[17]. However, these schemes cannot yet be applied to SVM algorithms, although it is possible to carry out some statistical analysis of categorical and ordinal data.…”
Section: Introduction
confidence: 99%