2018
DOI: 10.1186/s12920-018-0401-7
Logistic regression model training based on the approximate homomorphic encryption

Abstract: Background: Security concerns have been raised since big data became a prominent tool in data analysis. For instance, many machine learning algorithms aim to generate prediction models using training data which contain sensitive information about individuals. The cryptography community is considering secure computation as a solution for privacy protection. In particular, practical requirements have triggered research on the efficiency of cryptographic primitives. Methods: This paper presents a method to train a logisti…
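As context for the abstract, the computation that such HE-based training schemes evaluate on ciphertexts is ordinary logistic regression gradient descent. A minimal plaintext sketch (the function names and hyperparameters here are illustrative, not from the paper):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def logistic_gd(X, y, lr=0.1, iters=100):
    # Plain gradient descent on the logistic loss. HE-based schemes
    # evaluate the same update over encrypted X and y, typically
    # replacing sigmoid with a low-degree polynomial approximation.
    w = np.zeros(X.shape[1])
    for _ in range(iters):
        grad = X.T @ (sigmoid(X @ w) - y) / len(y)
        w -= lr * grad
    return w
```

The per-iteration cost of this loop (matrix-vector products plus one nonlinearity) is what dominates the homomorphic evaluation, which is why the efficiency of the underlying cryptographic primitives matters so much here.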

Cited by 144 publications (156 citation statements)
References 19 publications
“…HE-Based Approaches. Graepel et al (Graepel, Lauter, and Naehrig 2012) presented a homomorphic evaluation algorithm of two binary classifiers (i.e., linear means and Fisher's linear discriminant classifiers), and Kim et al (Kim et al 2018b;2018a) proposed a homomorphic evaluation of logistic regression. However, they provided only a proof-of-concept evaluation, where small-scale training datasets (consisting of only dozens of samples and features) are considered.…”
Section: Related Work
confidence: 99%
“…In addition to the sheer amount of computation, the use of various complex operations, such as floating-point arithmetic and non-polynomial functions (e.g., sigmoid), makes it challenging to apply HE to machine learning algorithms. Indeed, HEs have been applied to machine learning algorithms only in non-realistic settings (Graepel, Lauter, and Naehrig 2012;Kim et al 2018a) where only small-size training data over a small number of features are considered; or, they have been applied only on the prediction phase (Gilad-Bachrach et al 2016;Li et al 2017;Juvekar, Vaikuntanathan, and Chandrakasan 2018;Bourse et al 2017) where the amount of computation is much smaller than that of the training phase.…”
Section: Introduction
confidence: 99%
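The statement above highlights why non-polynomial functions such as sigmoid are a core obstacle: approximate HE schemes natively support only additions and multiplications, so the sigmoid must be replaced by a polynomial. A hedged sketch of one common workaround, a least-squares polynomial fit over a bounded interval (the degree and interval below are illustrative choices, not the paper's exact parameters):

```python
import numpy as np

def poly_sigmoid(degree=3, interval=8.0, samples=1000):
    # Least-squares polynomial approximation of sigmoid on
    # [-interval, interval]; only + and * are needed to evaluate it,
    # which makes it expressible under approximate HE.
    z = np.linspace(-interval, interval, samples)
    s = 1.0 / (1.0 + np.exp(-z))
    return np.polynomial.Polynomial.fit(z, s, degree)
```

The trade-off is between degree (multiplicative depth consumed per iteration) and approximation error on the chosen interval, which is one reason these evaluations were initially limited to small-scale settings.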
“…One scenario that was considered in previous works [4,5,6] is the setting in which a data owner holds the data while another party (the data processor), such as a cloud service, is responsible for the model training. These solutions usually rely on homomorphic encryption, with the data owner encrypting and sending their data to the data processor who performs computations on the encrypted data without having to decrypt it.…”
Section: Related Work
confidence: 99%
“…Machine Learning as an application of FHE was first proposed in [35], and subsequently there have been numerous works on the subject, to our knowledge all concerned with supervised learning. The most popular of these applications seem to be (Deep) Neural Networks (see [26], [21], [10], [36], and [7]) and (Linear) Regression (e.g., [32], [17], [4] or [22]), though there is also some work on other algorithm classes like decision trees and random forests ( [41]), or logistic regression ( [6], [30], [29] and [5]). In contrast, our work is concerned with the clustering problem from unsupervised Machine Learning.…”
Section: Related Work
confidence: 99%