2014
DOI: 10.1007/978-3-662-45472-5_12
Practical Secure Decision Tree Learning in a Teletreatment Application

Cited by 30 publications (63 citation statements)
References 20 publications
“…As a criterion we choose to minimize the Gini index, as also used in the CART algorithm, since it only requires a few secure multiplications to compute. For discrete attributes, we follow [13] and compute the Gini index by securely counting the number of elements in the dataset that satisfy certain equations using an indicator-vector representation, as described in Section 4. For continuous attributes, the situation is more complicated since the equations are not based on equality (=), but rather a less-than-or-equal (≤) predicate.…”
Section: Overview of Our Techniques
confidence: 99%
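The counting approach quoted above can be illustrated with a short plaintext sketch in Python (the function and variable names are my own, not the paper's): every count needed for the Gini index of a candidate split is an inner product of 0/1 indicator vectors, which is the kind of quantity a secret-sharing-based protocol can obtain with a few secure multiplications on shared vectors.

def indicator(values, target):
    """0/1 indicator vector: 1 wherever the value equals `target`."""
    return [1 if v == target else 0 for v in values]

def gini_of_split(attr_column, class_column, attr_values, class_values):
    """Weighted Gini index of splitting on a discrete attribute,
    computed only from counts (inner products of indicator vectors)."""
    total = len(attr_column)
    gini = 0.0
    for a in attr_values:
        branch = indicator(attr_column, a)
        n_branch = sum(branch)                      # |S_a|
        if n_branch == 0:
            continue
        sq_sum = 0
        for c in class_values:
            cls = indicator(class_column, c)
            # joint count |{i : attr_i = a and class_i = c}| as an inner product
            n_ac = sum(b * k for b, k in zip(branch, cls))
            sq_sum += n_ac * n_ac
        gini += (n_branch / total) * (1.0 - sq_sum / (n_branch * n_branch))
    return gini

# Toy example: the attribute with the smallest Gini index would be selected.
attr = ["sunny", "sunny", "rain", "rain"]
label = ["yes", "no", "yes", "yes"]
print(gini_of_split(attr, label, ["sunny", "rain"], ["yes", "no"]))  # 0.25

For continuous attributes, the branch indicator would instead come from a secure ≤-comparison against a candidate threshold, which is why the quoted passage notes that case as more involved.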
“…This issue was addressed in [13], where an extension of ID3 to the secure setting was given using Shamir secret-sharing and allowing arbitrary initial partitioning of the data. However, their protocol does not allow for continuous attributes, which is an important feature of decision trees with respect to other machine learning models.…”
Section: Related Work
confidence: 99%
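For readers unfamiliar with the primitive mentioned above, the following is a minimal sketch of (t, n) Shamir secret sharing over a prime field in Python; the prime and the parameters are illustrative and not taken from [13]. Protocols in this line of work carry out their additions and multiplications on values shared in this way, so that no single party ever sees the data in the clear.

import random

P = 2**61 - 1  # a Mersenne prime, chosen here purely for illustration

def share(secret, t, n):
    """Split `secret` into n shares such that any t of them reconstruct it."""
    coeffs = [secret % P] + [random.randrange(P) for _ in range(t - 1)]
    return [(x, sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P)
            for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation of the sharing polynomial at x = 0."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret

shares = share(42, t=3, n=5)
assert reconstruct(shares[:3]) == 42   # any 3 of the 5 shares suffice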
“…SMC-based Machine Learning. A significant body of work in PPML with SMC has focused on the problem of privacy-preserving training of machine learning models (see, e.g., [12], [13], [14], [15], [16] and references therein). Privacy-preserving protocols for predicting with trained ML classifiers, hereafter also referred to as "scoring", have received far less attention.…”
Section: Related Work
confidence: 99%
“…Bost et al. [17] introduced privacy-preserving protocols for hyperplane-based, Naive Bayes and DT classifiers, Wu et al. [18] for DTs and RFs, David et al. [19] for hyperplane-based and Naive Bayes classifiers, and De Cock et al. [20] for DTs and hyperplane-based classifiers. De Hoogh et al. [14] had also previously presented a protocol for privacy-preserving scoring of DTs with categorical attributes. The protocol for privacy-preserving scoring of DTs of De Cock et al. [20] cannot be directly used as a building block to obtain random forests and boosted decision trees.…”
Section: Related Work
confidence: 99%
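To make the notion of "scoring" concrete, below is a plaintext sketch of classifying one input with a decision tree in Python; the tree, thresholds and feature indices are my own illustration rather than anything from the cited protocols. The works listed above evaluate exactly these ≤-comparisons on encrypted or secret-shared data, so that neither the client's input nor the path taken through the tree is revealed.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Node:
    feature: Optional[int] = None       # index of the feature tested at this node
    threshold: Optional[float] = None   # test: x[feature] <= threshold
    left: Optional["Node"] = None       # branch taken when the test holds
    right: Optional["Node"] = None      # branch taken otherwise
    label: Optional[int] = None         # class label, set only on leaves

def score(node, x):
    """Plaintext decision-tree scoring: one <= comparison per internal node."""
    if node.label is not None:
        return node.label
    branch = node.left if x[node.feature] <= node.threshold else node.right
    return score(branch, x)

# Toy tree over two numeric features.
tree = Node(feature=0, threshold=37.5,
            left=Node(label=0),
            right=Node(feature=1, threshold=90.0,
                       left=Node(label=0), right=Node(label=1)))
print(score(tree, [38.2, 95.0]))  # -> 1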
“…In a practical application, the cold-start classifier should be trained, and regularly updated, by some third-party service provider, meaning that possibly privacy-sensitive data needs to be sent from the user to this third party. A possible solution to this issue is presented by de Hoogh et al. (2014), who used a portion of the data collected for the Kairos system to develop and evaluate algorithms for privacy-preserving data mining, effectively allowing predictions to be made based on encrypted data.…”
Section: Outstanding Issues
confidence: 99%