2015
DOI: 10.1007/978-3-319-26350-2_17

A Differentially Private Random Decision Forest Using Reliable Signal-to-Noise Ratios

Cited by 18 publications (43 citation statements)
References 11 publications
“…For instance, the max operator has lower sensitivity than the Gini or information gains, thus leading to higher accuracy on similar datasets and privacy levels [61,68]. In other works, the learning is adapted using RFs [83,140,146] or CRTs [20,59,60,188]. Some works abusively consider that each tree in the forest is independent, to reduce the privacy budget consumption [146].…”
Section: Differential Privacy Based Solutions (mentioning)
confidence: 99%
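To illustrate the sensitivity point made in the excerpt above, here is a minimal sketch (not taken from any of the cited works; the function names are ours) of split selection with the exponential mechanism using a majority-count ("max") quality function. Its sensitivity is 1, so less privacy noise is needed than with Gini or information gain.

```python
import numpy as np

def max_quality(X, y, attribute):
    """'Max' split quality: sum over child nodes of the majority-class count.
    Adding or removing one record changes this total by at most 1, so its
    sensitivity is 1, lower than that of Gini or information gain."""
    total = 0
    for v in np.unique(X[:, attribute]):
        child_labels = y[X[:, attribute] == v]
        _, counts = np.unique(child_labels, return_counts=True)
        total += counts.max()
    return total

def choose_split(X, y, attributes, epsilon, sensitivity=1.0):
    """Exponential mechanism: pick an attribute with probability
    proportional to exp(epsilon * quality / (2 * sensitivity))."""
    scores = np.array([max_quality(X, y, a) for a in attributes], dtype=float)
    logits = epsilon * (scores - scores.max()) / (2.0 * sensitivity)
    probs = np.exp(logits)
    probs /= probs.sum()
    return attributes[np.random.choice(len(attributes), p=probs)]
```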
“…Some works abusively consider that each tree in the forest is independent, to reduce the privacy budget consumption [146]. However, this is circumvented by training each tree on an independent subset of the training data and applying the parallel composition theorem [59,183]. Overall, works based on the central model consider reasonable privacy budgets (i.e., ε∈[0.1; 1.0]) and some even experiment with very low budgets (e.g., ε=0.01 for [16,20,59,114,137]).…”
Section: Differential Privacy Based Solutions (mentioning)
confidence: 99%
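The disjoint-subset workaround described in this excerpt can be sketched as follows (a rough illustration under our own assumptions; `train_private_tree` is a hypothetical ε-DP single-tree learner, not an API from the cited works).

```python
import numpy as np

def train_forest_on_disjoint_subsets(X, y, n_trees, epsilon, train_private_tree):
    """Give each tree a disjoint random share of the records. Because the
    subsets do not overlap, the parallel composition theorem says the whole
    forest satisfies epsilon-DP (the per-tree budget), not n_trees * epsilon.
    `train_private_tree(X_sub, y_sub, epsilon)` is a placeholder for any
    epsilon-DP single-tree learner."""
    indices = np.random.permutation(len(y))
    partitions = np.array_split(indices, n_trees)
    return [train_private_tree(X[idx], y[idx], epsilon) for idx in partitions]
```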
“…Additionally, Friedman and Schuster further proposed the DiffP-C4.5 algorithm to overcome the limitation of DiffP-ID3 that it can only handle discrete attributes [13]. Rana et al. [23] and Fletcher [24] proposed methods to build differentially private random forests, which can reduce the impact of noise on model accuracy by integrating several decision trees into an ensemble.…”
Section: Related Work (mentioning)
confidence: 99%
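A minimal sketch, under assumptions of our own rather than the cited papers' exact mechanisms, of why ensembling dampens the noise: each tree releases Laplace-noised class counts at its leaves, and the forest sums many independently noised counts before voting, so no single tree's noise dominates the predicted label.

```python
import numpy as np

def noisy_leaf_counts(true_counts, epsilon):
    """Class counts in a leaf perturbed with Laplace noise; a counting
    query has sensitivity 1, hence noise scale 1/epsilon."""
    return true_counts + np.random.laplace(scale=1.0 / epsilon,
                                           size=true_counts.shape)

def forest_vote(per_tree_counts, epsilon):
    """Sum the noisy counts from the leaf each tree routes a record to and
    return the majority class. Summing many independently noised counts
    averages the noise out, which is the accuracy benefit of the ensemble."""
    totals = np.zeros_like(per_tree_counts[0], dtype=float)
    for counts in per_tree_counts:
        totals += noisy_leaf_counts(np.asarray(counts, dtype=float), epsilon)
    return int(np.argmax(totals))
```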
“…The utility-based partitioning is inspired by the observation that many DP machine learning algorithms (e.g. [5, 7, 9, 19]) have their performance tied to n and ε for a dataset of n records with ε-DP. We give a definition of utility-based partitioning below.…”
Section: Partitioning Mechanisms (mentioning)
confidence: 99%
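The excerpt does not reproduce the formal definition, but the intuition can be sketched with a hypothetical utility proxy n * ε (our assumption, chosen because the error of many ε-DP learners shrinks roughly with n·ε; it is not the cited paper's definition): the partitioner keeps splitting only while every part retains enough of that proxy.

```python
def utility_proxy(n_records, epsilon):
    """Hypothetical stand-in for model quality: larger datasets and larger
    privacy budgets both help, so use n * epsilon as a crude score.
    This is an illustrative assumption, not the cited paper's definition."""
    return n_records * epsilon

def largest_utility_based_partition(n_total, epsilon, min_utility):
    """Find the largest number of equal-size parts such that every part
    still meets the minimum utility proxy."""
    k = 1
    while n_total // (k + 1) > 0 and \
            utility_proxy(n_total // (k + 1), epsilon) >= min_utility:
        k += 1
    return k
```

For example, with n_total=10000, epsilon=1.0 and min_utility=2000, this sketch returns 5 parts of roughly 2000 records each.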