IEEE INFOCOM 2018 - IEEE Conference on Computer Communications 2018
DOI: 10.1109/infocom.2018.8486352
InPrivate Digging: Enabling Tree-based Distributed Data Mining with Differential Privacy

Cited by 81 publications (78 citation statements)
References 16 publications
“…Secondly, it has a rigorous mathematical model, which facilitates quantitative theoretical analysis and proof of privacy levels. In [13], the application of differential privacy in data protection and data mining is demonstrated. Differential privacy is now used in social networks, recommendation systems, network tracking analysis and many other fields.…”
Section: A. Data Privacy Protection in IoT
confidence: 99%
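The differential-privacy guarantee referred to above is typically achieved with the Laplace mechanism: a query's true answer is perturbed with noise calibrated to the query's sensitivity and the privacy budget ε. A minimal sketch of this for a counting query follows; the function names and the example query are illustrative, not from the cited paper:

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Sample Laplace(0, scale) noise via inverse-CDF sampling."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))

def private_count(records, predicate, epsilon: float) -> float:
    """epsilon-differentially-private counting query.

    A counting query has L1 sensitivity 1 (adding or removing one
    record changes the count by at most 1), so Laplace(1/epsilon)
    noise suffices for epsilon-differential privacy.
    """
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)
```

Smaller ε means stronger privacy but larger noise; this trade-off is what makes the quantitative analysis of privacy levels mentioned above possible.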
“…On the other hand, Gradient Boosting Decision Trees (GBDTs) have become very successful in recent years, winning many machine learning and data mining competitions (Chen and Guestrin 2016) and proving effective in many applications (Richardson, Dominowska, and Ragno 2007; Kim et al. 2009; Burges 2010; Li et al. 2019b). There have been several recent studies on how to train GBDTs in the federated learning setting (Cheng et al. 2019; Liu et al. 2019; Zhao et al. 2018). For example, SecureBoost (Cheng et al. 2019) developed vertical federated learning with GBDTs.…”
Section: Introduction
confidence: 99%
“…There have been several studies of GBDT training in the horizontal federated learning setting (Liu et al. 2019; Zhao et al. 2018). However, those approaches are not effective or efficient enough for practical use.…”
Section: Introduction
confidence: 99%
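A common building block in these horizontal federated GBDT systems is to have each participant share per-bin gradient histograms instead of raw data; an aggregator sums the histograms and scans candidate split points using the standard GBDT gain. The sketch below is illustrative (the names and the simplified gain formula are assumptions, not the protocol of any one cited system):

```python
from typing import List, Tuple

def local_grad_histogram(grads: List[float], bins: List[int],
                         n_bins: int) -> Tuple[List[float], List[int]]:
    """One participant's contribution: per-bin gradient sums and counts.
    Only these summaries leave the participant, not the raw data."""
    g_hist = [0.0] * n_bins
    c_hist = [0] * n_bins
    for g, b in zip(grads, bins):
        g_hist[b] += g
        c_hist[b] += 1
    return g_hist, c_hist

def aggregate_and_split(party_histograms, n_bins: int):
    """Aggregator side: sum histograms across parties, then scan
    candidate splits maximizing a simplified GBDT gain
    G_L^2/n_L + G_R^2/n_R - G^2/n."""
    G = [0.0] * n_bins
    C = [0] * n_bins
    for g_hist, c_hist in party_histograms:
        for b in range(n_bins):
            G[b] += g_hist[b]
            C[b] += c_hist[b]
    total_g, total_c = sum(G), sum(C)
    best_gain, best_bin = float("-inf"), None
    gl, cl = 0.0, 0
    for b in range(n_bins - 1):  # split between bin b and bin b+1
        gl += G[b]
        cl += C[b]
        gr, cr = total_g - gl, total_c - cl
        if cl == 0 or cr == 0:
            continue
        gain = gl * gl / cl + gr * gr / cr - total_g * total_g / total_c
        if gain > best_gain:
            best_gain, best_bin = gain, b
    return best_bin, best_gain
```

Because only aggregated statistics cross participant boundaries, this pattern composes naturally with differential-privacy noise or secure aggregation, which is where the practical effectiveness and efficiency concerns raised above arise.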
“…There are two major communication designs in federated learning: centralized and distributed. In the distributed design, communication takes place directly between pairs of participants, and each participant can update the global parameters directly [21]. The data of different participants is independent and identically distributed, which makes this setting well suited to designing efficient training algorithms.…”
Section: Introduction
confidence: 99%
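The pairwise communication pattern of the distributed design can be illustrated with a simple gossip-averaging step, in which two participants exchange parameter vectors and each keeps the average. This is a hypothetical sketch of the pattern, not the specific protocol of [21]:

```python
from typing import List

def gossip_round(params: List[List[float]], i: int, j: int) -> None:
    """One pairwise exchange: participants i and j average their
    parameter vectors in place. Repeated rounds over random pairs
    drive every participant toward the global mean, with no central
    server ever holding all the data."""
    avg = [(a + b) / 2.0 for a, b in zip(params[i], params[j])]
    params[i] = list(avg)
    params[j] = list(avg)
```

Each round only requires two participants to talk to each other, which is exactly the property that distinguishes the distributed design from the centralized one described above.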