2021
DOI: 10.1109/tdsc.2019.2905237
Differentially Private Publication of Vertically Partitioned Data

Abstract: We present HDP-VFL, the first hybrid differentially private (DP) framework for vertical federated learning (VFL), to demonstrate that it is possible to jointly learn a generalized linear model (GLM) from vertically partitioned data with only a negligible cost, w.r.t. training time, accuracy, etc., compared to idealized non-private VFL. Our work builds on recent advances in VFL-based collaborative training among different organizations, which rely on protocols like Homomorphic Encryption (HE) and Secure Multi…

Cited by 17 publications
(11 citation statements)
References 57 publications
“…2) The second category consists of works that produce a PP version of the original dataset [56], [73], [100], [101]. Next, we will present the relevant research based on DP to protect users' privacy before publishing a dataset.…”
Section: Privacy-preserving Datasets (mentioning)
confidence: 99%
“…iii) Faithfulness: the group sizes at each level l of the hierarchy should be equal to the total count of groups G. The problem is solved using three approaches: 1) a direct optimization-based mechanism, 2) dynamic programming, and 3) a polynomial-time mechanism that exploits the structure of the cost tables. Tang et al. [73] present a stronger privacy-protection approach called the differentially private latent tree (DPLT). It consists of generating a new synthetic dataset from vertically partitioned data (i.e., the dataset is split among several data curators, each of whom holds some of the attributes).…”
Section: Privacy-preserving Datasets (mentioning)
confidence: 99%
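The DPLT excerpt above describes synthesizing data from a vertical partition, where each curator holds only a subset of the attributes. Below is a minimal sketch of that setting, not DPLT itself: each hypothetical party releases a Laplace-perturbed one-way marginal of its own attribute, and a synthetic column is sampled from it, ignoring the latent-tree structure DPLT uses to capture cross-party correlations. The party names, attributes, and epsilon value are illustrative assumptions.

```python
import numpy as np

def noisy_marginal(column, categories, epsilon):
    """Laplace mechanism on a one-way marginal (counting query, sensitivity 1)."""
    counts = np.array([(column == c).sum() for c in categories], dtype=float)
    noisy = counts + np.random.laplace(scale=1.0 / epsilon, size=len(categories))
    noisy = np.clip(noisy, 0.0, None)
    total = noisy.sum()
    return noisy / total if total > 0 else np.full(len(categories), 1.0 / len(categories))

# Two curators hold different attributes of the same records (vertical partition).
n = 1000
party_a = {"age_group": np.random.choice(["<30", "30-60", ">60"], n)}
party_b = {"smoker": np.random.choice(["yes", "no"], n)}

eps_per_attr = 0.5  # hypothetical per-attribute privacy budget
synthetic = {}
for attrs in (party_a, party_b):
    for name, col in attrs.items():
        cats = sorted(set(col))
        probs = noisy_marginal(col, cats, eps_per_attr)
        # Sample a synthetic column from the noisy marginal.
        synthetic[name] = np.random.choice(cats, size=n, p=probs)
```

DPLT proper additionally learns a latent tree over the joint attribute set so that synthetic records preserve dependencies between attributes held by different curators; the per-attribute budgets would then compose into an overall DP guarantee.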
“…Regarding defences against privacy attacks based on DP in VFL, Wang et al. [122] propose to perturb the intermediate outputs shared between parties during the training of a Generalized Linear Model. Additionally, such perturbation removes the need for a learning coordinator and for costly Homomorphic Encryption schemes, since the shared values are already privatized.…”
Section: Based On Differential Privacy (mentioning)
confidence: 99%
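The statement above refers to perturbing the intermediate outputs that parties exchange while jointly training a GLM. A minimal sketch of that idea for logistic regression follows; it is not the HDP-VFL protocol, and the noise scale, feature split, and training loop are illustrative assumptions (a real protocol would calibrate the noise to an (epsilon, delta) budget and also protect the signal sent back to the feature-only party).

```python
import numpy as np

rng = np.random.default_rng(0)
n, d_a, d_b = 500, 3, 2
X_a = rng.normal(size=(n, d_a))      # party A holds these features
X_b = rng.normal(size=(n, d_b))      # party B holds these features for the same records
y = rng.integers(0, 2, size=n)       # labels held by party B

w_a = np.zeros(d_a)
w_b = np.zeros(d_b)
sigma = 0.1   # hypothetical noise scale; a real analysis calibrates it to (eps, delta)
lr = 0.1

for _ in range(100):
    # Party A perturbs its partial linear predictor before sharing it with party B.
    z_a = X_a @ w_a + rng.normal(scale=sigma, size=n)
    z_b = X_b @ w_b
    p = 1.0 / (1.0 + np.exp(-(z_a + z_b)))   # label party combines the shares
    residual = p - y
    # In a full protocol the residual returned to party A would also be perturbed;
    # omitted here to keep the sketch short.
    w_a -= lr * (X_a.T @ residual) / n
    w_b -= lr * (X_b.T @ residual) / n
```

Because the only values crossing the party boundary are already noised, no trusted coordinator or homomorphic encryption is needed in this simplified picture, which is the point the citing survey highlights.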
“…Furthermore, prior VFL frameworks primarily use crypto-based technologies such as HE and SMC to ensure secure and private learning. Recent works proposed incorporating differential privacy (DP) into the training process to provide strict privacy guarantees for local data [6,42].…”
Section: Vertical Federated Learning (mentioning)
confidence: 99%