Proceedings of the 23rd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining 2017
DOI: 10.1145/3097983.3098152
Privacy-Preserving Distributed Multi-Task Learning with Asynchronous Updates

Cited by 73 publications (16 citation statements). References 23 publications.
“…First, while avoiding exposing users' raw data during the learning process, the model updates transmitted in ClusterFL may still reveal certain information about user activities [27,56]. In the future, we will study how to integrate privacy-preserving techniques [55] and investigate the trade-off between privacy and utility [3,53]. Second, we will extend ClusterFL to a wider range of applications, where the nodes' data exhibits intrinsic similarity and locality.…”
Section: Discussion
confidence: 99%
“…In addition to introducing differential privacy into standalone machine learning, differentially private transfer learning has also been investigated [133], [134]. Transfer learning aims to transfer knowledge from source domains to improve learning performance in target domains [134].…”
Section: Future Research Directions, 6.1 Private Transfer Learning
confidence: 99%
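The excerpt above concerns introducing differential privacy into transfer and multi-task learning. A minimal sketch of the standard building block, the Gaussian mechanism applied to a clipped gradient, is shown below; the function name and parameters are illustrative, not from the cited papers.

```python
import numpy as np

def gaussian_mechanism(grad, clip_norm, epsilon, delta, rng=None):
    """Clip a gradient to L2 norm `clip_norm`, then add Gaussian noise
    calibrated for (epsilon, delta)-differential privacy.

    Uses the classic calibration sigma = clip_norm * sqrt(2 ln(1.25/delta)) / epsilon
    for a single release; composition over many steps needs tighter accounting.
    """
    rng = rng if rng is not None else np.random.default_rng(0)
    norm = np.linalg.norm(grad)
    # Scale down (never up) so the per-update sensitivity is bounded by clip_norm.
    clipped = grad * min(1.0, clip_norm / max(norm, 1e-12))
    sigma = clip_norm * np.sqrt(2.0 * np.log(1.25 / delta)) / epsilon
    return clipped + rng.normal(0.0, sigma, size=grad.shape)
```

Releasing only such noised updates, rather than raw gradients, is the usual way the privacy/utility trade-off mentioned in the excerpt is made concrete.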
“…They also developed an attribute-wise noise addition scheme in their algorithm. Xie et al [26] proposed a distributed multitask framework for privacy preservation to preserve sensitive data and private information that may be contained in the distributed data. The proposed method is a privacy-preserving proximal gradient algorithm which asynchronously updates models of the learning tasks and solves a general class of multi-task learning formulations.…”
Section: B. Privacy-Preserving Recommendation Systems
confidence: 99%
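The excerpt describes a privacy-preserving proximal gradient algorithm that asynchronously updates per-task models in a multi-task formulation. A hypothetical sketch of the core step, assuming an L2,1 regularizer coupling the tasks (a common choice in multi-task learning, not necessarily the exact formulation of the cited work):

```python
import numpy as np

def prox_l21(W, lam):
    """Proximal operator of lam * ||W||_{2,1}: row-wise soft-thresholding.

    Each row of W holds one feature's weights across all tasks, so the
    L2,1 norm encourages tasks to share a sparse set of features.
    """
    norms = np.linalg.norm(W, axis=1, keepdims=True)
    scale = np.maximum(0.0, 1.0 - lam / np.maximum(norms, 1e-12))
    return W * scale

def async_prox_step(W, task_id, grad_t, step, lam):
    """One asynchronous proximal-gradient update: only the column of the
    currently reporting task moves, then the shared prox couples all tasks.
    Illustrative only; the paper's algorithm additionally perturbs the
    exchanged quantities for privacy.
    """
    W = W.copy()
    W[:, task_id] -= step * grad_t      # gradient step on one task's model
    return prox_l21(W, step * lam)      # shared proximal step
```

In the asynchronous setting each task node computes `grad_t` on its local data and reports it at its own pace, so the coordinator never waits for stragglers and never sees raw data.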