2019
DOI: 10.2478/popets-2019-0019

Together or Alone: The Price of Privacy in Collaborative Learning

Abstract: Machine learning algorithms have reached mainstream status and are widely deployed in many applications. The accuracy of such algorithms depends significantly on the size of the underlying training dataset; in reality, a small or medium-sized organization often does not have the necessary data to train a reasonably accurate model. For such organizations, a realistic solution is to train their machine learning models on their joint dataset (which is the union of the individual ones). Unfortunately, privacy c…

Cited by 16 publications (20 citation statements) | References 11 publications
“…On the practical side, we studied a Recommendation System use case: we applied two different privacy-preserving mechanisms on two real-world datasets and confirmed via experiments that the assumption which ensures the existence of a NE holds. Complementary to the CoL game, we interpolated Φ for our use case, and devised a possible way to approximate it in real-world scenarios [7].…”
Section: Results (mentioning)
confidence: 99%
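The interpolation of Φ mentioned in the statement above can be pictured with a minimal sketch, assuming Φ maps a privacy parameter p ∈ [0, 1] to the accuracy of the jointly trained model; the grid, the measured accuracies, and the cubic interpolation below are hypothetical placeholders, not the datasets or method used in [7].

    # Minimal sketch (assumption): Phi maps a privacy parameter p in [0, 1]
    # to the accuracy of the jointly trained model. The sample points are
    # made up; in practice they would come from experiments such as the
    # recommendation-system measurements described in [7].
    import numpy as np
    from scipy.interpolate import interp1d

    p_grid = np.array([0.0, 0.25, 0.5, 0.75, 1.0])        # hypothetical privacy levels
    accuracy = np.array([0.52, 0.61, 0.70, 0.76, 0.80])   # hypothetical accuracies

    phi = interp1d(p_grid, accuracy, kind="cubic")         # interpolated Phi

    print(float(phi(0.6)))  # accuracy estimate at an intermediate privacy level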
“…To break this loop, we propose an approach called Self-Division in [7]. Via this approximation, the players determine Φ_n, which can be used with the CoL game to find the optimal privacy parameter p*_n.…”
Section: Remarks (mentioning)
confidence: 99%
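A minimal sketch of how an approximated Φ_n could be plugged into a CoL-style game to search for an optimal privacy parameter p*_n; the payoff form, the weights, and the stand-in phi_n below are assumptions for illustration, not the actual CoL utilities or the Self-Division procedure of [7].

    # Minimal sketch (assumptions): each player's payoff trades the benefit of a
    # more accurate joint model against a privacy cost that grows with p (here,
    # larger p means contributing less-perturbed data). phi_n, the payoff form,
    # and the weights are illustrative stand-ins, not the CoL game from [7].
    import numpy as np

    GRID = np.linspace(0.0, 1.0, 101)  # candidate privacy parameters

    def phi_n(p_self, p_other):
        # Hypothetical stand-in for an approximated Phi_n: accuracy grows
        # (with diminishing returns) in both players' privacy parameters.
        return 0.5 + 0.3 * np.sqrt(p_self) + 0.2 * np.sqrt(p_other)

    def payoff(p_self, p_other, benefit=1.0, priv_cost=0.4):
        return benefit * phi_n(p_self, p_other) - priv_cost * p_self

    def best_response(p_other):
        return GRID[np.argmax([payoff(p, p_other) for p in GRID])]

    # Iterated best responses; a fixed point is a mutual best response,
    # i.e. a Nash equilibrium of this toy game.
    p1, p2 = 0.5, 0.5
    for _ in range(20):
        p1, p2 = best_response(p2), best_response(p1)
    print(p1, p2)  # candidate optimal privacy parameters (p1*, p2*)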
“…[7, 15] propose game-theoretic methods that provide the means to evaluate the monetary cost of differential privacy. Pejó et al. [36] also propose a game-theoretic cost model in the setting of private collaborative learning. Our approach is inspired by the work of Hsu et al. [19].…”
Section: Related Work (mentioning)
confidence: 99%