2017
DOI: 10.1007/978-3-319-57454-7_48

Partitioning-Based Mechanisms Under Personalized Differential Privacy

Abstract: Differential privacy has recently emerged in private statistical aggregate analysis as one of the strongest privacy guarantees. A limitation of the model is that it provides the same privacy protection for all individuals in the database. However, it is common that data owners may have different privacy preferences for their data. Consequently, a global differential privacy parameter may provide excessive privacy protection for some users, while insufficient for others. In this paper, we propose two partitioni…

Cited by 22 publications (8 citation statements).
References 17 publications (24 reference statements).
“…Similarly, heterogeneous differential privacy was formally defined in [10] to provide user-item specific privacy requirements. There have also been other efforts to model user-specific privacy requirements [29]-[32], [29]-[31].…”
Section: Related Work
confidence: 99%
“…There have also been other efforts to model user-specific privacy requirements [29]-[32], [29]-[31]. For instance, [29] provides a partitioning mechanism to group users with personalized privacy requirements into different ε partitions under a non-federated setting.…”
Section: Related Work
confidence: 99%
“…Data owners may have different privacy preferences, and providing the same privacy protection for all individuals will limit the accuracy of the model. Therefore, Li et al. [57] proposed the privacy-aware mechanism and the utility-based partitioning mechanism to achieve better performance. The former minimizes the waste of privacy budget, whereas the latter maximizes the utility for a given aggregate analysis.…”
Section: Differentially Private ERM
confidence: 99%
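The partitioning idea described in the citation above can be sketched in Python. This is an illustrative sketch only, not the paper's implementation: `laplace_noise`, `partitioned_count`, and the fixed partition edges are hypothetical, and the paper derives partitions that optimize budget waste or utility rather than taking them as given. The core invariant is that each partition's count is perturbed with Laplace noise calibrated to the strictest (smallest) privacy parameter it contains, so every owner's personal requirement is satisfied.

```python
import math
import random
from collections import defaultdict

def laplace_noise(scale):
    # Sample from Laplace(0, scale) via the inverse-CDF method.
    u = random.random() - 0.5
    return -scale * math.copysign(math.log(1 - 2 * abs(u)), u)

def partitioned_count(records, partition_edges):
    """Noisy count per epsilon partition.

    records: list of (value, epsilon) pairs, where epsilon is the
        data owner's personal privacy budget.
    partition_edges: ascending epsilon thresholds defining partitions
        (hypothetical values; choosing them well is the paper's subject).
    """
    buckets = defaultdict(list)
    for _value, eps in records:
        # Place the record in the first partition whose edge covers eps.
        idx = next(i for i, edge in enumerate(partition_edges) if eps <= edge)
        buckets[idx].append(eps)

    noisy = {}
    for idx, eps_list in buckets.items():
        eps_min = min(eps_list)  # strictest requirement in this partition
        # Count query has sensitivity 1, so Laplace scale is 1 / eps_min.
        noisy[idx] = len(eps_list) + laplace_noise(1.0 / eps_min)
    return noisy
```

Grouping owners with similar budgets keeps `eps_min` close to each member's own epsilon, which is exactly the budget waste the privacy-aware mechanism tries to minimize.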
“…Second, perturbing the entire database (e.g., sufficient statistics) might be useful for hypothesis testing and machine learning, but individual trajectories of the medical records might be destroyed. A recent study proposes the concept of "personalized privacy" [15], which allows individuals to set their own thresholds to determine the degree of protection. However, these methods do not consider the "event-level privacy" of individual patients, as the methods do not discriminate different events of individual patients' EHRs.…”
Section: Introduction
confidence: 99%