2015 IEEE 31st International Conference on Data Engineering
DOI: 10.1109/icde.2015.7113353

Conservative or liberal? Personalized differential privacy

Cited by 159 publications (163 citation statements)
References 31 publications
“…Fan and Jin [17] as well as Jorgensen et al [32] have used non-uniform random sampling to produce aggregate data. Hong et al have used random sampling for protecting search logs [24].…”
Section: Related Work (mentioning)
confidence: 99%
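The non-uniform sampling referred to in the statement above is, in Jorgensen et al.'s approach, a per-record subsampling step: each record is kept with a probability derived from its owner's personal privacy budget, and a standard differentially private computation is then run on the sample. The Python sketch below illustrates that idea under the inclusion probability (e^eps_i - 1)/(e^t - 1) up to a threshold t; the function names, the threshold parameter, and the Laplace count at the end are illustrative assumptions, not the authors' implementation.

```python
import math
import random


def pdp_sample(records, personal_eps, threshold):
    """Keep each record with a probability tied to its owner's personal budget.

    Records whose budget meets `threshold` are always kept; the rest are kept
    with probability (e^eps_i - 1) / (e^threshold - 1), so stricter personal
    budgets translate into lower sampling rates.
    """
    sample = []
    for rec, eps_i in zip(records, personal_eps):
        if eps_i >= threshold:
            p = 1.0
        else:
            p = (math.exp(eps_i) - 1.0) / (math.exp(threshold) - 1.0)
        if random.random() < p:
            sample.append(rec)
    return sample


def noisy_count(records, personal_eps, threshold):
    """Aggregate query on the sample: a count with Laplace noise of scale 1/threshold.

    The Laplace draw is formed as the difference of two exponentials with rate
    `threshold`, which gives the scale needed for a sensitivity-1 count.
    """
    sample = pdp_sample(records, personal_eps, threshold)
    noise = random.expovariate(threshold) - random.expovariate(threshold)
    return len(sample) + noise
```

Records whose owners set small budgets thus contribute to the aggregate only with correspondingly low probability, which is how the sampling produces aggregate data under per-user privacy requirements.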
“…Finally, differentially private mechanisms are typically special-purpose algorithms developed for specific applications, see e.g. [17,31,32]. Many of them serve the interactive scenario, i.e.…”
Section: Introduction (mentioning)
confidence: 99%
“…Jorgensen et al's personalized differential privacy (PDP) is a different approach to the same problem [15]. In contrast to UniTraX, PDP trusts analysts and assumes that per-user budgets are public.…”
Section: Related Work (mentioning)
confidence: 99%
“…ProPer would privately deduct the appropriate amount from the individual remaining budget of all users in that age range. By contrast, UniTraX publicly records that a certain amount of budget was consumed for the age range [10, 20]. Because the consumed budget is public, the analyst can calculate how much initial budget any given point in the data parameter space would need in order to still have enough remaining budget for some specific query the analyst may wish to make.…”
mentioning
confidence: 99%
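The public budget accounting described in the statement above can be pictured with a small ledger: every query publicly records how much budget was consumed on which parameter range, and the remaining budget at any point of the parameter space is derived from those public entries. The class below is a hypothetical sketch of that bookkeeping under a single scalar parameter (e.g. age); the names and structure are assumptions, not UniTraX's actual data structures.

```python
class PublicBudgetLedger:
    """Publicly visible record of budget consumed per queried parameter range."""

    def __init__(self, initial_budget):
        self.initial_budget = initial_budget
        self.charges = []  # list of ((lo, hi), epsilon) entries, all public

    def record_query(self, lo, hi, epsilon):
        """Note publicly that `epsilon` budget was spent on the range [lo, hi]."""
        self.charges.append(((lo, hi), epsilon))

    def remaining(self, point):
        """Budget still available for a data point located at `point`."""
        spent = sum(eps for (lo, hi), eps in self.charges if lo <= point <= hi)
        return self.initial_budget - spent


# Example: a query over ages 10 to 20 consuming 0.1 budget.
# ledger = PublicBudgetLedger(initial_budget=1.0)
# ledger.record_query(10, 20, 0.1)
# ledger.remaining(15)  # -> 0.9; points outside the range keep 1.0
```

Because the ledger itself contains no per-user information, an analyst can consult it freely to decide whether a planned query still fits within the remaining budget, which is the contrast with per-user private deduction drawn in the quote.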
“…Jorgensen, Yu and Cormode [18] proposed a personalized differential privacy mechanism with a trustable data analyst. The algorithm of privacy protection is not performed on local clients.…”
Section: Personalized Privacy (mentioning)
confidence: 99%