2018
DOI: 10.2139/ssrn.3338027

Differential Privacy: A Primer for a Non-Technical Audience

Abstract: Differential privacy is a formal mathematical framework for quantifying and managing privacy risks. It provides provable privacy protection against a wide range of potential attacks, including those

Cited by 54 publications (53 citation statements)
References 19 publications
“…Obfuscation is applied to aggregate patient counts that are reported as a result of ad hoc queries on the client machine [ 26 ]. Another protection model for preventing reidentification is differential privacy [ 10 , 46 ]. In this model, reidentification is prevented by the addition of noise to the data.…”
Section: Tasks and Methods
Mentioning confidence: 99%
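The noise-addition approach described in the statement above is typically realized with the Laplace mechanism, the standard way to release a differentially private count. A minimal sketch (the count of 128 and epsilon of 0.5 are illustrative, not from the cited papers):

```python
import numpy as np


def noisy_count(true_count: int, epsilon: float, rng=None) -> float:
    """Release a count with Laplace noise calibrated to epsilon.

    A counting query has sensitivity 1 (adding or removing one patient
    changes the count by at most 1), so Laplace noise with scale
    1/epsilon yields epsilon-differential privacy.
    """
    if rng is None:
        rng = np.random.default_rng()
    return true_count + rng.laplace(loc=0.0, scale=1.0 / epsilon)


# Example: obfuscate an aggregate patient count of 128 at epsilon = 0.5.
released = noisy_count(128, epsilon=0.5)
```

Smaller epsilon means a larger noise scale and stronger protection, at the cost of less accurate released counts: the privacy–utility tradeoff discussed in the statements below.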
“…However, transforming data or anonymizing individuals may reduce the utility of the transferred data and lead to inaccurate knowledge [ 9 ]. This tradeoff between privacy and utility (and accuracy) is the central issue in the secondary use of sensitive data [ 10 ]. Deidentification refers to a collection of techniques devised to remove or transform identifiable information into nonidentifiable information, as well as to introduce random noise into the dataset.…”
Section: Introduction
Mentioning confidence: 99%
“…To address these inference attacks, clinical sites can anonymize their local statistics by applying obfuscation techniques that mainly consist of adding a certain amount of statistical noise to the aggregate-level data before transfer to third parties. This process enables data providers to achieve formal notions of privacy such as differential privacy [ 24 , 25 ]. In the statistical privacy community, differential privacy is currently considered to guarantee that the likelihood of reidentification from the release of aggregate-level statistics can be reduced to an acceptable level.…”
Section: Privacy and Security Issues Of Current Medical Data-sharing
Mentioning confidence: 99%
“…In cases where communication costs are of critical concern, more communication-efficient imputation methods for handling general missing data are needed as potential future work. Another potential limitation is that siMI, cslMI and avgmMI may not always be privacy-preserving, as the summary statistics transmitted between individual sites and a central server may still leak individual-level information [24]. In particular, the siMI method needs to transfer the entire design matrix between sites, which poses a higher risk of leaking individual patients' information.…”
Section: Discussion
Mentioning confidence: 99%