2020 IEEE Symposium on Security and Privacy (SP)
DOI: 10.1109/sp40000.2020.00088

Towards Effective Differential Privacy Communication for Users’ Data Sharing Decision and Comprehension

Abstract: Differential privacy protects an individual's privacy by perturbing data on an aggregated level (DP) or individual level (LDP). We report four online human-subject experiments investigating the effects of using different approaches to communicate differential privacy techniques to laypersons in a health app data collection setting. Experiments 1 and 2 investigated participants' data disclosure decisions for low-sensitive and high-sensitive personal information when given different DP or LDP descriptions. Exper…
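The abstract's distinction between aggregated-level (DP) and individual-level (LDP) perturbation can be sketched in code. This is an illustrative toy, not code from the paper; the function names and the choice of a counting query with the Laplace mechanism (central model) versus randomized response (local model) are assumptions for the sake of the example.

```python
import math
import random

def central_dp_count(values, epsilon):
    """Central-model DP: the aggregator sees raw data, computes the
    true count, and releases it with Laplace noise (sensitivity 1)."""
    true_count = sum(values)
    # Laplace(0, 1/epsilon) noise as the difference of two exponentials
    noise = random.expovariate(epsilon) - random.expovariate(epsilon)
    return true_count + noise

def local_dp_bit(bit, epsilon):
    """Local-model DP (randomized response): each user perturbs their
    own bit before sending, so the collector never sees the true value."""
    p_truth = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    return bit if random.random() < p_truth else 1 - bit
```

In the central model, noise is added once to the aggregate; in the local model, each record is perturbed on the user's device, which is the per-individual protection the LDP descriptions in the study communicate.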

Cited by 33 publications (61 citation statements)
References 36 publications
“…While our experts generally acknowledged the individuals whose personal data is being privately computed on as an important stakeholder group, few seemed to prioritise seeking their understanding and acceptance. This is in contrast to the small number of existing HCI studies that investigate 'user acceptability' of particular privacy-preserving computation techniques such as differential privacy [19,117] and MPC [92]. User acceptability could and should be further examined in particular contexts; for instance, Colnago et al suggest further work is needed to explore whether such techniques embedded in Internet-of-Things privacy assistants might 'help mitigate people's reservations about data collection practices and reduce the chance they opt out' [24].…”
Section: Whither the End User?
confidence: 91%
“…Some participants explained this was because adding more noise felt like lying. Xiong et al [117] also studied participants' willingness to share data with a hypothetical differentially private system. They examined the effect of different descriptions of differential privacy (including real descriptions provided by technology companies and the U.S. Census bureau) on willingness to share, and their findings suggest that certain descriptions (in particular, implication descriptions) are more understandable and increase willingness to share data as a result.…”
Section: Related Work on PETs in HCI
confidence: 99%
“…There are a few studies in which researchers took first steps forward to achieve usable differential privacy [2,10]. For instance, Bullek et al used the randomized response technique (RRT) to describe a variant of differentially private mechanism using a spinner, i.e.…”
Section: Related Work
confidence: 99%
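The randomized response technique (RRT) with a spinner, as the citation above describes Bullek et al. using it, can be sketched as follows. This is a hypothetical illustration of classic randomized response, not the specific mechanism or interface from that study; the function names and the default `p_truth` value are assumptions.

```python
import random

def spin_and_answer(true_answer, p_truth=0.75):
    """One spinner round: with probability p_truth the respondent
    answers truthfully; otherwise the spinner forces a random
    yes/no answer, giving the respondent plausible deniability."""
    if random.random() < p_truth:
        return true_answer
    return random.random() < 0.5  # forced coin-flip answer

def debias(responses, p_truth=0.75):
    """Recover the population 'yes' rate q from noisy answers,
    since observed = p_truth * q + (1 - p_truth) * 0.5."""
    observed = sum(responses) / len(responses)
    return (observed - (1 - p_truth) * 0.5) / p_truth
```

Each individual answer is deniable (it may have been forced by the spinner), yet the aggregator can still estimate the population rate, which is the trust trade-off the cited studies probe with participants.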
“…They examined whether users trust the RRT mechanism which proposes to ensure their privacy and if they adjust their privacy decisions when they see more details of the privacy promises made by RRT. Xiong et al [10] analysed the effects of using different approaches to verbally communicate differentially private techniques to laypersons in a health app data collection setting. Across different approaches of short textual descriptions, their results show that descriptions explaining implications, i.e.…”
Section: Related Work
confidence: 99%