2023
DOI: 10.56553/popets-2023-0045

Lessons Learned: Surveying the Practicality of Differential Privacy in the Industry

Abstract: Since its introduction in 2006, differential privacy has emerged as a predominant statistical tool for quantifying data privacy in academic works. Yet despite the plethora of research and open-source utilities that have accompanied its rise, with limited exceptions, differential privacy has failed to achieve widespread adoption in the enterprise domain. Our study aims to shed light on the fundamental causes underlying this academic-industrial utilization gap through detailed interviews of 24 privacy practition…

Cited by 7 publications (6 citation statements)
References 62 publications
“…Concurrent work by Munilla Garrido et al [27] investigates the 'academic-industrial DP utilization gap' through interviews with data analysts and data stewards in major companies that have not yet deployed DP. Their findings about the barriers to adoption of DP, as well as the promises of using DP, support the findings in this paper.…”
Section: Defining Users of Privacy Tools
Confidence: 99%
“…By integrating differential privacy techniques into the data generation process, synthetic datasets can be created with privacy assurances (10; 11; 13; 21). However, it's important to note that although differential privacy provides statistical privacy guarantees, the addition of significant noise can reduce the usefulness of the data for ML development, especially when the original data itself may be inherently noisy (22). Moreover, the absence of a standardized approach or metric for implementing differential privacy in ML poses challenges in terms of implementation, evaluation, and practical deployment.…”
Section: Related Work
Confidence: 99%
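The passage above describes the core trade-off: calibrating noise to a privacy budget ε protects individuals but degrades data utility. A minimal sketch of the standard Laplace mechanism illustrates this; the function name and dataset are illustrative, not taken from the cited works.

```python
import numpy as np

def laplace_mechanism(true_value, sensitivity, epsilon, rng=None):
    """Release true_value perturbed with Laplace noise of scale sensitivity/epsilon.

    Smaller epsilon (stronger privacy) means a larger noise scale,
    hence lower utility of the released statistic.
    """
    rng = rng or np.random.default_rng()
    scale = sensitivity / epsilon
    return true_value + rng.laplace(loc=0.0, scale=scale)

# Counting query over a toy dataset of ages: adding or removing one
# person changes the count by at most 1, so sensitivity = 1.
data = [21, 34, 45, 52, 67]
true_count = sum(1 for age in data if age > 40)
noisy_count = laplace_mechanism(true_count, sensitivity=1, epsilon=1.0)
```

With ε = 1 the released count is typically within a few units of the true value; at ε = 0.01 the noise scale grows to 100 and the answer becomes nearly useless for a dataset this small, which is the utility loss the citing authors point to for ML on noisy data.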
“…Because virtual reality devices generate a stream of data relating to the motion of a user, and people are known to subconsciously reveal information about themselves via their motion, it is natural to question the extent to which personal attributes are inferable in VR. According to popular literature reviews of the field [10,14,16,19,23,24,34,39,42,50,61,70], the vast majority of existing VR privacy research has focused on passive observation.…”
Section: Related Work
Confidence: 99%
“…It has long been understood that individuals exhibit distinct biomechanical motion patterns that can be used to identify them or infer their personal attributes [13,27,31,33,48,53], which researchers have shown can be exploited to identify and profile users in VR [23,35,41,44]. While existing work has largely focused on passive observation of VR users, the success of games specifically designed to harvest user data [25] on conventional social platforms motivates us to investigate similar active attacks in VR.…”
Section: Introduction
Confidence: 99%