Proceedings of the 35th ACM SIGMOD-SIGACT-SIGAI Symposium on Principles of Database Systems 2016
DOI: 10.1145/2902251.2902291

Shortest Paths and Distances with Differential Privacy

Abstract: We introduce a model for differentially private analysis of weighted graphs in which the graph topology (V, E) is assumed to be public and the private information consists only of the edge weights w : E → ℝ+. This can express hiding congestion patterns in a known system of roads. Differential privacy requires that the output of an algorithm provides little advantage, measured by privacy parameters ε and δ, for distinguishing between neighboring inputs, which are thought of as inputs that differ on the contri…
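As an illustration of this model (a minimal sketch, not the paper's own algorithm): with the topology public and only the weights private, one simple baseline is to perturb every edge weight with Laplace noise and release the noisy graph; any shortest-path computation on the released graph then remains private by post-processing. The sensitivity assumption (neighboring weight functions differ by at most 1 in ℓ1 distance) and the use of networkx are illustrative choices, not taken from the paper.

```python
import random
import networkx as nx

def release_noisy_weights(G, epsilon, weight="weight"):
    """Release a noisy copy of the edge weights via the Laplace mechanism.

    Illustrative sketch only: assumes neighboring inputs are weight
    functions whose l1 distance is at most 1, so per-edge Laplace noise
    of scale 1/epsilon suffices.  Anything computed from the released
    graph (e.g. shortest paths) stays private by post-processing.
    """
    H = G.copy()
    for _, _, data in H.edges(data=True):
        # Laplace(scale = 1/epsilon) noise as the difference of two exponentials.
        noise = random.expovariate(epsilon) - random.expovariate(epsilon)
        data[weight] = max(0.0, data[weight] + noise)  # keep weights nonnegative
    return H

# Usage: an approximate, differentially private shortest-path distance.
G = nx.Graph()
G.add_weighted_edges_from([("a", "b", 1.0), ("b", "c", 2.0), ("a", "c", 4.0)])
H = release_noisy_weights(G, epsilon=1.0)
print(nx.shortest_path_length(H, "a", "c", weight="weight"))
```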

Cited by 36 publications (66 citation statements) · References 22 publications
“…We may consider the sensitivity of the MST cost when perturbing a single edge weight of the graph, but keeping the topology of the graph fixed. This is similar to the privacy model considered by Sealfon [61].…”
Section: Background on Sketching (supporting)
confidence: 61%
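For intuition (an added note, not part of the cited text): when the topology is fixed and a single edge weight changes by at most 1, the MST cost changes by at most 1, because the optimal tree for one weight function is still a spanning tree for the other.

```latex
% Sensitivity of the MST cost under a single-edge perturbation (sketch).
% If w and w' agree except on one edge e with |w(e) - w'(e)| <= 1, then
%   MST(w') <= cost_{w'}(T_w) <= MST(w) + 1,  and symmetrically,
% so the MST cost has sensitivity at most 1 in this model.
\[
  \bigl|\mathrm{MST}(w) - \mathrm{MST}(w')\bigr|
  \;\le\; \lVert w - w' \rVert_1 \;\le\; 1 .
\]
```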
“…Instead, a more suitable setting is to define differential privacy in the context of neighboring weight functions for the given graph. This was first formally introduced in [122]. A use case of this is a navigation system that has access to a public map and road user-traffic data and is promised to keep user data private.…”
Section: Edge Weight Differential Privacy (mentioning)
confidence: 99%
“…Difference Sequence [125]; Degree Sequence [30,66,109]; Subgraph Counting [11,32,66]; Edge Weight DP [122]; Local DP [131,150]; Graph Mining…”
Section: Introduction (mentioning)
confidence: 99%
“…Many adaptations of DP simply change the neighborhood definition to protect different types of input data than datasets. DP was adopted to graph-structured data in [38,85,136,140,144,151,153], to streaming data in [51,55,56,64,102], to symbolic control systems in [93], to text vectors in [175], to set operations in [172], to images in [174], to genomic data in [147], to recommendation systems in [80], to machine learning in [119], to location data in [28], to outsourced database systems in [101], to bandit algorithms in [12,156], to RAMs in [3,21,158] and to Private Information Retrieval in [135,155]. We detail the corresponding definitions in the full version of this work.…”
Section: Applying the Definition To Other Types Of Input (mentioning)
confidence: 99%