2014
DOI: 10.1145/2611523
Private Analysis of Graph Structure

Abstract: We present efficient algorithms for releasing useful statistics about graph data while providing rigorous privacy guarantees. Our algorithms work on data sets that consist of relationships between individuals, such as social ties or email communication. The algorithms satisfy edge differential privacy, which essentially requires that the presence or absence of any particular relationship be hidden. Our algorithms output approximate answers to subgraph counting queries. Given a query graph H, e.g., a triangle, k…

Cited by 135 publications (198 citation statements). References 31 publications.
“…The two main models which have been considered for applying differential privacy to network structured data are edge and node differential privacy [HLMJ09,BBDS13,KNRS13,KRSY11]. Under edge differential privacy, two graphs are considered to be neighbors if one graph can be obtained from the other by adding or removing a single edge.…”
Section: Previous Work On Graphs and Differential Privacy
confidence: 99%
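The edge-neighbor relation quoted above has a simple concrete form: two graphs over the same vertex set are neighbors exactly when their edge sets differ in a single edge. A minimal sketch (function and variable names are illustrative, not taken from the cited works):

```python
def are_edge_neighbors(edges_g1, edges_g2):
    """True iff the two undirected graphs (given as sets of edges over
    the same vertex set) can be obtained from one another by adding or
    removing a single edge -- i.e., they are neighbors under edge
    differential privacy."""
    # Normalize undirected edges so that (u, v) and (v, u) compare equal.
    norm = lambda es: {frozenset(e) for e in es}
    # Symmetric difference of size 1 means exactly one edge differs.
    return len(norm(edges_g1) ^ norm(edges_g2)) == 1

# Removing the single edge (2, 3) from a triangle yields a neighbor.
g1 = {(1, 2), (2, 3), (1, 3)}
g2 = {(1, 2), (1, 3)}
```

An edge-differentially-private mechanism must produce statistically similar outputs on any such pair of neighboring graphs.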
“…Fortunately, real-world graphs typically do not contain such pathological structures, and tend to have local sensitivities that are much lower. Triangle counting has been extensively studied in the context of differential privacy (e.g., [25,37,13,5]). The most effective approaches take advantage of the above observation in some way to reduce the amount of noise required to satisfy differential privacy.…”
Section: C32 Triangle Count
confidence: 99%
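To make the triangle-counting discussion concrete, here is a sketch of an exact triangle count combined with the Laplace mechanism. The cited approaches reduce noise via local or smooth sensitivity; this simplified version instead takes a caller-supplied sensitivity bound (obtaining a valid bound privately is the hard part those papers address), so it is an illustration, not any paper's mechanism:

```python
import math
import random

def triangle_count(edges):
    """Count triangles in an undirected graph given as a set of edges."""
    adj = {}
    for u, v in edges:
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)
    count = 0
    for u, v in edges:
        count += len(adj[u] & adj[v])  # common neighbors close a triangle
    return count // 3  # each triangle is counted once per incident edge

def noisy_triangle_count(edges, sensitivity_bound, epsilon):
    """Laplace mechanism: add noise with scale sensitivity / epsilon.
    sensitivity_bound must upper-bound how much adding or removing one
    edge can change the count (for triangles: the maximum number of
    common neighbors of any vertex pair)."""
    scale = sensitivity_bound / epsilon
    # Sample Laplace(0, scale) via the inverse CDF.
    u = random.random() - 0.5
    noise = -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))
    return triangle_count(edges) + noise
```

On real-world graphs the observation quoted above applies: the true sensitivity is typically far below the worst case, which is what makes low-noise releases feasible.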
“…degree distribution [11], subgraph counting [13,2], clustering coefficient [35], and frequent subgraph mining [33]; and (2) private graph release, which typically involves (privately) fitting a generative graph model to input graphs in order to sample a synthetic graph, which can be used in analyses as a proxy for a real input graph (e.g., [23,30,34,4,36,18,29]). We follow this latter approach.…”
Section: Introduction
confidence: 99%
“…Based on the concept of differential privacy introduced in [21], there has been much work on publishing aggregates of sensitive datasets under differential privacy constraints [23], [3], [7]. Differential privacy has also been applied to protecting sensitive information in graph datasets, so that the released information does not reveal the presence of a sensitive element [12], [13], [18]. Recent work has focused on publishing graph datasets under differential privacy constraints, so that the published graph preserves as many structural properties of the original graph as possible while providing the required privacy [16].…”
Section: Related Work
confidence: 99%
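The "publishing aggregates" line of work mentioned above can be illustrated with a standard edge-differentially-private degree histogram. This is a textbook-style construction, not the specific mechanism of any cited paper: changing one edge shifts the degrees of its two endpoints by one, which moves at most four histogram cells by one each, so the L1 sensitivity is at most 4 and per-cell Laplace(4/ε) noise suffices:

```python
import math
import random

def laplace(scale):
    """Sample Laplace(0, scale) via the inverse CDF."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))

def noisy_degree_histogram(edges, n, epsilon):
    """Release the degree histogram of an n-vertex undirected graph
    under edge differential privacy. One edge change alters two
    degrees, touching at most 4 histogram cells by 1 each, so the
    L1 sensitivity is 4; add Laplace(4/epsilon) noise to each cell."""
    deg = [0] * n
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    hist = [0] * n  # hist[d] = number of vertices with degree d
    for d in deg:
        hist[d] += 1
    return [h + laplace(4 / epsilon) for h in hist]
```

The bounded sensitivity of the histogram (as opposed to raw subgraph counts) is what makes this aggregate comparatively easy to release privately.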