2019
DOI: 10.1016/j.cose.2018.12.017
Achieving correlated differential privacy of big data publication

Cited by 24 publications (17 citation statements)
References 5 publications
“…Information Theory Approach [13]; Statistical Correlation Analysis Method [15]; Divide and Conquer Technique for Correlated Big Data Privacy [16]; Exhaustive Combinations, Dependent DP [17]; Concept of Privacy Leakage with Time [18]…”
Section: Sensitivity Analysis Approach
Mentioning confidence: 99%
“…It was the existence of correlation in the datasets [11][9][12]. Initial research on DP ignored this and treated data as IID, but later work showed that real datasets often exhibit high correlation among records, so research accounting for data correlation became significant [13][14][15]. Many researchers have studied the adverse effects of data correlation on data privacy, and various approaches have been proposed in published work to address the privacy threat posed by correlation in datasets.…”
Section: Introduction
Mentioning confidence: 99%
“…Dwork et al. [5] first proposed differential privacy to rigorously protect individual privacy. Then, many works [16][17][18][19][20] applied it to static datasets, which will not be updated in the future.…”
Section: Related Work
Mentioning confidence: 99%
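The excerpt above refers to the differential privacy framework introduced by Dwork et al. As a minimal illustration of the idea (not the cited papers' actual implementations), the classic Laplace mechanism for a counting query can be sketched as follows; the function name and parameters here are illustrative:

```python
import numpy as np

def laplace_mechanism(true_value, sensitivity, epsilon, rng=None):
    """Release true_value plus Laplace noise with scale sensitivity/epsilon.

    For a query with the given global sensitivity, this satisfies
    epsilon-differential privacy in the standard (IID) setting; handling
    correlated records, as the surveyed works do, requires extra care.
    """
    rng = rng or np.random.default_rng()
    scale = sensitivity / epsilon
    return true_value + rng.laplace(loc=0.0, scale=scale)

# Example: a counting query over a static dataset (sensitivity 1).
ages = [23, 35, 41, 29, 52]
true_count = sum(1 for a in ages if a > 30)
noisy_count = laplace_mechanism(true_count, sensitivity=1.0, epsilon=0.5)
```

Smaller epsilon means a larger noise scale and hence stronger privacy at the cost of accuracy.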
“…Among the existing research, the Gaussian mechanism (GM), which draws noise from a Gaussian distribution, combined with MA [11], which provides a tight estimate of the privacy loss, has been widely applied in various SGD-based machine learning tasks [5], [11], [12], [17]-[19]. Besides, considerable research has been conducted to improve the trade-off from other perspectives, which can be categorized as: sensitivity calibration [19], [20], which improves utility by adaptively calibrating the sensitivity; dimension reduction [21] and transformations [22], which reduce the sensitivity in a lower-dimensional or transformed space instead of the original space; correlation exploration [23], [24], which injects less noise according to the correlation between gradients and target parameters; and post-processing [25], which applies denoising techniques to the noisy gradient to improve the utility of the trained model. However, these strategies are usually problem-dependent.…”
Section: Introduction
Mentioning confidence: 99%
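The excerpt above describes the Gaussian mechanism applied to SGD-based training, where per-example gradients are clipped to bound their sensitivity and then perturbed with Gaussian noise. A minimal sketch of that noise-addition step, with illustrative names and parameter choices rather than the cited works' code, might look like:

```python
import numpy as np

def gaussian_perturb_gradient(grad, clip_norm, noise_multiplier, rng=None):
    """Clip a gradient to L2 norm clip_norm (bounding its sensitivity),
    then add Gaussian noise with std = noise_multiplier * clip_norm,
    as in Gaussian-mechanism-based private SGD."""
    rng = rng or np.random.default_rng()
    norm = np.linalg.norm(grad)
    clipped = grad * min(1.0, clip_norm / max(norm, 1e-12))
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=grad.shape)
    return clipped + noise

# Example: a gradient with norm 5 is scaled down to norm 1 before noising.
noisy_grad = gaussian_perturb_gradient(np.array([3.0, 4.0]),
                                       clip_norm=1.0,
                                       noise_multiplier=1.1)
```

The moments accountant (MA) mentioned in the excerpt then tracks the cumulative privacy loss of repeating this step over many training iterations; that bookkeeping is separate from the per-step perturbation sketched here.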