2021
DOI: 10.1109/tbdata.2019.2916108
RoD: Evaluating the Risk of Data Disclosure Using Noise Estimation for Differential Privacy

Cited by 8 publications (2 citation statements)
References 21 publications
“…Chorus has been released as open source for protecting individual privacy [12]. However, DP-based PPDL models reduce object-classification accuracy because the large amount of added noise eventually distorts the data [27].…”
Section: Related Work
Confidence: 99%
“…Du and Wang [13] proposed a query model and implemented differential privacy using Laplace noise. Tsou and Chen [17] quantified the disclosure risk and linked differential privacy with k-anonymity. Zhang and Liu [18] proposed a privacy-preserving decision-tree classification model based on a differential privacy mechanism; using the Laplace mechanism and the exponential mechanism, it provides users with a secure data-access interface and optimizes the search scheme to reduce the error rate.…”
Section: Research On the Basic
Confidence: 99%
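The Laplace mechanism mentioned in the excerpt above can be sketched in a few lines. This is a minimal illustration, not code from any of the cited works; the function name and the example query (a counting query, which has sensitivity 1) are assumptions chosen for clarity:

```python
import numpy as np

def laplace_mechanism(true_value: float, sensitivity: float,
                      epsilon: float, rng: np.random.Generator) -> float:
    """Release true_value with Laplace noise of scale sensitivity/epsilon,
    which satisfies epsilon-differential privacy for a query with the
    given L1 sensitivity."""
    scale = sensitivity / epsilon
    return true_value + rng.laplace(loc=0.0, scale=scale)

rng = np.random.default_rng(0)
# Example: a counting query (sensitivity 1) released under epsilon = 0.5.
# Smaller epsilon means larger noise scale, hence the accuracy loss the
# excerpt describes.
noisy_count = laplace_mechanism(true_value=100.0, sensitivity=1.0,
                                epsilon=0.5, rng=rng)
```

The noise scale grows as epsilon shrinks (scale = sensitivity/epsilon), which makes concrete the trade-off noted in the first citation statement: stronger privacy budgets inject more noise and distort the released data more.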