2019
DOI: 10.1109/access.2019.2947295

Differential Privacy for Data and Model Publishing of Medical Data

Abstract: Combining medical data and machine learning has fully utilized the value of medical data. However, medical data contain a large amount of sensitive information, and the inappropriate handling of data can lead to the leakage of personal privacy. Thus, both publishing data and training data in machine learning may reveal the privacy of patients. To address the above issue, we propose two effective approaches. One combines a differential privacy and decision tree (DPDT) approach to provide strong privacy guarante…
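The DPDT approach named in the abstract builds on the standard Laplace mechanism of differential privacy. The sketch below is a generic illustration of that primitive only, not the authors' implementation: the function name `laplace_mechanism`, the sensitivity of 1 for a counting query, and the epsilon value are assumptions chosen for the example.

```python
import numpy as np

def laplace_mechanism(true_value: float, sensitivity: float, epsilon: float) -> float:
    """Release a noisy value satisfying epsilon-differential privacy.

    Noise is drawn from Laplace(0, sensitivity / epsilon); a smaller epsilon
    means more noise and a stronger privacy guarantee.
    """
    scale = sensitivity / epsilon
    return true_value + np.random.laplace(loc=0.0, scale=scale)

# Example: privately release a count query over a medical dataset.
# A counting query has sensitivity 1 (one patient changes the count by at most 1).
true_count = 128
noisy_count = laplace_mechanism(true_count, sensitivity=1.0, epsilon=0.5)
print(f"noisy count: {noisy_count:.1f}")
```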

Cited by 42 publications (30 citation statements)
References 39 publications
“…[127], [149], [150] Highly practical, as no computational overhead is involved because no encryption is performed.…”
Section: Differential Privacy (mentioning)
confidence: 99%
“…The time taken to upload the dataset into the cloud is given with respect to increase in records and the increase in number of data analysts respectively. The previous work [19] using the CART method and deep neural network have reached nearer to DP-FCNN, where their difference is 5ms and 5s in their runtime for data processing and query processing. In contrast, all the other papers have higher runtime than this due to the absence of machine learning algorithms and use of poor parameters for preserving privacy.…”
Section: Scalability (mentioning)
confidence: 97%
“…The privacy level was estimated from the trust distance, measured from the Markov process. Then, in machine learning algorithms two approaches were developed to ensure privacy based on the estimation of weight using classification and regression tree (CART) method [19]. The noise to be added is computed using the attribute weights by decision tree.…”
Section: Problem Description (mentioning)
confidence: 99%
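The statement above summarizes the cited idea of computing the added noise from decision-tree attribute weights. The sketch below is one possible reading of that idea, not the method of [19]: a CART model's feature importances split a total privacy budget across attributes, so more important attributes receive a larger budget share and therefore less Laplace noise. The function name `weighted_laplace_noise`, the proportional split, and the toy data are assumptions made for illustration.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def weighted_laplace_noise(X, y, epsilon_total=1.0, sensitivity=1.0, seed=0):
    """Perturb each numeric attribute with Laplace noise whose budget share
    is proportional to the attribute's importance in a fitted CART model.

    Illustrative reading of the cited idea only: the exact weighting rule
    used in [19] may differ.
    """
    rng = np.random.default_rng(seed)
    tree = DecisionTreeClassifier(random_state=seed).fit(X, y)
    weights = tree.feature_importances_                  # CART-derived attribute weights
    weights = np.where(weights > 0, weights, weights.max() * 0.01 + 1e-6)
    budgets = epsilon_total * weights / weights.sum()    # per-attribute epsilon_i
    scales = sensitivity / budgets                       # Laplace scale per attribute
    noise = rng.laplace(0.0, scales, size=X.shape)
    return X + noise, budgets

# Toy, purely synthetic "medical-style" records: age, systolic BP, creatinine.
X = np.array([[63, 140, 1.2], [54, 120, 0.9], [71, 160, 1.8], [49, 110, 0.7]], dtype=float)
y = np.array([1, 0, 1, 0])
X_noisy, per_attr_eps = weighted_laplace_noise(X, y, epsilon_total=1.0)
```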
“…In the digital age, balancing public safety and personal privacy is still enormously challenging [38,39,40]. With the rapid spread of COVID-19, unidentified aggregate information has little value.…”
Section: A Threat Analysis Using the STRIDE Threat Model (mentioning)
confidence: 99%