2022
DOI: 10.1080/08874417.2022.2089775
Deep Learning: Differential Privacy Preservation in the Era of Big Data

Cited by 12 publications (8 citation statements)
References 66 publications
“…The model's performance, indicated by high accuracy, precision, recall, and F-score metrics, demonstrates the efficacy of machine learning algorithms in interpreting complex physiological data [48]. The segmentation of stroke lesions using the UNet model [49] further illustrates the capability of deep learning methods in medical imaging analysis, providing clear delineation of affected areas for accurate diagnosis [50].…”
Section: A. Interpretation of Results
confidence: 99%
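The citation statement above reports accuracy, precision, recall, and F-score together. As a reminder of how these metrics relate to one another, here is a minimal sketch computing all four from a binary confusion matrix; the labels and predictions are made-up illustrations, not data from the cited work:

```python
# Illustrative sketch: accuracy, precision, recall, and F-score
# from binary ground-truth labels and predictions.

def binary_metrics(y_true, y_pred):
    # Count confusion-matrix cells for the positive class (label 1).
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    accuracy = (tp + tn) / len(y_true)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    # F-score is the harmonic mean of precision and recall.
    f_score = (2 * precision * recall / (precision + recall)
               if precision + recall else 0.0)
    return accuracy, precision, recall, f_score

acc, prec, rec, f1 = binary_metrics([1, 0, 1, 1, 0, 1], [1, 0, 0, 1, 0, 1])
```

High values on all four metrics jointly indicate that a classifier is both rarely wrong overall and balanced between missed positives and false alarms, which is the sense in which the quoted statement uses them.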
“…Algorithm 1 gives an overview of the approach in this centralized setting. Below we give details for each of the steps in Algorithm 1 except for the last one, which simply runs a learning algorithm over the provided differentially private augmented data to obtain a classifier 9 .…”
Section: B. Approach for the Centralized Setting
confidence: 99%
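The quoted approach releases differentially private augmented data and only then trains a classifier on it. Algorithm 1 itself is not reproduced in this excerpt; as a hedged illustration of the core primitive such centralized pipelines rely on, here is a sketch of the Laplace mechanism releasing an ε-differentially private mean. The clipping bounds, ε, and input values are illustrative assumptions, not parameters from the cited paper:

```python
import math
import random

def laplace_noise(scale):
    # Inverse-CDF sampling of a Laplace(0, scale) random variate.
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))

def dp_mean(values, lower, upper, epsilon):
    """Release an epsilon-DP mean via the Laplace mechanism.

    Each value is clipped to [lower, upper], so one record can shift
    the mean by at most (upper - lower) / n; that sensitivity divided
    by epsilon calibrates the noise scale.
    """
    n = len(values)
    clipped = [min(max(v, lower), upper) for v in values]
    sensitivity = (upper - lower) / n
    return sum(clipped) / n + laplace_noise(sensitivity / epsilon)

random.seed(0)
noisy = dp_mean([2.0, 3.5, 4.0, 1.5], lower=0.0, upper=5.0, epsilon=1.0)
```

Smaller ε means stronger privacy but noisier output; a learning algorithm run downstream, as in the quoted step, inherits the privacy guarantee of the released data by post-processing.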
“…Privacy Enhancing Technologies (PETs) can be a potential solution to this conundrum, and the national strategy to advance privacy-preserving data sharing and analytics recognizes that PETs can protect privacy by removing personal information, by minimizing or reducing personal data, or by preventing undesirable processing of data, while maintaining the functionality of a system 3 . However, despite the development of advanced PETs such as secure multiparty computation [1], homomorphic encryption [2], differential privacy [3], zero knowledge proofs [4], synthetic data [5], federated learning [6], and trusted execution environments [7], as well as significant development of research papers applying them to solve problems [8], [9], their practical use is still quite limited.…”
Section: Introduction
confidence: 99%
“…Anonymization techniques can be used in the publication of data sets for research purposes, enabling information sharing without compromising individual privacy. However, challenges remain in achieving an optimal balance between privacy preservation and data utility [29]- [33].…”
Section: Practical Implementations and Case Studies
confidence: 99%
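The privacy–utility balance this statement describes can be made concrete with a toy k-anonymity-style generalization: coarsening a quasi-identifier (here, age) into wider bins hides individuals inside larger groups, at the cost of discarding detail. The ages, bin widths, and k below are illustrative assumptions, not from the cited work:

```python
from collections import Counter

def generalize_ages(ages, bin_width):
    # Replace each exact age with its bin label, e.g. 37 -> "30-39"
    # for bin_width 10. Wider bins = less utility, more privacy.
    def to_bin(a):
        lo = (a // bin_width) * bin_width
        return f"{lo}-{lo + bin_width - 1}"
    return [to_bin(a) for a in ages]

def is_k_anonymous(values, k):
    # k-anonymity on this attribute: every generalized value must be
    # shared by at least k records.
    return all(count >= k for count in Counter(values).values())

ages = [21, 22, 23, 28, 36, 37, 38]
coarse = generalize_ages(ages, bin_width=10)
```

With width-10 bins every record shares its bin with at least two others (3-anonymous), while width-5 bins leave one record in a bin of its own; that trade-off is exactly the tension the quoted passage notes between privacy preservation and data utility.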