2022
DOI: 10.1007/s43674-022-00049-5

Differentially private transferrable deep learning with membership-mappings

Cited by 5 publications (24 citation statements)
References 34 publications
“…In contrast, the current study explores geometrically inspired kernel affine hull machines. 2) The input perturbation method (where noise is added to the original data to achieve (ε, δ)-differential privacy of any subsequent computational algorithm) was earlier considered in (Kumar, 2023). However, the current study complements the input perturbation method with a transformation to mitigate the accuracy-loss issue of differential privacy.…”
Section: KAHM Compositions for Data Representation Learning and Class…
confidence: 99%
“…Algorithm 1 Differentially private approximation of a data matrix (Kumar, 2023). Require: Data matrix Y ∈ ℝ^{N×p}; differential privacy parameters:…”
Section: An Optimal (ε, δ)-Differentially Private Noise-Adding Mechanism
confidence: 99%
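The quoted Require line fixes the interface of that noise-adding step: a data matrix Y ∈ ℝ^{N×p} plus differential privacy parameters. A minimal sketch of such input perturbation, assuming the standard analytic Gaussian mechanism with unit L2 sensitivity — the function name and the noise calibration below are illustrative assumptions, not the cited algorithm itself:

```python
import numpy as np

def gaussian_perturb(Y, epsilon, delta, sensitivity=1.0, seed=None):
    """Return a noisy copy of Y giving (epsilon, delta)-DP for any
    subsequent computation on the perturbed data (input perturbation).

    Noise scale from the classical Gaussian-mechanism bound
    (valid for 0 < epsilon <= 1):
        sigma >= sensitivity * sqrt(2 * ln(1.25 / delta)) / epsilon
    where `sensitivity` is the L2 sensitivity of releasing Y directly,
    i.e. the largest change in Y caused by altering one record.
    """
    sigma = sensitivity * np.sqrt(2.0 * np.log(1.25 / delta)) / epsilon
    rng = np.random.default_rng(seed)
    return Y + rng.normal(0.0, sigma, size=Y.shape)

# Privatise a 100 x 5 data matrix, assuming unit per-record sensitivity.
Y = np.random.default_rng(0).uniform(size=(100, 5))
Y_private = gaussian_perturb(Y, epsilon=1.0, delta=1e-5)
```

Because the noise is added once to the data itself, every downstream algorithm inherits the same (ε, δ) guarantee by post-processing; the accuracy loss this causes is exactly what the citing paper's transformation aims to mitigate.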
“…In this study, the variational membership mappings are leveraged to build the required stochastic inverse models. Membership mappings [14,15] have been introduced as an alternative to deep neural networks in order to address issues such as determining the optimal model structure, training with smaller datasets, and the iterative, time-consuming nature of numerical learning algorithms [16–22]. A membership mapping represents data through a fuzzy set (characterized by a membership function whose dimension increases with an increasing data size).…”
Section: Variational Membership Mapping Bayesian Models
confidence: 99%
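The parenthetical above is the key structural idea: the fuzzy set's membership function is built from the data points themselves, so its representation grows with the data size N. A toy sketch of a data-induced membership function — the Gaussian kernel, the max-combination, and the width `sigma` are illustrative assumptions, not the actual membership mappings of the cited work, which are defined via a Gaussian-process-style interpolation:

```python
import numpy as np

def membership_function(data, sigma=1.0):
    """Return a membership function mu: x -> [0, 1] induced by `data`.

    Each of the N stored points contributes one Gaussian kernel, so the
    representation has one component per data point, mirroring (loosely)
    how a membership mapping's dimension increases with data size.
    """
    data = np.asarray(data, dtype=float)

    def mu(x):
        # Squared distances from the query point to all N stored points.
        d2 = np.sum((data - np.asarray(x, dtype=float)) ** 2, axis=1)
        # Membership: 1 exactly on a stored point, decaying with distance.
        return float(np.exp(-d2 / (2.0 * sigma ** 2)).max())

    return mu

mu = membership_function([[0.0, 0.0], [1.0, 1.0]])
```

Adding a data point adds a kernel component to `mu`, which is the "dimension grows with data size" behaviour the quotation describes.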
“…However, in practice, the data distributions are unknown, and thus a way to approximate the defined measures is required. Therefore, a novel method that employs the recently introduced membership mappings [14–22] is presented for approximating the defined privacy-leakage, interpretability, and transferability measures. The method relies on inferring a variational Bayesian model that facilitates an analytical approximation of the information-theoretic measures through a variational optimization methodology.…”
Section: Computation of Information-Theoretic Measures Without Requir…
confidence: 99%