Locally Private Graph Neural Networks
2020 | Preprint
DOI: 10.48550/arxiv.2006.05535

Cited by 5 publications (10 citation statements) | References 0 publications
“…Based on the privacy budget and the mechanism to be protected, a calibrated level of Gaussian or Laplace noise is injected to achieve a differentially private mechanism. Recently, various differential-privacy-preserving deep learning methods [1,8,141,152] have been proposed to protect the privacy of training data. For instance, NoisySGD [1] adds noise to the gradients during model training so that the trained model parameters will not leak training data, with a certain guarantee.…”
Section: Differential Privacy
confidence: 99%
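The mechanisms this statement describes can be sketched briefly. The function names below are illustrative, not from any cited library; the Gaussian calibration uses the classic analytic bound, and the last function mimics a NoisySGD-style update (per-example gradient clipping plus Gaussian noise):

```python
import numpy as np

def laplace_mechanism(value, sensitivity, epsilon):
    """epsilon-DP: add Laplace noise with scale = sensitivity / epsilon."""
    return value + np.random.laplace(loc=0.0, scale=sensitivity / epsilon)

def gaussian_mechanism(value, sensitivity, epsilon, delta):
    """(epsilon, delta)-DP: add Gaussian noise calibrated by the
    classic bound sigma = sensitivity * sqrt(2 ln(1.25/delta)) / epsilon."""
    sigma = sensitivity * np.sqrt(2.0 * np.log(1.25 / delta)) / epsilon
    return value + np.random.normal(loc=0.0, scale=sigma)

def noisy_sgd_step(param, per_example_grads, lr, clip_norm, noise_mult):
    """NoisySGD-style update: clip each per-example gradient to clip_norm,
    sum, add Gaussian noise scaled by noise_mult * clip_norm, then take
    an averaged gradient step."""
    clipped = [g * min(1.0, clip_norm / max(np.linalg.norm(g), 1e-12))
               for g in per_example_grads]
    noisy_sum = np.sum(clipped, axis=0) + np.random.normal(
        0.0, noise_mult * clip_norm, size=param.shape)
    return param - lr * noisy_sum / len(per_example_grads)
```

With a very large epsilon the noise vanishes and the mechanisms return (almost) the raw value, which is a quick sanity check that the calibration is wired up correctly.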
“…In differential privacy, a trusted curator is required to apply calibrated noise to achieve DP. To handle the setting of an untrusted curator, local differential privacy methods [8,152], which perturb users' data locally before uploading it to the central server, have also been investigated for privacy protection.…”
Section: Differential Privacy
confidence: 99%
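The local model described here can be illustrated with randomized response, the classic local-DP primitive: the user perturbs the data on-device, so the server never sees the raw value. This is a generic sketch, not the specific mechanisms of [8,152]:

```python
import math
import random

def randomized_response(bit, epsilon):
    """epsilon-LDP report of one private bit: tell the truth with
    probability e^eps / (e^eps + 1), otherwise flip. Runs locally,
    so no trusted curator is needed."""
    p_truth = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    return bit if random.random() < p_truth else 1 - bit

def estimate_frequency(reports, epsilon):
    """Server-side unbiased estimate of the fraction of 1-bits,
    debiasing the known flip probability of randomized response."""
    p = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    observed = sum(reports) / len(reports)
    return (observed - (1.0 - p)) / (2.0 * p - 1.0)
```

The server only aggregates noisy reports; the debiasing step recovers an accurate population statistic even though no individual report is trustworthy.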
“…Existing works on privacy-preserving GNNs mainly focus on a distributed setting in which the node feature values and/or labels are assumed to be private and distributed among multiple distrusting parties [20,21,25,27]. Sajadmanesh and Gatica-Perez [20] assumed that only the node features are sensitive while the graph structure is publicly available, and developed a local differential privacy mechanism to tackle the problem of node-level privacy.…”
Section: Related Work
confidence: 99%
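A minimal sketch of the setting [20] targets: each node's owner perturbs their own feature vector before uploading it, while the graph structure stays public for the server's GNN training. The sketch below uses per-dimension Laplace noise with the budget split by sequential composition; it is a generic illustration, not the multi-bit mechanism of the cited paper:

```python
import numpy as np

def perturb_features_locally(x, epsilon, lo=0.0, hi=1.0):
    """Runs on the user's device: clip the feature vector to [lo, hi],
    then add Laplace noise per dimension, splitting epsilon across the
    d dimensions (sequential composition). Only the noisy vector is
    uploaded; the graph structure is assumed public, as in [20]."""
    x = np.clip(np.asarray(x, dtype=float), lo, hi)
    d = x.shape[0]
    scale = (hi - lo) / (epsilon / d)  # sensitivity / per-dimension budget
    return x + np.random.laplace(0.0, scale, size=d)
```

Splitting the budget across dimensions makes the per-dimension noise grow linearly with d, which is exactly why specialized mechanisms such as the one in [20] sample or encode dimensions instead of perturbing all of them naively.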