2020
DOI: 10.1109/tifs.2020.2988575
Federated Learning With Differential Privacy: Algorithms and Performance Analysis

Cited by 1,219 publications (536 citation statements)
References 23 publications
“…Federated learning can be integrated with differential privacy techniques [234] to establish a stronger standard for individual privacy in modelling tasks on sensitive FRM datasets [235]. Differential privacy has been widely accepted by both researchers and practitioners as the de facto standard of privacy [236].…”
Section: Federated Learning (mentioning)
confidence: 99%
“…• FL is a subset of MPC from the m-ary functionality point of view; • If FL attains security in the simulation-based framework, then the resulting SFL is a subset of SMPC. Since there are other known techniques, such as homomorphic encryption (HE) and differential privacy (DP), for attaining the privacy of federated learning procedures [6], [9], and we are not clear whether SFL is a subset of HE or DP within the corresponding framework, we leave these interesting problems to the research community.…”
Section: B. Our Contribution (mentioning)
confidence: 99%
“…The datasets defined in the FL framework can be categorized as horizontal, vertical and hybrid types. Roughly speaking, in horizontal FL, the feature spaces of the datasets held by different organizations (data owners) are the same, but their sample spaces do not overlap [5]; in vertical FL, the sample spaces of the datasets held by different organizations are the same, but their feature spaces do not overlap [6], [7]; in hybrid FL, both the feature spaces and the sample spaces of different organizations overlap [8], [9]. We refer the reader to [10], [14] and the references therein for more details.…”
Section: Introduction (mentioning)
confidence: 99%
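The horizontal/vertical distinction in the excerpt above is easy to see on a toy tabular dataset. The sketch below is illustrative only and not taken from the cited works; the feature names, the two-organization split, and the data values are assumptions.

```python
# Illustrative sketch (hypothetical data): how one tabular dataset would be
# partitioned across two organizations under horizontal vs. vertical FL.
import numpy as np

rng = np.random.default_rng(0)
features = ["age", "income", "balance", "tenure"]   # shared schema (hypothetical)
data = rng.normal(size=(10, len(features)))          # 10 samples x 4 features

# Horizontal FL: both organizations share the SAME feature space,
# but hold DISJOINT sets of samples (rows).
org_a_horizontal = data[:5, :]    # samples 0-4, all features
org_b_horizontal = data[5:, :]    # samples 5-9, all features

# Vertical FL: both organizations hold the SAME samples (rows),
# but DISJOINT subsets of features (columns).
org_a_vertical = data[:, :2]      # all samples, features {age, income}
org_b_vertical = data[:, 2:]      # all samples, features {balance, tenure}

print(org_a_horizontal.shape, org_b_horizontal.shape)  # (5, 4) (5, 4)
print(org_a_vertical.shape, org_b_vertical.shape)      # (10, 2) (10, 2)
```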
“…In particular, a variety of IoT applications call for data mining and learning that are secure and reliable in distributed systems; this represents a vision in which the Internet extends into the real world while embracing everyday objects. This new trend arises from synergistically merging IoT and distributed computing [1], [2].…”
Section: Introduction (mentioning)
confidence: 99%
“…In order to solve this issue, it is necessary to add artificial noise to prevent personal information leakage; one prominent example of this is the concept of differential privacy (DP). The major goal of DP in the FL process is to ensure that the learned model does not reveal whether a certain data point was used during training [2], [4], [7].…”
Section: Introduction (mentioning)
confidence: 99%
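To make the role of the artificial noise in the excerpt above concrete, the following minimal sketch shows one common way a client's model update is clipped and perturbed before server-side aggregation. It is a generic Gaussian-mechanism illustration under assumed clipping norm and noise scale, not the specific algorithm of the cited paper.

```python
# Minimal sketch (assumed parameters): clip each client's update to a bounded
# L2 norm, add Gaussian noise, then aggregate noisy updates FedAvg-style.
import numpy as np

def privatize_update(update, clip_norm=1.0, sigma=0.5, rng=np.random.default_rng()):
    """Clip a client's update to L2 norm <= clip_norm and add Gaussian noise."""
    norm = np.linalg.norm(update)
    clipped = update * min(1.0, clip_norm / (norm + 1e-12))
    noise = rng.normal(0.0, sigma * clip_norm, size=update.shape)
    return clipped + noise

# Server-side aggregation of noisy client updates (mean over clients).
client_updates = [np.random.default_rng(i).normal(size=4) for i in range(5)]
noisy = [privatize_update(u) for u in client_updates]
global_update = np.mean(noisy, axis=0)
print(global_update)
```

The noise scale `sigma` and clipping norm `clip_norm` are placeholder values; in a DP analysis they would be chosen according to the target privacy budget.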