2021
DOI: 10.1109/ojcs.2021.3099108

Concentrated Differentially Private Federated Learning With Performance Analysis

Abstract: Federated learning engages a set of edge devices to collaboratively train a common model without sharing their local data, and has an advantage in user privacy over traditional cloud-based learning approaches. However, recent model inversion attacks and membership inference attacks have demonstrated that shared model updates during the interactive training process can still leak sensitive user information. Thus, it is desirable to provide a rigorous differential privacy (DP) guarantee in federated learning. The ma…

Cited by 22 publications (8 citation statements)
References 26 publications

“…A vehicle connects to an RSU as it enters that RSU's coverage area, and its data is then transmitted to the MECs. The DPFL-F2IDS is defined by FedAvg (14) and a related problem definition for data sharing in IoV scenarios (22). Vehicles V = {V_1, V_2, ..., V_K} are designated as clients in federated learning, while the MEC servers U = {u_1, u_2, ..., u_m} are regarded as servers.…”
Section: Preliminaries of Methods and Problem Definition (mentioning)
confidence: 99%
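As a rough illustration of the client/server roles described in the statement above, the sketch below runs a single FedAvg round in which clients (the vehicles) compute local updates and a server (the MEC) averages them. The local-update rule, dimensions, and toy data are placeholders, not the cited papers' actual models.

```python
# Minimal FedAvg sketch (assumed setup): vehicles act as clients holding local
# model weights, and an MEC server averages their updates. Names are illustrative.
import numpy as np

def local_update(weights, data, lr=0.01):
    """One illustrative local step: nudge weights toward the client's data mean."""
    grad = weights - data.mean(axis=0)   # placeholder gradient
    return weights - lr * grad

def fedavg(client_weights):
    """Server-side FedAvg: element-wise average of the client models."""
    return np.mean(np.stack(client_weights), axis=0)

# Toy round: K vehicles, each with a small local dataset.
K, dim = 5, 3
global_w = np.zeros(dim)
clients = [np.random.randn(20, dim) + k for k in range(K)]
updates = [local_update(global_w.copy(), data) for data in clients]
global_w = fedavg(updates)
```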
“…Wei et al. (21) also proposed that artificial noise can be added to the parameters on the client side before aggregation. Hu et al. (22) suggested a DPFL scheme with periodic averaging and device sampling, integrating FedAvg with a locally applied form of differential privacy named zero-concentrated DP (zCDP). Lu et al. (23) established a problem definition for resource sharing via DPFL in inter-vehicular networks.…”
Section: Literature Review (mentioning)
confidence: 99%
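The client-side perturbation and device sampling described in the statement above can be sketched as follows. This is only an illustrative approximation under assumed parameters (clip norm, noise multiplier, sampling rate); it does not reproduce the exact mechanisms or the zCDP accounting of the cited works.

```python
# Hedged sketch: each sampled device clips its model update to a bounded L2 norm
# and adds Gaussian noise locally before the server averages the updates.
import numpy as np

def privatize_update(update, clip_norm=1.0, sigma=0.5, rng=None):
    """Clip the update to an L2 norm of clip_norm, then add Gaussian noise."""
    rng = rng or np.random.default_rng()
    norm = np.linalg.norm(update)
    clipped = update * min(1.0, clip_norm / (norm + 1e-12))
    return clipped + rng.normal(0.0, sigma * clip_norm, size=update.shape)

def private_round(client_updates, sample_rate=0.5, rng=None):
    """Sample a subset of devices, privatize each update locally, then average."""
    rng = rng or np.random.default_rng()
    k = max(1, int(sample_rate * len(client_updates)))
    chosen = rng.choice(len(client_updates), size=k, replace=False)
    noisy = [privatize_update(client_updates[i], rng=rng) for i in chosen]
    return np.mean(noisy, axis=0)

# Toy usage: 10 clients, 4-dimensional updates.
updates = [np.random.randn(4) for _ in range(10)]
aggregated = private_round(updates)
```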
“…Defense categories and their references (recovered from a flattened table): GDP [28]-[37]; LDP [32], [38]-[52]; Hybrid-DP [53]-[67]; homomorphic encryption [56], …; trusted execution environment [94]-[103].…”
Section: Attack / Attack Target / Defense / Defense Target (table; mentioning)
confidence: 99%
“…In [189], the Skellam mechanism is introduced in place of the Gaussian mechanism, and the authors explore its performance when combined with central RDP and with distributed RDP under secure encryption, respectively. The authors in [190] combine LDP with secure encryption and zCDP to achieve a good utility-privacy trade-off by adding less noise in every training iteration. It is necessary to consider the trade-off between information privacy and model utility.…”
Section: DP-based Aggregation (mentioning)
confidence: 99%
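For the Skellam mechanism mentioned in [189], a minimal sketch of the core idea is shown below: updates are quantized to integers and perturbed with Skellam noise, drawn as the difference of two independent Poisson variables, so the result stays integral (convenient for secure aggregation). The quantization scale and noise parameter are assumed values for illustration, not those of the cited paper.

```python
# Hedged sketch of Skellam-noise perturbation on integer-quantized updates.
import numpy as np

def skellam_noise(shape, mu, rng=None):
    """Skellam(mu, mu) noise: difference of two independent Poisson(mu) draws."""
    rng = rng or np.random.default_rng()
    return rng.poisson(mu, size=shape) - rng.poisson(mu, size=shape)

def quantize_and_perturb(update, scale=1000, mu=50.0):
    """Quantize a real-valued update to integers, then add discrete Skellam noise."""
    q = np.round(update * scale).astype(np.int64)
    return q + skellam_noise(q.shape, mu)

# Toy usage: the server rescales after (securely) aggregating the integer updates.
noisy_int = quantize_and_perturb(np.array([0.12, -0.03, 0.40]))
approx = noisy_int / 1000
```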