2023
DOI: 10.1049/cit2.12190

Federated learning privacy incentives: Reverse auctions and negotiations

Abstract: The incentive mechanism of federated learning has been a hot topic, but little research has been done on compensating participants for privacy loss. To this end, this study uses the Local SGD federated learning framework and gives a theoretical analysis under differential privacy protection. Based on this analysis, a multi‐attribute reverse auction model is proposed for user selection and for computing the payments made to participants in federated learning. The model uses a mixture of economic and non‐eco…
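As an illustration of the training procedure the abstract refers to, the following is a minimal Python sketch of one differentially private Local SGD client round. The clipping rule, Gaussian noise scale, and names such as `local_sgd_dp` and `grad_fn` are assumptions made for illustration; the paper's exact noise placement, privacy accounting, and auction-based client selection are not reproduced here.

```python
import numpy as np

def local_sgd_dp(theta, data, grad_fn, lr=0.1, local_steps=5,
                 clip_norm=1.0, noise_sigma=0.5, rng=None):
    """Run a few SGD steps locally, then return a clipped, noised model update.

    This is an illustrative sketch of Local SGD with the Gaussian mechanism,
    not the paper's exact algorithm.
    """
    rng = rng or np.random.default_rng()
    theta_local = theta.copy()
    for _ in range(local_steps):
        x = data[rng.integers(len(data))]             # sample one local example
        theta_local -= lr * grad_fn(theta_local, x)   # SGD step: theta <- theta - lr * grad
    update = theta_local - theta                      # model delta to be sent to the server
    norm = np.linalg.norm(update)
    update = update * min(1.0, clip_norm / (norm + 1e-12))            # clip to bound sensitivity
    update += rng.normal(0.0, noise_sigma * clip_norm, update.shape)  # add Gaussian DP noise
    return update

# The server would then average the noised updates from the auction-selected clients, e.g.:
# theta = theta + np.mean([local_sgd_dp(theta, d_i, grad_fn) for d_i in client_data], axis=0)
```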

Cited by 1 publication (2 citation statements)
References 35 publications
“…For deep neural network training, stochastic gradient descent (SGD) is one of the increasingly popular algorithms [23]. In SGD, the standard equation is represented as ${\theta}_{k+1} = {\theta}_{k} - \lambda \left(\frac{\partial J}{\partial \theta}\right)_{k}$, where k represents the iteration step, and λ is denoted as the learning rate.…”
Section: Optimizer Construction
confidence: 99%
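The quoted update rule is straightforward to implement. Below is a minimal sketch applying $\theta_{k+1} = \theta_k - \lambda\left(\frac{\partial J}{\partial \theta}\right)_k$ to a toy quadratic loss $J(\theta) = \theta^{\top}\theta$; the loss, step count, and learning rate are illustrative choices, not taken from the cited work.

```python
import numpy as np

def sgd_step(theta, grad, lr):
    """One SGD iteration: subtract the learning rate times the gradient."""
    return theta - lr * grad

theta = np.array([1.0, -2.0])
lr = 0.1
for k in range(50):
    grad = 2 * theta                 # dJ/dtheta for J(theta) = theta^T theta
    theta = sgd_step(theta, grad, lr)
print(theta)                         # converges toward the minimiser at the origin
```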