2023
DOI: 10.1109/tcns.2022.3219765
Privacy-Preserving Distributed Online Stochastic Optimization With Time-Varying Distributions

Abstract: Differentially private distributed stochastic optimization has become a hot topic due to the urgent need for privacy protection in distributed stochastic optimization. In this paper, two-time-scale stochastic approximation-type algorithms for differentially private distributed stochastic optimization with time-varying sample sizes are proposed using gradient- and output-perturbation methods. For both the gradient- and output-perturbation cases, the convergence of the algorithm and differential privacy with a finite c…
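As an illustration of the gradient-perturbation idea the abstract names, the sketch below shows one generic differentially private SGD step: clip the gradient, add Gaussian noise scaled to the clipping bound, then descend. This is a minimal sketch of the standard Gaussian mechanism, not the paper's two-time-scale distributed algorithm; the parameters `eta`, `clip`, and `sigma` are hypothetical.

```python
import numpy as np

def dp_sgd_step(x, grad, eta=0.1, clip=1.0, sigma=0.5, rng=None):
    """One gradient-perturbation step (illustrative, not the paper's method):
    clip the gradient to norm `clip`, add Gaussian noise with standard
    deviation sigma * clip per coordinate, then take a gradient step."""
    rng = np.random.default_rng() if rng is None else rng
    norm = np.linalg.norm(grad)
    # Scale the gradient down so its norm is at most `clip`.
    clipped = grad * min(1.0, clip / norm) if norm > 0 else grad
    # Gaussian perturbation calibrated to the clipping bound.
    noisy = clipped + rng.normal(0.0, sigma * clip, size=grad.shape)
    return x - eta * noisy
```

With `sigma=0` the step reduces to plain clipped gradient descent; output perturbation would instead add noise to the final iterate rather than to each gradient.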

Cited by 3 publications
References 72 publications