2022
DOI: 10.1109/tac.2022.3207866

Distributed Optimization Over Time-Varying Graphs With Imperfect Sharing of Information

Cited by 7 publications (8 citation statements)
References 22 publications
“…This assumption is specific to the imperfect information sharing setup and is considered recently in [26], [29], [30].…”
Section: Assumptions (mentioning, confidence: 99%)
“…Based on this, the convergence rate of the algorithm is given in Theorem 4, and the related results are not provided for distributed stochastic optimization even when no privacy protection is considered. Note that the convergence rate for distributed optimization with non-vanishing noises is studied in [29], [32], where σ_k = O((k + a_2)^η), η = 0. Then, the convergence rate studied in this paper is nontrivial and more general than the one in [29], [32].…”
Section: B. Convergence Analysis (mentioning, confidence: 99%)
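The noise condition quoted above is compact; under the assumption (ours, not the excerpt's) that σ_k denotes the noise level at iteration k and a_2 is a constant offset, it can be read as follows:

```latex
% Assumed reading of the excerpt's notation (not stated explicitly there):
% \sigma_k is the noise level at iteration k and a_2 > 0 is a constant offset.
\[
  \sigma_k = O\!\left((k + a_2)^{\eta}\right), \qquad \eta = 0
  \quad\Longrightarrow\quad \sigma_k = O(1),
\]
% so the noise does not decay with k, which is the "non-vanishing noise"
% regime the excerpt contrasts with the more general rates it derives.
```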
“…Distributed (stochastic) optimization has been widely used in various fields, such as big data analytics, finance, and distributed learning [25]-[32]. At present, there are many important techniques to solve distributed stochastic optimization, such as stochastic approximation [29]-[32] and time-varying sample-size.…”
Section: Introduction (mentioning, confidence: 99%)
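As a rough, self-contained illustration of the techniques this excerpt names, the sketch below runs stochastic approximation for distributed stochastic optimization over a time-varying graph with imperfect (noisy) information sharing: pairwise averaging over a rotating edge, additive link noise, and a diminishing step size. The quadratic objectives, gossip schedule, and noise levels are assumptions chosen for illustration, not the algorithms analyzed in the cited works or in the paper above.

```python
import numpy as np

# Minimal sketch: distributed stochastic gradient descent with gossip
# averaging over a time-varying graph and noisy links. All parameters
# below are illustrative assumptions.

rng = np.random.default_rng(0)
n, d, T = 5, 3, 5000

# Local objectives f_i(x) = 0.5 * ||x - b_i||^2; the minimizer of the
# average objective is the mean of the b_i.
b = rng.normal(size=(n, d))
x_star = b.mean(axis=0)
x = np.zeros((n, d))            # one local iterate per agent (rows)

sigma_comm = 0.1                # non-vanishing link-noise level (the eta = 0 case)
sigma_grad = 0.1                # stochastic-gradient noise level

for k in range(T):
    alpha = 1.0 / (k + 10)      # diminishing stochastic-approximation step size

    # Time-varying graph: at iteration k only the edge (i, j) is active,
    # so the effective mixing matrix changes with k.
    i, j = k % n, (k + 1) % n

    # Imperfect sharing: each agent receives a noise-corrupted copy of
    # its neighbor's iterate rather than the true value.
    recv_i = x[j] + sigma_comm * rng.normal(size=d)
    recv_j = x[i] + sigma_comm * rng.normal(size=d)

    x_mixed = x.copy()
    x_mixed[i] = 0.5 * (x[i] + recv_i)
    x_mixed[j] = 0.5 * (x[j] + recv_j)

    # Noisy local gradients: grad f_i(x_i) = x_i - b_i plus sampling noise.
    grads = (x - b) + sigma_grad * rng.normal(size=(n, d))
    x = x_mixed - alpha * grads

print("distance of agent average to optimum:",
      np.linalg.norm(x.mean(axis=0) - x_star))
```

The diminishing step size is what lets the iterates average out the non-vanishing link noise; with a constant step size the error would instead plateau at a noise-dependent level.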