Decentralized Asynchronous Nonconvex Stochastic Optimization on Directed Graphs (2023)
DOI: 10.1109/tcns.2023.3242043

Cited by 3 publications (1 citation statement)
References 14 publications
“…Furthermore, a distributed derivative-free GT algorithm was proposed in [27], where the zero-order stochastic GT estimator is used in the iterations. In [28], the distributed stochastic GT algorithm was also extended to non-convex optimization problems over unbalanced networks. More recently, several variance reduction techniques were employed in distributed GT algorithm to reduce the variances of the stochastic gradient estimators [4], [19], [29].…”
Section: Introduction (citation type: mentioning)
Confidence: 99%
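The statement above refers to distributed stochastic gradient-tracking (GT) methods. A minimal sketch of the basic GT update is given below; the quadratic local objectives, the mixing matrix, the stepsize, and the additive gradient noise are all illustrative assumptions, not the algorithms from the cited works (which additionally handle directed/unbalanced graphs, asynchrony, and variance reduction).

```python
import numpy as np

# Sketch of a synchronous decentralized stochastic gradient-tracking (GT)
# iteration on an undirected network. Illustrative only: the local objectives
# f_i(x) = 0.5*(x - b_i)^2, the mixing matrix W, and the noise model are
# assumptions made for this example.

rng = np.random.default_rng(0)

n = 3                                  # number of agents
b = np.array([1.0, 2.0, 3.0])          # local minimizers; global optimum = mean(b)
W = np.array([[0.50, 0.25, 0.25],      # doubly stochastic mixing matrix
              [0.25, 0.50, 0.25],
              [0.25, 0.25, 0.50]])
alpha, sigma = 0.05, 0.01              # stepsize and gradient-noise level

def stoch_grad(x):
    """Stochastic gradients of the local quadratics: (x_i - b_i) plus noise."""
    return (x - b) + sigma * rng.standard_normal(n)

x = np.zeros(n)                        # local decision variables
y = stoch_grad(x)                      # trackers, initialized to g_i(x_i^0)
g_prev = y.copy()

for _ in range(1000):
    x = W @ x - alpha * y              # consensus step + descent along tracker
    g = stoch_grad(x)
    y = W @ y + g - g_prev             # y_i tracks the network-average gradient
    g_prev = g

print(x)  # all agents should end up near the global minimizer mean(b) = 2.0
```

The tracker update `y = W @ y + g - g_prev` preserves the invariant that the average of the `y_i` equals the average of the current stochastic gradients, which is what lets each agent descend along an estimate of the global gradient rather than only its local one.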