2023
DOI: 10.31857/s0132347423060079

Gradient-Free Algorithms for Solving Stochastic Saddle Optimization Problems With the Polyak–Łojasiewicz Condition

S. I. Sadykov,
A. V. Lobanov,
A. M. Raigorodskii

Abstract: This paper focuses on solving a subclass of stochastic nonconvex-concave black-box optimization problems with a saddle point that satisfies the Polyak–Łojasiewicz condition. To solve such a problem, we provide the first, to our knowledge, gradient-free algorithm, whose approach is based on applying a gradient approximation (kernel approximation) to the oracle-shifted stochastic gradient descent algorithm. We present theoretical estimates that guarantee a global linear rate of convergence to the desired…
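The general idea behind such gradient-free methods, estimating gradients from function values only and plugging the estimates into descent-ascent updates, can be sketched as follows. This is a simplified illustration under stated assumptions: the two-point sphere-smoothing estimator, the constant step size, and the toy strongly-convex-strongly-concave objective are chosen for demonstration and are not the paper's kernel estimator or its exact algorithm.

```python
import numpy as np

def two_point_grad(f, x, h, rng):
    """Zeroth-order gradient estimate of f at x via two function evaluations
    along a random direction on the unit sphere (sphere-smoothing estimator)."""
    e = rng.standard_normal(x.shape)
    e /= np.linalg.norm(e)
    # dimension factor x.size makes the estimator unbiased for the smoothed gradient
    return (f(x + h * e) - f(x - h * e)) / (2 * h) * x.size * e

def gradient_free_saddle(f, x0, y0, steps=2000, lr=0.05, h=1e-3, seed=0):
    """Gradient-free stochastic descent-ascent for min_x max_y f(x, y),
    using only zeroth-order (function-value) oracle calls."""
    rng = np.random.default_rng(seed)
    x, y = x0.astype(float), y0.astype(float)
    for _ in range(steps):
        gx = two_point_grad(lambda u: f(u, y), x, h, rng)
        gy = two_point_grad(lambda v: f(x, v), y, h, rng)
        x -= lr * gx  # descent in the minimization variable
        y += lr * gy  # ascent in the maximization variable
    return x, y

# Toy saddle problem: f(x, y) = 0.5||x||^2 + x.y - 0.5||y||^2, saddle at (0, 0)
f = lambda x, y: 0.5 * x @ x + x @ y - 0.5 * y @ y
x_star, y_star = gradient_free_saddle(f, np.ones(3), np.ones(3))
```

On this toy objective, whose unique saddle point is at the origin, the iterates approach (0, 0) despite the algorithm never evaluating a gradient, only function values, which is the black-box setting the abstract describes.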

Cited by 0 publications · References 11 publications