2019
DOI: 10.1134/s0005117919080095
Accelerated Gradient-Free Optimization Methods with a Non-Euclidean Proximal Operator

Cited by 9 publications (3 citation statements)
References 6 publications
“…It seems that one can do it analogously. 5 Generalization of the results of [5,10,17] and [1,13] for the gradient-free saddle-point set-up is more challenging. Also, based on combinations of ideas from [1,11] it'd be interesting to develop a mixed method with a gradient oracle for x (outer minimization) and a gradient-free oracle for y (inner maximization).…”
Section: Possible Generalizations
confidence: 99%
“…Mainly driven by applications in imaging and machine learning, the idea of acceleration turned out to be very productive in the last 20 years. During this time span it has been extended to composite optimization [54,133], general proximal setups [67,26], stochastic optimization problems [134,135,136,137,138,139,140], optimization with inexact oracle [141,142,138,139,143,144,145,57], variance reduction methods [148,149,150,151,152,153], alternating minimization methods [154,155], random coordinate descent [156,157,158,159,160,161,162,163,164,154] and other randomized methods such as randomized derivative-free methods [165,164,166,167] and randomized directional search [164,168,169],…”
Section: Accelerated Methods
confidence: 99%
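The excerpt above mentions extensions of acceleration to general (non-Euclidean) proximal setups, which is also the setting named in the title of the paper under review. As a purely illustrative sketch, not code from the paper or from any of the cited works, the following shows a single mirror step with the entropy prox-function on the probability simplex, a standard case in which the non-Euclidean proximal operator has a closed form; the function name, step size, and toy objective are assumptions made here.

```python
import numpy as np

def entropy_prox_step(x, grad, step):
    """One mirror step with the entropy (KL) prox-function on the simplex.

    Solves argmin_{y in simplex} step*<grad, y> + KL(y || x); the closed-form
    solution is y_i proportional to x_i * exp(-step * grad_i).
    Illustrative helper only, not taken from the cited papers.
    """
    logits = np.log(x) - step * grad
    logits -= logits.max()           # shift for numerical stability
    y = np.exp(logits)
    return y / y.sum()

# Toy usage: minimize f(x) = 0.5*||x - c||^2 over the probability simplex.
c = np.array([0.7, 0.2, 0.1])
x = np.full(3, 1.0 / 3.0)
for _ in range(200):
    x = entropy_prox_step(x, grad=x - c, step=0.5)
print(np.round(x, 3))                # approaches c, which already lies in the simplex
```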
“…In [Nesterov and Spokoiny, 2017], the authors consider random derivative-free methods and provide them with some complexity bounds for different classes of convex optimization problems as well as accelerated methods for smooth convex derivative-free optimization. In [Vorontsova et al, 2019], the authors propose an accelerated gradient-free method with a non-Euclidean proximal operator. Paper [Gorbunov et al, 2022] describes an accelerated method for smooth stochastic derivative-free optimization with two-point feedback.…”
Section: Introduction
confidence: 99%
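The last excerpt refers to two-point feedback, i.e. estimating a gradient from two function evaluations along a random direction. Below is a minimal, hypothetical sketch of such an estimator plugged into a plain (non-accelerated, Euclidean) gradient step; it only illustrates the oracle model discussed above and is not the method of the reviewed paper or of the works it cites. The smoothing parameter mu, the step size, and the toy objective are assumptions.

```python
import numpy as np

def two_point_grad_estimate(f, x, mu=1e-4, rng=None):
    """Randomized two-point gradient estimate of f at x.

    Draws a random unit direction e and returns
        d * (f(x + mu*e) - f(x - mu*e)) / (2*mu) * e,
    whose expectation matches grad f(x) up to a small smoothing bias.
    Names and parameter values here are illustrative assumptions.
    """
    rng = np.random.default_rng() if rng is None else rng
    d = x.size
    e = rng.standard_normal(d)
    e /= np.linalg.norm(e)                        # uniform unit direction
    fd = (f(x + mu * e) - f(x - mu * e)) / (2.0 * mu)
    return d * fd * e

# Toy usage: derivative-free descent on a simple quadratic.
f = lambda x: 0.5 * np.sum(x ** 2)
rng = np.random.default_rng(0)
x = np.ones(5)
for _ in range(2000):
    x = x - 0.05 * two_point_grad_estimate(f, x, rng=rng)
print(f(x))                                       # close to 0
```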