2018
DOI: 10.1007/s00245-018-9528-3

Gradient Methods on Strongly Convex Feasible Sets and Optimal Control of Affine Systems

Abstract: The paper presents new results about convergence of the gradient projection and conditional gradient methods for abstract minimization problems on strongly convex sets. In particular, linear convergence is proved, although the objective functional need not be convex. Such problems arise, in particular, when a recently developed discretization technique is applied to optimal control problems that are affine with respect to the control. This discretization technique has the advantage of providing high…
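To make the first method named in the abstract concrete, here is a minimal sketch of gradient projection on a strongly convex feasible set. This is not the paper's algorithm or analysis: the Euclidean unit ball, the linear objective, the fixed step size, and all names below are illustrative assumptions.

```python
import numpy as np

def project_to_ball(x, radius=1.0):
    """Euclidean projection onto the ball of the given radius
    (the unit ball is a canonical strongly convex set)."""
    norm = np.linalg.norm(x)
    return x if norm <= radius else (radius / norm) * x

def gradient_projection(grad, x0, step=0.1, iters=200):
    """Iterate x_{k+1} = Proj_X(x_k - step * grad(x_k))."""
    x = x0
    for _ in range(iters):
        x = project_to_ball(x - step * grad(x))
    return x

# Illustrative example: minimize f(x) = c^T x over the unit ball.
# The gradient is constant, so the minimizer is -c / ||c||.
c = np.array([3.0, 4.0])
x_star = gradient_projection(lambda x: c, np.zeros(2))
# x_star ≈ [-0.6, -0.8]
```

Note that the objective here is linear (so not strongly convex); the fast convergence comes from the strong convexity of the feasible set, which is the regime the paper studies.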

Cited by 2 publications (2 citation statements). References 23 publications.
“…Another sufficient assumption for nonempty X ⋆ of VIs is that X is sufficiently strongly convex. This condition has recently been used to show fast convergence of mirror descent and conditional gradient descent (Garber and Hazan, 2015; Veliov and Vuong, 2017). We leave this discussion to Appendix B.…”
Section: EP and VI Perspectives
confidence: 98%
“…Their analysis relies on assumptions that combine scaling inequalities for strongly convex feasible sets and an affine-invariant characterization of smoothness [Jag13]. Finally, strong convexity for sets was also used outside of projection-free optimization techniques in, e.g., [VV20, Bac20].…”
Section: Introduction
confidence: 99%
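The citation statements above concern conditional gradient (Frank-Wolfe) methods on strongly convex sets, where the linear minimization oracle is cheap and, if the gradient stays bounded away from zero, convergence is linear. The sketch below illustrates that setting under stated assumptions: the unit ball as feasible set, a simple quadratic objective, and the standard short-step rule — it is not the specific algorithm of the paper or of the cited works.

```python
import numpy as np

def conditional_gradient(grad, x0, lipschitz, iters=500):
    """Frank-Wolfe with the short-step rule on the Euclidean unit ball.
    The linear minimization oracle over the ball is s = -g / ||g||."""
    x = x0
    for _ in range(iters):
        g = grad(x)
        s = -g / np.linalg.norm(g)      # LMO: argmin_{||s||<=1} <g, s>
        d = s - x
        gap = -g @ d                    # Frank-Wolfe gap (optimality measure)
        if gap <= 1e-12:
            break
        # Short step: minimizes the smoothness upper bound along d.
        gamma = min(1.0, gap / (lipschitz * (d @ d)))
        x = x + gamma * d
    return x

# Illustrative example: f(x) = 0.5 * ||x - b||^2 with b outside the ball,
# so the constrained minimizer is the boundary point b / ||b|| and the
# gradient never vanishes on the feasible set.
b = np.array([2.0, 0.0])
x_star = conditional_gradient(lambda x: x - b, np.array([0.0, 1.0]), lipschitz=1.0)
# x_star ≈ [1.0, 0.0]
```

Because the ball is strongly convex and ||∇f|| ≥ 1 on it, the short-step iterates contract linearly toward the boundary minimizer, which is the phenomenon the quoted passages refer to.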