2021
DOI: 10.48550/arxiv.2104.02564
Preprint
Hölder Gradient Descent and Adaptive Regularization Methods in Banach Spaces for First-Order Points

Abstract: This paper considers optimization of smooth nonconvex functionals in smooth infinite-dimensional spaces. A Hölder gradient descent algorithm is first proposed for finding approximate first-order points of regularized polynomial functionals. This method is then applied to analyze the evaluation complexity of an adaptive regularization method which searches for approximate first-order points of functionals with β-Hölder continuous derivatives. It is shown that finding an ε-approximate first-order point requires …
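The paper's algorithms are stated in infinite-dimensional Banach spaces, which is beyond a short snippet. As a rough finite-dimensional illustration only, a gradient step for a β-Hölder-continuous gradient (step size taken from the standard Hölder descent-lemma bound — an assumption, not necessarily the paper's exact step rule) might be sketched as:

```python
import numpy as np

def holder_gradient_descent(grad, x0, L, beta, eps=1e-6, max_iter=1000):
    """Gradient descent for a gradient that is beta-Hoelder continuous:
        ||grad(x) - grad(y)|| <= L * ||x - y||**beta,  0 < beta <= 1.
    Stops at an eps-approximate first-order point, i.e. ||grad(x)|| <= eps.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        gn = np.linalg.norm(g)
        if gn <= eps:
            break
        # Step length minimizing the Hoelder descent-lemma upper bound
        # f(x - t g) <= f(x) - t ||g||^2 + L t^(1+beta) ||g||^(1+beta) / (1+beta).
        t = (gn ** (1.0 - beta) / L) ** (1.0 / beta)
        x = x - t * g
    return x
```

For β = 1 this reduces to ordinary gradient descent with step 1/L; e.g. on f(x) = ‖x‖² (so grad(x) = 2x and L = 2) the step is 0.5 and the method reaches the stationary point immediately.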

Cited by 1 publication (1 citation statement)
References 25 publications
“…A third possibility is to consider optimization in infinite-dimensional smooth Banach spaces, a development presented for the standard framework in [19]. This requires specific techniques for computing the step and a careful handling of the norms involved.…”
Section: Discussion
confidence: 99%