This paper considers the optimization of smooth nonconvex functionals in smooth infinite-dimensional spaces. A Hölder gradient descent algorithm is first proposed for finding approximate first-order points of regularized polynomial functionals. This method is then applied to analyze the evaluation complexity of an adaptive regularization method which searches for approximate first-order points of functionals with β-Hölder continuous derivatives. It is shown that finding an ε-approximate first-order point requires at most O(ε^{-(p+β)/(p+β-1)}) evaluations of the functional and its first p derivatives.
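As a concrete illustration of the p = 1 case, where the bound reduces to the O(ε^{-(1+β)/β}) rate of gradient descent under a β-Hölder continuous gradient, the following minimal Python sketch uses the Hölder step size t = (‖g‖^{1-β}/L)^{1/β}. The function names, the finite-dimensional setting, and the assumption of a known Hölder constant L are illustrative choices, not details taken from the paper.

    import numpy as np

    def holder_gradient_descent(grad, x0, L, beta, eps, max_iter=10_000):
        # Gradient descent with the Holder step t = (||g||**(1-beta) / L)**(1/beta).
        # Under a beta-Holder continuous gradient with constant L, each step
        # decreases the objective by at least
        # (beta/(1+beta)) * L**(-1/beta) * ||g||**((1+beta)/beta),
        # which yields the O(eps**(-(1+beta)/beta)) bound (the p = 1 case above).
        x = np.asarray(x0, dtype=float)
        for _ in range(max_iter):
            g = grad(x)
            gnorm = np.linalg.norm(g)
            if gnorm <= eps:          # eps-approximate first-order point found
                return x
            t = (gnorm ** (1.0 - beta) / L) ** (1.0 / beta)
            x = x - t * g
        return x

For β = 1 (a Lipschitz continuous gradient) the step reduces to the classical 1/L step size and the bound to the familiar O(ε^{-2}) rate.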
A second-order algorithm is proposed for minimizing smooth nonconvex functions that alternates between regularized Newton and negative curvature steps. In most cases, the Hessian matrix is regularized with the square root of the current gradient norm and an additional term taking moderate negative curvature into account, a negative curvature step being taken only exceptionally. As a consequence, the proposed method requires the solution of only a single linear system at nearly all iterations. We establish that at most O(|log ε| ε^{-3/2}) evaluations of the problem's objective function and derivatives are needed for this algorithm to obtain an ε-approximate first-order minimizer, and at most O(|log ε| ε^{-3}) to obtain a second-order one. Initial numerical experiments with two variants of the new method are finally presented.
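A minimal finite-dimensional sketch of one iteration of such a scheme is given below. It assumes the regularization shift √‖g‖ + max(0, −λ_min(H)) and a switch to a negative-curvature direction when λ_min(H) < −√‖g‖; these particular constants, the function names, and the use of a full eigendecomposition (rather than the single linear solve the abstract emphasizes) are illustrative assumptions, not the paper's exact rules.

    import numpy as np

    def newton_or_negative_curvature_step(H, g):
        # One iteration of a regularized-Newton / negative-curvature scheme (sketch).
        # The eigendecomposition is for illustration only: in practice lam_min would
        # be estimated cheaply (e.g. by a Lanczos process) so that nearly every
        # iteration costs just one linear solve, as stated in the abstract.
        gnorm = np.linalg.norm(g)
        eigvals, eigvecs = np.linalg.eigh(H)   # eigenvalues in ascending order
        lam_min = eigvals[0]
        if lam_min < -np.sqrt(gnorm):          # exceptional case: strong negative curvature
            d = eigvecs[:, 0]                  # leftmost eigenvector
            if g @ d > 0:
                d = -d                         # orient as a descent direction
            return abs(lam_min) * d            # negative-curvature step
        # Usual case: absorb moderate negative curvature into the shift and
        # solve a single linear system for the regularized Newton step.
        shift = np.sqrt(gnorm) + max(0.0, -lam_min)
        return np.linalg.solve(H + shift * np.eye(len(g)), -g)

Scaling the shift with √‖g‖ is the standard device in gradient-regularized Newton methods for driving the ε^{-3/2} dependence in the first-order complexity bound.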