2023
DOI: 10.48550/arxiv.2302.10065
Preprint

Yet another fast variant of Newton's method for nonconvex optimization

Abstract: A second-order algorithm is proposed for minimizing smooth nonconvex functions that alternates between regularized Newton and negative curvature steps. In most cases, the Hessian matrix is regularized with the square root of the current gradient and an additional term taking moderate negative curvature into account, a negative curvature step being taken only exceptionally. As a consequence, the proposed method requires the solution of only a single linear system at nearly all iterations. We establish that at …
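
To make the abstract's description concrete, the sketch below is one plausible reading of the scheme: at typical iterations a single regularized Newton system is solved, with the regularizer built from the square root of the gradient norm plus a term absorbing moderate negative curvature, and a pure negative-curvature step is taken only when the smallest Hessian eigenvalue is strongly negative. The function name, thresholds, step sizes, and stopping rule are illustrative assumptions, not the paper's actual rules.

```python
import numpy as np


def regularized_newton_sketch(grad, hess, x0, tol=1e-8, max_iter=200,
                              curvature_threshold=1e-2, nc_step_size=1.0):
    """Illustrative sketch of a regularized-Newton / negative-curvature loop.

    Assumptions (not from the paper): a fixed curvature threshold, a unit
    negative-curvature step, full Newton steps, and a gradient-norm stop.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) <= tol:  # approximate first-order point reached
            return x
        H = hess(x)
        eigvals, eigvecs = np.linalg.eigh(H)
        lam_min = eigvals[0]  # most negative curvature of the Hessian
        if lam_min < -curvature_threshold:
            # Exceptional case: strong negative curvature.  Move along the
            # eigenvector of the smallest eigenvalue, signed to descend.
            d = eigvecs[:, 0]
            if g @ d > 0:
                d = -d
            x = x + nc_step_size * d
        else:
            # Typical case: one linear system per iteration.  The shift
            # combines sqrt(||g||) with a term that lifts moderate negative
            # curvature back to positive definiteness.
            sigma = np.sqrt(np.linalg.norm(g)) + max(0.0, -lam_min)
            d = np.linalg.solve(H + sigma * np.eye(x.size), -g)
            x = x + d
    return x


if __name__ == "__main__":
    # Toy nonconvex problem f(x, y) = x^4 - x^2 + y^2, with minimizers at
    # (+/- 1/sqrt(2), 0); the start point lies in a negative-curvature region.
    grad = lambda v: np.array([4 * v[0] ** 3 - 2 * v[0], 2 * v[1]])
    hess = lambda v: np.array([[12 * v[0] ** 2 - 2, 0.0], [0.0, 2.0]])
    print(regularized_newton_sketch(grad, hess, x0=[0.1, 1.0]))
```

The full eigendecomposition above is only for clarity of exposition; a practical implementation would estimate the smallest eigenvalue with a Lanczos-type iteration, keeping the dominant per-iteration cost at the single linear solve the abstract emphasizes.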

Cited by 0 publications
References 39 publications (89 reference statements)