2019
DOI: 10.1137/18m1167498
On the Quadratic Convergence of the Cubic Regularization Method under a Local Error Bound Condition

Abstract: In this paper we consider the cubic regularization (CR) method for minimizing a twice continuously differentiable function. While the CR method is widely recognized as a globally convergent variant of Newton's method with superior iteration complexity, existing results on its local quadratic convergence require a stringent non-degeneracy condition. We prove that under a local error bound (EB) condition, which is a much weaker requirement than the existing non-degeneracy condition, the sequence of iterates generated…
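For reference, the two objects named in the abstract can be written out explicitly. This is a standard formulation given here for orientation; the regularization weight \sigma_k, the solution set \mathcal{X}^*, and the constants \kappa, \rho are notation introduced for this sketch, not taken from the truncated abstract.

Cubic regularization (CR) step:
  x_{k+1} \in \operatorname*{arg\,min}_{x \in \mathbb{R}^n} \; \nabla f(x_k)^\top (x - x_k) + \tfrac{1}{2} (x - x_k)^\top \nabla^2 f(x_k)\,(x - x_k) + \tfrac{\sigma_k}{6} \lVert x - x_k \rVert^3 .

Local error bound (EB) condition:
  \operatorname{dist}(x, \mathcal{X}^*) \le \kappa \,\lVert \nabla f(x) \rVert \quad \text{whenever } \operatorname{dist}(x, \mathcal{X}^*) \le \rho .

The EB condition holds at non-degenerate minimizers, but it can also hold on sets of non-isolated minimizers where the Hessian is singular, which is what makes it the weaker assumption.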

Citation types: 6 supporting, 24 mentioning, 0 contrasting.

Cited by 21 publications (30 citation statements).
References 32 publications (87 reference statements).
“…Thanks to the use of high-order models, our methods are expected to attain a fast local convergence rate, especially for growing q. The results reported here are inspired by [44] and extend the analysis proposed therein.…”
Section: Local Convergence (supporting)
confidence: 71%
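For context, a q-th order method of this type builds, at each iterate x_k, a regularized Taylor model; one standard adaptive-regularization formulation (the notation below is assumed, not quoted from the citing paper) is

  m_k(s) = T_q(x_k, s) + \frac{\sigma_k}{(q+1)!} \lVert s \rVert^{q+1}, \qquad T_q(x_k, s) = f(x_k) + \sum_{j=1}^{q} \frac{1}{j!} \nabla^j f(x_k)[s]^j ,

where \nabla^j f(x_k)[s]^j denotes the j-th derivative tensor applied j times to s. Taking q = 2 recovers the cubic model of the CR method, which is why results valid only for q = 2, such as those of [44], are the natural baseline this analysis extends.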
“…Moreover, we establish local convergence results towards second-order stationary points, which are present neither in [6] nor in [21]. These results not only generalize those in [44], which are valid only for q = 2, but also apply to the one-level methods in [6]. From a practical point of view, we implemented the methods of the family corresponding to q = 2 (which represents a multilevel version of the well-known adaptive regularization by cubics) and to q = 3.…”
supporting
confidence: 67%
“…By analyzing the optimization geometry, recent works [4,20,30,36,43] have shown that many local search algorithms with either an appropriate initialization or a random initialization can provably solve the low-rank matrix recovery problem (1.2) when the measurement operator A satisfies the RIP. In particular, gradient descent with an appropriate initialization is shown to converge to a global optimum at a linear rate [43,52], while quadratic convergence is established for the cubic regularization method [48]. Key to these results are certain error bound conditions, which elucidate the regularity properties of the underlying optimization problem.…”
Section: Related (mentioning)
confidence: 99%
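To make the quoted setting concrete, here is a minimal sketch of gradient descent on a factorized low-rank matrix recovery objective. The objective, data sizes, step size, and initialization below are illustrative assumptions; none of this code is taken from the cited works.

import numpy as np

# Recover M* = U* U*^T (rank r) from m linear measurements b_i = <A_i, M*>,
# via gradient descent on f(U) = (1/(4m)) * sum_i (<A_i, U U^T> - b_i)^2.
rng = np.random.default_rng(0)
n, r, m = 20, 2, 200
U_star = rng.standard_normal((n, r))
A = rng.standard_normal((m, n, n))
A = 0.5 * (A + A.transpose(0, 2, 1))             # symmetrize each A_i
b = np.einsum('kij,ij->k', A, U_star @ U_star.T)

def grad(U):
    # For symmetric A_i: grad f(U) = (1/m) * sum_i (<A_i, U U^T> - b_i) * A_i @ U
    res = np.einsum('kij,ij->k', A, U @ U.T) - b
    return np.einsum('k,kij->ij', res, A) @ U / m

U = U_star + 0.1 * rng.standard_normal((n, r))   # local ("appropriate") initialization
eta = 0.01                                       # small constant step size, chosen by hand
for _ in range(2000):
    U = U - eta * grad(U)

err = np.linalg.norm(U @ U.T - U_star @ U_star.T) / np.linalg.norm(U_star @ U_star.T)
print(f"relative recovery error: {err:.2e}")

The linear rate mentioned in the quote refers to the per-iteration contraction of the recovery error for such gradient steps under the RIP; the cubic regularization method of [48] instead takes the CR step sketched after the abstract, achieving quadratic convergence under an error bound condition.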
“…In the convex setting, some studies have shown the potential of variants of regularized Newton [48,81] and semismooth Newton methods [50,82] under a local error bound. In the smooth nonconvex setting, there are many works relying on Levenberg-Marquardt [3,4,36,89], cubic regularization [91], and regularized Newton [85] methods under variants of local error bounds and Hölder metric subregularity.…”
Section: Subsequential Convergence (mentioning)
confidence: 99%
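As one concrete instance of the quoted Levenberg-Marquardt line of work, here is a minimal sketch of an LM iteration for solving F(x) = 0 with damping tied to the residual norm, the classical error-bound-motivated choice mu_k = ||F(x_k)||. The function names and the toy problem are assumptions for illustration, not drawn from the cited papers.

import numpy as np

def levenberg_marquardt(F, J, x0, iters=20):
    # LM step with error-bound-style damping mu_k = ||F(x_k)||:
    # solve (J^T J + mu_k I) d = -J^T F, then set x <- x + d.
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        Fx, Jx = F(x), J(x)
        mu = np.linalg.norm(Fx)
        if mu < 1e-12:                            # residual (and damping) vanished: done
            break
        d = np.linalg.solve(Jx.T @ Jx + mu * np.eye(x.size), -Jx.T @ Fx)
        x = x + d
    return x

# Toy system with a non-isolated solution set {x : x[0] = 0}: the Jacobian is
# singular at every solution, yet dist(x, X*) = |x[0]| <= ||F(x)|| is a local EB.
F = lambda x: np.array([x[0], x[0] * x[1]])
J = lambda x: np.array([[1.0, 0.0], [x[1], x[0]]])
print(levenberg_marquardt(F, J, np.array([0.5, 0.3])))

Because the Jacobian is rank-deficient on the whole solution line, plain Newton steps are not well defined there; the residual-proportional damping is exactly what lets LM retain fast local convergence under the error bound, mirroring the role the EB condition plays for the CR method in the paper above.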