2021
DOI: 10.1088/1361-6420/abf5bc

Optimal-order convergence of Nesterov acceleration for linear ill-posed problems*

Abstract: We show that Nesterov acceleration is an optimal-order iterative regularization method for linear ill-posed problems, provided that a parameter is chosen according to the smoothness of the solution. This result is proven both for an a priori stopping rule and for the discrepancy principle under Hölder source conditions. Furthermore, some converse results and logarithmic rates are verified. The essential tool to obtain these results is a representation of the residual polynomials via Gegenbauer polynomials.
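A minimal sketch of the kind of iteration the abstract refers to, for a discretized linear problem A x = y with noisy data y^δ, ||y^δ − y|| ≤ δ. The function name, the step-size choice, and the constants alpha and tau are illustrative assumptions, not values prescribed by the paper; the discrepancy principle from the abstract serves as the stopping rule.

```python
import numpy as np

def nesterov_landweber(A, y_delta, delta, alpha=3.0, tau=1.5, max_iter=10_000):
    """Nesterov-accelerated Landweber iteration for A x = y_delta (a sketch).

    Stops by the discrepancy principle: at the first k with
    ||A x_k - y_delta|| <= tau * delta. The extrapolation parameter alpha
    is the quantity that, per the abstract, must be matched to the
    smoothness of the solution to obtain optimal-order convergence.
    """
    omega = 1.0 / np.linalg.norm(A, 2) ** 2      # step size: omega * ||A||^2 = 1
    x_prev = x = np.zeros(A.shape[1])
    for k in range(1, max_iter + 1):
        if np.linalg.norm(A @ x - y_delta) <= tau * delta:
            break                                 # discrepancy principle met
        beta = (k - 1) / (k + alpha - 1)          # prescribed momentum weight
        z = x + beta * (x - x_prev)               # two-point extrapolation
        x_prev, x = x, z - omega * A.T @ (A @ z - y_delta)  # gradient step at z
    return x, k
```

With alpha = 3 this reduces to the classical Nesterov momentum sequence; the abstract's point is that alpha has to be chosen according to the smoothness of the solution for the optimal order to be attained.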


Cited by 11 publications (5 citation statements)
References 16 publications
“…Inspired by this, we investigate in this paper how AA(m) for linear problems, with β^(k) given by (1.8), relates to Krylov methods. Following [11,13,16], this is easy to see for AA(1) applied to (1.11), as we now explain. Given x_0, let x_1 = q(x_0).…”
Section: Introduction (mentioning)
confidence: 77%
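For orientation, the Krylov connection mentioned in this quote can be stated in the standard form used for linear regularization methods (textbook facts, not specific to the quoted paper): for exact data y = A x†,

```latex
x_k \in x_0 + \mathcal{K}_k\!\bigl(A^*A,\, A^*(y - A x_0)\bigr),
\qquad
x_k - x^\dagger = r_k(A^*A)\,(x_0 - x^\dagger), \quad r_k(0) = 1,
```

where r_k is the residual polynomial of the method; the Gegenbauer representation mentioned in the abstract identifies these polynomials for Nesterov acceleration.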
“…In previous work, [11,13,16] have interpreted Nesterov acceleration, which is a form of AA(1) with a prescribed sequence of acceleration coefficients β_k, as a Krylov method.…”
Section: Introduction (mentioning)
confidence: 99%
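A compact way to state the correspondence described in the two quotes above: with q the damped-gradient fixed-point map, Nesterov acceleration extrapolates and then applies q. The coefficient formula below is the common parametrized Nesterov choice, assumed here for illustration; the quoted papers' equations (1.8) and (1.11) are not reproduced here.

```latex
q(x) = x - \omega A^*(A x - y), \qquad
x_{k+1} = q\bigl(x_k + \beta_k (x_k - x_{k-1})\bigr), \qquad
\beta_k = \frac{k-1}{k+\alpha-1},
```

a two-point iteration in which β_k is prescribed in advance, in contrast to AA(1) proper, where the coefficient is computed from a residual minimization at each step.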
“…We remark that Nesterov's acceleration strategy was first proposed in [29] to accelerate gradient-type regularization methods for linear as well as nonlinear ill-posed problems in Banach spaces, and various numerical results were reported which demonstrate the striking performance; see also [27,28,35,39,45] for further numerical simulations. Although we have proved in Theorem 3.8 a convergence rate result for the method (3.18) under an a priori stopping rule, the regularization property of the method under the discrepancy principle is not yet established for general strongly convex R. However, when X is a Hilbert space and R(x) = ||x||^2/2, the regularization property of the corresponding method has been established in [35,39], based on a general acceleration framework in [21] using orthogonal polynomials; in particular, it was observed in [35] that the parameter α plays an interesting role in deriving order-optimal convergence rates. For an analysis of Nesterov's acceleration for nonlinear ill-posed problems in Hilbert spaces, one may refer to [28].…”
Section: 3 (mentioning)
confidence: 92%
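For context, the order-optimal rates referred to here are the standard worst-case bounds under a Hölder source condition (classical regularization theory, not a result of the quoted paper):

```latex
x^\dagger = (A^*A)^{\mu} w, \quad \|w\| \le \rho, \quad
\|y^\delta - y\| \le \delta
\;\Longrightarrow\;
\|x^\delta_{k_*} - x^\dagger\| = \mathcal{O}\bigl(\delta^{2\mu/(2\mu+1)}\bigr),
```

and no regularization method can improve this rate uniformly over the source set.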
“…In 2017, Neubauer considered the Nesterov scheme with a stopping rule (the discrepancy principle) and proved convergence rates under standard source conditions [39]. In 2021, Kindermann revisited the Nesterov scheme for linear ill-posed problems [30] and proved that this explicit two-point method is an optimal-order iterative regularization method. In 2017, Hubmer and Ramlau [26] considered the Nesterov scheme (with discrepancy principle) for nonlinear ill-posed problems (under the Scherzer condition [17, equation (11.6)]).…”
Section: Two-point Iterations (mentioning)
confidence: 99%