2021
DOI: 10.1093/imanum/draa095
Anderson acceleration for contractive and noncontractive operators

Abstract: A one-step analysis of Anderson acceleration with general algorithmic depths is presented. The resulting residual bounds within both contractive and noncontractive settings reveal the balance between the contributions from the higher- and lower-order terms, which are both dependent on the success of the optimization problem solved at each step of the algorithm. The new residual bounds show that the additional terms introduced by the extrapolation are of a higher order than was previously understood…
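As context for the abstract, here is a minimal Python sketch of depth-m Anderson acceleration applied to a fixed-point iteration x_{k+1} = g(x_k). The unconstrained least-squares formulation and all names (anderson, g, m) are illustrative choices, not the paper's notation.

```python
import numpy as np

def anderson(g, x0, m=3, tol=1e-10, max_iter=100):
    """Depth-m Anderson acceleration of the fixed-point iteration x = g(x).

    At each step, a small least-squares problem over differences of the
    most recent residuals w_j = g(x_j) - x_j determines how to combine
    the stored g-evaluations into the next iterate.
    """
    x = np.asarray(x0, dtype=float)
    G, W = [], []  # recent g-evaluations and residuals
    for _ in range(max_iter):
        gx = g(x)
        w = gx - x
        if np.linalg.norm(w) < tol:
            return gx
        G.append(gx)
        W.append(w)
        if len(W) > m + 1:          # keep at most m+1 history entries
            G.pop(0)
            W.pop(0)
        if len(W) == 1:
            x = gx                  # plain fixed-point step to build history
        else:
            # Unconstrained form of the step-k optimization problem:
            # minimize || w_k - dW @ gamma || over gamma.
            dW = np.column_stack([W[j + 1] - W[j] for j in range(len(W) - 1)])
            gamma, *_ = np.linalg.lstsq(dW, w, rcond=None)
            dG = np.column_stack([G[j + 1] - G[j] for j in range(len(G) - 1)])
            x = gx - dG @ gamma     # accelerated iterate (no damping)
    return x
```

For example, anderson(np.cos, np.array([1.0]), m=2) converges to the fixed point of cos much faster than plain iteration. The contractive/noncontractive distinction in the abstract concerns what can be guaranteed about this step when g is, or is not, a contraction.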

Help me understand this report

Search citation statements

Order By: Relevance

Paper Sections

Select...

Citation Types

1
70
0

Cited by 40 publications (74 citation statements). References 29 publications.
“…Our analysis is primarily based on linear algebra and on properties of orthogonal and oblique projectors in particular, to gain insight into the effect of the least-squares minimization in Anderson acceleration. Compared to the main conclusion in [17, Theorem 5.5], our result gives the same convergence factor for the linear term of the nonlinear residual, but we have a simpler upper bound for the higher-order terms that does not exhibit an explicit exponential growth with the acceleration depth and does not involve the squares of the most recent nonlinear residual norm if the minimization at the current step has the largest possible gain. More importantly, our analysis can be extended without difficulty to study the effects of an approximate evaluation of the fixed-point mapping for the inexact method by exploring the impact of small perturbations in the relevant projectors, whereas it seems less clear how this idea could be realized based on the analysis in [17].…”
supporting
confidence: 61%
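The projector language in this excerpt has a direct computational counterpart: the optimized residual is (I − P) w_k, where P is the orthogonal projector onto the span of the recent residual differences, and the "gain" of the minimization is the factor by which this shrinks w_k. A hedged sketch, where the name optimization_gain and the QR-based projector are illustrative choices:

```python
import numpy as np

def optimization_gain(W):
    """Gain of the least-squares step: ||(I - P) w_k|| / ||w_k||, where P is
    the orthogonal projector onto span{w_{j+1} - w_j}. A value near 0 means
    the minimization was highly successful; 1 means no gain.

    W: list of recent residual vectors [w_{k-m}, ..., w_k].
    """
    w = W[-1]
    dW = np.column_stack([W[j + 1] - W[j] for j in range(len(W) - 1)])
    Q, _ = np.linalg.qr(dW)          # orthonormal basis for range(dW)
    optimized = w - Q @ (Q.T @ w)    # (I - P) w_k
    return np.linalg.norm(optimized) / np.linalg.norm(w)
```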
“…The rest of the paper is structured as follows: In Section 2, we consider the fixed-point iteration x_{k+1} = g(x_k) and state assumptions on the mapping g; we also outline Anderson acceleration and then present a few preliminary results for the subsequent analysis. In Section 3, we give a one-step convergence analysis of (exact) Anderson acceleration based on projectors and angles between subspaces, showing a result similar to that in [17] with a new bound for the higher-order terms. In Section 4, we provide an analysis of inexact Anderson acceleration, where each update g(x_k) is allowed to be evaluated with an error proportional to the norm of the residual w_k = g(x_k) − x_k without obviously affecting the convergence rate of the algorithm.…”
mentioning
confidence: 82%
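The inexact setting described in this excerpt can be mimicked by perturbing each evaluation of g by an error bounded by a fixed multiple of the most recent residual norm. A minimal sketch, assuming a random perturbation model; inexact_g, eta, and the use of the previous residual norm as the error budget are all illustrative assumptions:

```python
import numpy as np

def inexact_g(g, x, prev_w_norm, eta=0.1, rng=None):
    """Evaluate g(x) with an added error of norm eta * prev_w_norm,
    i.e. an error proportional to the most recent residual norm ||w_k||.
    """
    rng = np.random.default_rng() if rng is None else rng
    e = rng.standard_normal(np.shape(x))
    norm_e = np.linalg.norm(e)
    if norm_e > 0:
        e *= eta * prev_w_norm / norm_e
    return g(x) + e
```

Running the Anderson sketch above with g replaced by such an inexact evaluation lets one check empirically that, for small eta, the convergence rate is essentially unaffected, as the excerpt claims.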