2006
DOI: 10.1137/050633846

A Restarted Krylov Subspace Method for the Evaluation of Matrix Functions

Abstract. We show how the Arnoldi algorithm for approximating a function of a matrix times a vector can be restarted in a manner analogous to restarted Krylov subspace methods for solving linear systems of equations. The resulting restarted algorithm reduces to other known algorithms for the reciprocal and the exponential functions. We further show that the restarted algorithm inherits the superlinear convergence property of its unrestarted counterpart for entire functions and present the results of numerical …
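For orientation, here is a minimal sketch (Python with NumPy/SciPy, not code from the paper) of the plain Arnoldi approximation f(A)b ≈ β V_m f(H_m) e_1 that the restarted algorithm builds on; the function names and the choice f = expm are illustrative assumptions.

```python
# Minimal sketch of the (unrestarted) Arnoldi approximation to f(A)b:
#   f(A) b  ≈  beta * V_m * f(H_m) * e_1,   beta = ||b||_2.
# Names and the example choice f = expm are assumptions, not from the paper.
import numpy as np
from scipy.linalg import expm

def arnoldi(A, b, m):
    """Run m Arnoldi steps; return basis V (n x (m+1)) and Hessenberg H ((m+1) x m)."""
    n = len(b)
    V = np.zeros((n, m + 1))
    H = np.zeros((m + 1, m))
    V[:, 0] = b / np.linalg.norm(b)
    for j in range(m):
        w = A @ V[:, j]
        for i in range(j + 1):                 # modified Gram-Schmidt orthogonalization
            H[i, j] = V[:, i] @ w
            w -= H[i, j] * V[:, i]
        H[j + 1, j] = np.linalg.norm(w)
        if H[j + 1, j] < 1e-14:                # breakdown: Krylov space became invariant
            return V[:, : j + 1], H[: j + 2, : j + 1]
        V[:, j + 1] = w / H[j + 1, j]
    return V, H

def arnoldi_fun(A, b, m, f=expm):
    """Approximate f(A) @ b from the m-dimensional Krylov subspace K_m(A, b)."""
    beta = np.linalg.norm(b)
    V, H = arnoldi(A, b, m)
    k = H.shape[1]
    e1 = np.zeros(k); e1[0] = 1.0
    return beta * V[:, :k] @ (f(H[:k, :k]) @ e1)
```

The restarted variant studied in the paper addresses the main drawback visible in this sketch: the full basis V_m must be stored and orthogonalized against, so memory and work per step grow with m.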

Cited by 127 publications (165 citation statements)
References 29 publications
“…The few theoretical results concerning the convergence of the restarted Arnoldi method for matrix functions available in the literature [2,10] are based on approximation theory and make use of bounds for the error of interpolating polynomials for certain classes of analytic functions, using the connection between Krylov subspace methods and polynomial interpolation (as explained in, e.g., [12,30]). Here we take a different approach, using the intimate relation between Arnoldi's method for matrix functions and FOM for families of shifted linear systems [34], as well as a similar relation with the "shifted GMRES" method from [15].…”
Section: 4)
mentioning
confidence: 99%
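As a reminder of the relation this excerpt alludes to (standard background, not a quotation from the cited works, stated under the assumption that f admits a Cauchy integral representation on a contour Γ enclosing the relevant spectra):

```latex
% Cauchy integral representation (assumed), with \beta := \|b\|_2:
\[
  f(A)\,b \;=\; \frac{1}{2\pi i}\oint_\Gamma f(t)\,(tI - A)^{-1}b\,\mathrm{d}t .
\]
% Krylov spaces are shift-invariant, K_m(A,b) = K_m(tI - A, b), and the FOM iterate
% for the shifted system (tI - A)x = b built from the Arnoldi data (V_m, H_m) is
% x_m(t) = \beta V_m (tI - H_m)^{-1} e_1.  Replacing each shifted solve by its FOM
% iterate reproduces the Arnoldi approximation of f(A)b:
\[
  \frac{1}{2\pi i}\oint_\Gamma f(t)\,\beta V_m (tI - H_m)^{-1}e_1\,\mathrm{d}t
  \;=\; \beta\, V_m\, f(H_m)\, e_1 .
\]
```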
“…To overcome this problem a number of restarting approaches have been proposed in the literature, where, similarly to the techniques for linear systems, after a certain number of iterations the Arnoldi basis is discarded and a new Arnoldi cycle is started to approximate the error of the last iterate, cf. [1,2,10,11,16,23,35]. While much work has been devoted to tuning the methods towards numerical stability and efficiency, there are only a few theoretical results concerning the convergence of these methods.…”
mentioning
confidence: 99%
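To make the restarting idea in this excerpt concrete, here is a minimal sketch of one restart loop (Python/NumPy/SciPy, reusing the `arnoldi` helper from the earlier sketch). It follows an accumulated block-Hessenberg formulation in which only a small Hessenberg matrix is kept between cycles; treat the details as an assumption rather than a transcription of any one of the cited methods, which differ in how the error of the last iterate is represented.

```python
# Sketch of a restarted evaluation of f(A)b: each cycle discards the Arnoldi
# basis and keeps only a small accumulated Hessenberg matrix H_acc on which f
# is evaluated.  Assumes no Arnoldi breakdown and reuses arnoldi() from the
# earlier sketch; names are illustrative, not from the paper.
import numpy as np
from scipy.linalg import expm

def restarted_arnoldi_fun(A, b, m, cycles, f=expm):
    beta = np.linalg.norm(b)
    v = b / beta                       # starting vector of the current cycle
    H_acc = np.zeros((0, 0))           # accumulated Hessenberg matrix, grows by m per cycle
    y = np.zeros(len(b))               # current approximation to f(A) b
    h_prev = 0.0
    for k in range(cycles):
        V, H = arnoldi(A, v, m)        # V: n x (m+1), H: (m+1) x m (no breakdown assumed)
        h_next, v = H[m, m - 1], V[:, m]
        # extend H_acc by one diagonal block, coupled to the previous block by h_prev
        s = H_acc.shape[0]
        H_new = np.zeros((s + m, s + m))
        H_new[:s, :s] = H_acc
        H_new[s:, s:] = H[:m, :m]
        if s > 0:
            H_new[s, s - 1] = h_prev
        H_acc, h_prev = H_new, h_next
        # only the last m components of f(H_acc) e_1 are new in this cycle
        e1 = np.zeros(s + m); e1[0] = 1.0
        y = y + beta * V[:, :m] @ (f(H_acc) @ e1)[s:]
    return y
```

Storage in the large dimension n stays fixed at m + 1 basis vectors per cycle, while f is only ever evaluated on the small (and slowly growing) matrix H_acc.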
“…For particularly challenging problems, however, an unacceptably large approximation space may be required to obtain a satisfactory approximation. This difficulty has led to the study of enhancement techniques that aim either at enriching the approximation space or at making the overall procedure less expensive [4], [16], [18], [22], [33], [39], [40], [41], [46], [54].…”
mentioning
confidence: 99%
“…Due to the equivalence with the Arnoldi method, the algorithm may suffer from the typical disadvantages of the Arnoldi method, for instance, the fact that the computation time per iteration increases with the iteration number. The standard approach to resolving this issue is to use restarting [7], which we leave for future work. We note that the technique we have presented is in principle applicable also to ODEs with several perturbation variables.…”
Section: An A Posteriori Error Estimate for the Krylov Approximation
mentioning
confidence: 99%