2004
DOI: 10.1137/s0895479801398608

Convergence of Restarted Krylov Subspaces to Invariant Subspaces

Abstract: The performance of Krylov subspace eigenvalue algorithms for large matrices can be measured by the angle between a desired invariant subspace and the Krylov subspace. We develop general bounds for this convergence that include the effects of polynomial restarting and impose no restrictions concerning the diagonalizability of the matrix or its degree of nonnormality. Associated with a desired set of eigenvalues is a maximum "reachable invariant subspace" that can be developed from the given starting vector. Con…
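The angle the abstract refers to can be tracked numerically. The sketch below is an illustrative Python/NumPy experiment, not the paper's algorithm: it builds a Krylov basis by Arnoldi-style orthogonalization, measures the largest canonical angle between a chosen invariant subspace and the current Krylov subspace, and restarts with an explicit filter polynomial whose roots are the unwanted Ritz values (exact shifts). The test matrix, parameters, and helper names are assumptions chosen for the demonstration.

```python
# Illustrative sketch (not the paper's algorithm): track the largest canonical
# angle between a desired invariant subspace and a restarted Krylov subspace.
import numpy as np

def krylov_basis(A, v, m):
    """Orthonormal basis of K_m(A, v) built by Arnoldi-style Gram-Schmidt."""
    n = A.shape[0]
    Q = np.zeros((n, m))
    Q[:, 0] = v / np.linalg.norm(v)
    for j in range(1, m):
        w = A @ Q[:, j - 1]
        w -= Q[:, :j] @ (Q[:, :j].T @ w)      # orthogonalize against earlier vectors
        Q[:, j] = w / np.linalg.norm(w)
    return Q

def largest_angle(U, Q):
    """Largest canonical angle between span(U) and the Krylov subspace span(Q)."""
    # sin(theta_max) = ||(I - Q Q^T) U||_2 for orthonormal U, Q with dim U <= dim Q
    s = np.linalg.norm(U - Q @ (Q.T @ U), 2)
    return np.arcsin(min(1.0, s))

rng = np.random.default_rng(0)
n, k, m = 200, 3, 20
# Mildly nonnormal test matrix with three well-separated wanted eigenvalues.
A = np.diag(np.concatenate([np.linspace(0.0, 1.0, n - k), [3.0, 4.0, 5.0]]))
A += 1e-3 * rng.standard_normal((n, n))

# "Desired" invariant subspace: eigenvectors of the k rightmost eigenvalues.
evals, evecs = np.linalg.eig(A)
U, _ = np.linalg.qr(evecs[:, np.argsort(-evals.real)[:k]].real)

v = rng.standard_normal(n)
for cycle in range(6):
    Q = krylov_basis(A, v, m)
    print(f"cycle {cycle}: angle = {largest_angle(U, Q):.2e}")
    # Explicit polynomial restart with exact shifts: the roots of the restart
    # polynomial are the m - k unwanted (smallest) Ritz values.
    ritz = np.sort(np.linalg.eigvals(Q.T @ A @ Q).real)
    w = v.copy()
    for shift in ritz[: m - k]:
        w = A @ w - shift * w                 # apply p(A) v with p(z) = prod (z - shift)
    v = w / np.linalg.norm(w)
```

In exact arithmetic this explicit filtering matches the effect of the implicit restarting used in practical software; here it only serves to show how the subspace angle, rather than individual Ritz values, tracks convergence toward the invariant subspace.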

Cited by 38 publications (56 citation statements)
References 34 publications
“…be interpreted as the opening between R(Q) and the "closest" subspace of R(Y) of dimension k. This one-sided gap was used in [4], [5].…”
Section: Projections onto R(Y) and R(Q), respectively, and let P_Q b…
Citation type: mentioning; confidence: 99%
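For concreteness, the one-sided gap mentioned in this excerpt can be computed directly from orthonormal bases: for Q and Y with orthonormal columns and dim R(Q) <= dim R(Y), delta(R(Q), R(Y)) = ||(I - P_Y) P_Q||_2 = ||Q - Y (Y^T Q)||_2, the sine of the largest angle between R(Q) and its closest subspace of R(Y) of the same dimension. The tiny NumPy sketch below is only an illustration of this quantity; the helper name is an assumption, not taken from either paper.

```python
# Minimal sketch of the one-sided gap: for Q and Y with orthonormal columns,
# delta(R(Q), R(Y)) = ||(I - Y Y^T) Q||_2.
import numpy as np

def one_sided_gap(Q, Y):
    """sin of the largest angle between R(Q) and the closest subspace of R(Y)."""
    return np.linalg.norm(Q - Y @ (Y.T @ Q), 2)

# The quantity is deliberately asymmetric: R(Q) = span{e1} lies inside
# R(Y) = span{e1, e2}, so the gap from R(Q) to R(Y) is 0, while the gap
# from R(Y) to R(Q) is 1 (e2 has no component in R(Q)).
Q = np.array([[1.0], [0.0], [0.0]])
Y = np.array([[1.0, 0.0], [0.0, 1.0], [0.0, 0.0]])
print(one_sided_gap(Q, Y))   # 0.0
print(one_sided_gap(Y, Q))   # 1.0
```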
“…We mention that in [4], [5], invariant subspaces have been used in the convergence analysis of Krylov subspace methods in the context of eigenvalue problems. We also point out that using invariant subspaces and not the spectrum of A directly allows us to treat diagonalizable and nondiagonalizable matrices in the same manner.…”
Section: Introduction
Citation type: mentioning; confidence: 99%
“…The resulting analysis incorporates a different polynomial approximation problem. In typical situations the new bounds are weaker at early iterations, though the asymptotic convergence rate we establish is never worse than that obtained in [3]. In certain situations where the desired eigenvalues are ill-conditioned, these new bounds improve the earlier analysis.…”
Citation type: mentioning; confidence: 55%
“…In this section we characterize those good invariant subspaces (within X_g) that can be captured with Krylov subspaces, adapting the discussion from [3]. Throughout we assume that the starting vector v_1 is fixed.…”
Section: Decomposition of Krylov Spaces with Respect to Eigenspaces of…
Citation type: mentioning; confidence: 99%