2002
DOI: 10.1137/s1064827500381239
Residual and Backward Error Bounds in Minimum Residual Krylov Subspace Methods

Abstract: In [C. C. Paige and Z. Strakoš, Bounds for the least squares distance using scaled total least squares, Numer. Math., to appear], upper and lower bounds on the residual norm of any linear least squares (LS) problem were derived in terms of the total least squares (TLS) correction of the corresponding scaled TLS problem. In this paper the theoretical results of [C. C. Paige and Z. Strakoš, Bounds for the least squares distance using scaled total least squares, Numer. Math., to appear] are extended to the GMRES context.…

Cited by 30 publications (47 citation statements)
References 26 publications
“…Section 6 introduces the key step used to prove convergence of these iterations. In section 7.1 we prove the backward stability of the MGS algorithm applied to solving linear least squares problems of the form required by the MGS-GMRES algorithm, and in section 7.2 we show how loss of orthogonality is directly related to new normwise relative backward errors of a sequence of different least squares problems, supporting a conjecture on the convergence of MGS-GMRES and its loss of orthogonality; see [18]. In section 8.1 we show that at every step MGS-GMRES computes a backward stable solution for that step's linear least squares problem, and in section 8.2 we show that one of these solutions is also a backward stable solution for (1.1) in at most n+1 MGS steps.…”
mentioning
confidence: 68%
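The excerpt above refers to normwise relative backward errors of least squares problems. As a minimal illustration (this is the standard Rigal–Gaches formula for a linear system, not the paper's MGS-GMRES analysis; the function name is my own), the backward error of a computed solution can be evaluated as:

```python
import numpy as np

def normwise_backward_error(A, b, x):
    """Normwise relative backward error (Rigal-Gaches):
        eta(x) = ||b - A x|| / (||A||_2 ||x|| + ||b||).
    A small eta means x is the exact solution of a nearby system."""
    r = b - A @ x
    return np.linalg.norm(r) / (
        np.linalg.norm(A, 2) * np.linalg.norm(x) + np.linalg.norm(b)
    )

A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = np.linalg.solve(A, b)            # a (nearly) exact solution
eta = normwise_backward_error(A, b, x)
# for a backward stable solve, eta is on the order of machine precision
```

A backward stable iteration drives this quantity to roughly unit roundoff, which is the sense in which MGS-GMRES is shown to be backward stable in the quoted work.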
“…[25]. It was used in [8] and [1], and in particular in [18], in which we outlined another possible approach to backward stability analysis of MGS-GMRES. Here we have chosen a different way of proving the backward stability result, and this follows the spirit of [5] and [10].…”
Section: MGS Applied To
mentioning
confidence: 99%
“…While (17) was derived from (9) in [8, Corollary 5.1], it is not always true that the latter is tighter. When δ(γ) ≈ 1 and ‖r‖ ≈ ‖b‖, it is possible for the upper bound in (17) to be smaller than that in (9). But in this case…”
Section: Corollary 2, Under the Same Conditions and Assumptions As In…
mentioning
confidence: 96%
“…The results presented here have been successfully applied outside the Errors-in-Variables Modeling field for analysis of convergence and numerical stability of Krylov subspace methods, see [9], [6].…”
Section: Tightness Parameter
mentioning
confidence: 99%
“…which is the default criterion in ILUPACK [11], based on the analysis in [3]; see also [33]. Since the results were very similar and the full test cannot be done at each iteration, we only show results using the relative residual stopping criterion (6.1).…”
Section: Choosing Parameters
mentioning
confidence: 99%
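The excerpt above mentions a relative residual stopping criterion, ‖b − Ax_k‖/‖b‖ ≤ tol. As a hedged sketch of how such a test is wired into an iteration (using a toy Richardson iteration rather than GMRES or ILUPACK; `richardson_relres` and the step size choice are my own assumptions, valid for symmetric positive definite A):

```python
import numpy as np

def richardson_relres(A, b, tol=1e-8, maxit=1000):
    """Toy stationary (Richardson) iteration that terminates with the
    relative residual test  ||b - A x_k|| / ||b|| <= tol.
    The step size 1/||A||_2 guarantees convergence for SPD A."""
    x = np.zeros_like(b)
    omega = 1.0 / np.linalg.norm(A, 2)   # safe damping for SPD A
    bnorm = np.linalg.norm(b)
    for k in range(maxit):
        r = b - A @ x
        if np.linalg.norm(r) / bnorm <= tol:
            break                        # relative residual criterion met
        x = x + omega * r
    return x, k

A = np.array([[4.0, 1.0], [1.0, 3.0]])   # small SPD example
b = np.array([1.0, 2.0])
x, k = richardson_relres(A, b, tol=1e-8)
```

Scaling the residual by ‖b‖ makes the test independent of the scaling of the right-hand side, which is why it is a common default; the backward error criterion discussed elsewhere on this page additionally accounts for ‖A‖ and ‖x‖.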