2003
DOI: 10.1093/imanum/23.4.581

A note on an SOR-like method for augmented systems

Abstract: Golub et al. (2001, BIT, 41, 71-85) gave a generalized successive over-relaxation method for the augmented systems. In this paper, the connection between the SOR-like method and the preconditioned conjugate gradient (PCG) method for the augmented systems is investigated. It is shown that the PCG method is at least as accurate (fast) as the SOR-like method. Numerical examples demonstrate that the PCG method is much faster than the SOR-like method.
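For orientation, the augmented system that both methods address is the saddle-point form used by Golub et al. (2001). The page does not display it, so the block equation below is a reconstruction consistent with the residual and recovery formulas quoted in the citation statements further down:

\[
\begin{pmatrix} A & B \\ B^{T} & 0 \end{pmatrix}
\begin{pmatrix} x \\ y \end{pmatrix}
=
\begin{pmatrix} b \\ q \end{pmatrix},
\qquad
A \in \mathbb{R}^{m \times m} \text{ symmetric positive definite},
\quad
B \in \mathbb{R}^{m \times n} \text{ of full column rank}.
\]

Eliminating the first block gives the reduced (Schur complement) equation \(B^{T}A^{-1}B\,y = B^{T}A^{-1}b - q\), which is the system the PCG method operates on; the first block is then recovered as \(x = A^{-1}(b - By)\).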

Cited by 14 publications (10 citation statements)
References 6 publications
“…Theorem 4.1 shows that the GSOR method, when the optimal parameters are employed, has the same asymptotic convergence rate as the PCG method studied in [22] (here, we have assumed that x_{PCG} := A^{-1}(b − B y_{PCG}) is computed in exact arithmetic; see Method 3.1 in Section 3). However, we notice that this asymptotic convergence rate is global for the optimal GSOR method, in the sense that it is an estimate with respect to the whole variable, but is partial for the PCG method, in the sense that it is only an estimate with respect to the second block of the whole variable.…”
Section: Theorem 4.1 (Consider the GSOR method. Let …)
Mentioning confidence: 97%
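Read schematically (the norms, constants, and rate symbol below are illustrative placeholders rather than quantities taken from the citing paper), the global/partial distinction drawn above is:

\[
\left\| \begin{pmatrix} x^{(k)} - x^{*} \\ y^{(k)} - y^{*} \end{pmatrix} \right\|
\;\lesssim\; \rho^{k}
\left\| \begin{pmatrix} x^{(0)} - x^{*} \\ y^{(0)} - y^{*} \end{pmatrix} \right\|
\quad \text{(optimal GSOR: whole variable)},
\qquad
\| y^{(k)}_{\mathrm{PCG}} - y^{*} \| \;\lesssim\; \rho^{k}\, \| y^{(0)}_{\mathrm{PCG}} - y^{*} \|
\quad \text{(PCG: second block only)},
\]

with the first block recovered afterwards from \(x_{\mathrm{PCG}} = A^{-1}(b - B y_{\mathrm{PCG}})\).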
“…Although the above PCG method is interesting and valuable, the claims for its convergence rate in Theorems 1 and 2 in [22] are flawed, because the error measures on the x_{PCG}^{(k)} and y_{PCG}^{(k)} components are different. In addition, the authors fail to mention that, compared with SOR-like and GSOR, PCG requires two extra inner products per iteration step, needs to compute the initial residual vector r^{(0)} := B^{T}A^{-1}(B y_{PCG}^{(0)} − b) + q at the start, and has to recover the final approximate vector x_{PCG} := A^{-1}(b − B y_{PCG}) at termination.…”
Section: Method 3.1 (The PCG Method)
Mentioning confidence: 99%
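The following is a minimal sketch (not the authors' code) of the reduced-system conjugate gradient iteration the statement refers to, assuming the augmented system Ax + By = b, B^{T}x = q with A symmetric positive definite and B of full column rank. The preconditioner is omitted (M = I), and the dense solve with A stands in for whatever factorization or inner solver would be used in practice:

import numpy as np

def cg_augmented(A, B, b, q, y0, tol=1e-10, maxit=500):
    """CG on the reduced system B^T A^{-1} B y = B^T A^{-1} b - q (sketch only)."""
    solveA = lambda v: np.linalg.solve(A, v)   # placeholder for a factorization of A
    y = y0.astype(float).copy()
    r = B.T @ solveA(B @ y - b) + q            # initial residual, as quoted above
    p = -r
    rs_old = r @ r
    for _ in range(maxit):
        Sp = B.T @ solveA(B @ p)               # Schur-complement product (B^T A^{-1} B) p
        alpha = rs_old / (p @ Sp)              # inner products such as these account for
        y = y + alpha * p                      # the extra per-step work mentioned above
        r = r + alpha * Sp
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = -r + (rs_new / rs_old) * p
        rs_old = rs_new
    x = solveA(b - B @ y)                      # recover the first block at termination
    return x, y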
“…The above assumptions are necessary to ensure that the augmented linear system (1.1) has a unique solution. The augmented linear system (1.1) arises in many different applications of scientific computing, such as weighted least-squares problems [18,22,27], finite element discretizations of the Navier–Stokes equations [13-15], constrained optimization [25], and equilibrium and saddle-point problems [7,19].…”
Section: Introduction
Mentioning confidence: 99%
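As one concrete instance of the applications listed above (our illustration of the standard weighted least-squares connection, not drawn from the quoted text): the weighted least-squares problem \(\min_{y} (b - By)^{T} W (b - By)\), with W symmetric positive definite, leads to an augmented system of the form (1.1) by introducing the scaled residual \(x = W(b - By)\):

\[
W^{-1}x + By = b, \qquad B^{T}x = 0,
\]

i.e. the block system with \(A = W^{-1}\) and \(q = 0\); eliminating x recovers the weighted normal equations \(B^{T}WBy = B^{T}Wb\).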
“…Recently, the augmented linear system has attracted more and more researchers, and various kinds of iteration methods have been established and discussed: for example, Uzawa-type methods [13,15], preconditioned Krylov subspace methods [7,9,18], relaxation methods [8,10,14,17], and Hermitian and skew-Hermitian splitting methods [3-6,12]. Moreover, the singular augmented linear system has been studied specifically in [11,19].…”
Section: Introduction
Mentioning confidence: 99%