2012
DOI: 10.1007/s11075-012-9605-7
Banded target matrices and recursive FSAI for parallel preconditioning

Abstract: In this paper we propose a parallel preconditioner for the CG solver based on successive applications of the FSAI preconditioner. We first compute an FSAI factor G_out for the coefficient matrix A, and then another FSAI preconditioner is computed for either the preconditioned matrix S = G_out A G_out^T or a sparse approximation of S. This process can be iterated to obtain a sequence of triangular factors whose product forms the final preconditioner. Numerical results on large SPD matrices arising from geom…
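The recursive construction described in the abstract can be sketched as follows. This is a minimal dense NumPy illustration, not the authors' implementation: the function names `fsai_factor` and `recursive_fsai`, the per-row unit-diagonal scaling, and the fixed per-level pattern are my own choices, and a production FSAI would use sparse storage and parallel row solves.

```python
import numpy as np

def fsai_factor(A, pattern):
    """One FSAI factor G (lower triangular) for SPD A.

    pattern[i] is a sorted list of column indices j <= i (i included).
    Each row solves a small dense SPD system restricted to the pattern,
    then is scaled so that (G A G^T) has a unit diagonal.
    """
    n = A.shape[0]
    G = np.zeros((n, n))
    for i in range(n):
        J = pattern[i]
        AJJ = A[np.ix_(J, J)]
        e = np.zeros(len(J))
        e[-1] = 1.0                       # unit vector for position i
        y = np.linalg.solve(AJJ, e)
        G[i, J] = y / np.sqrt(y[-1])      # y[-1] = e^T AJJ^{-1} e > 0 for SPD AJJ
    return G

def recursive_fsai(A, pattern, levels=2):
    """Product of FSAI factors G = G_levels ... G_1, each computed for the
    successively preconditioned matrix S_0 = A, S_k = G_k S_{k-1} G_k^T."""
    S = A.copy()
    G = np.eye(A.shape[0])
    for _ in range(levels):
        Gk = fsai_factor(S, pattern)
        S = Gk @ S @ Gk.T
        G = Gk @ G
    return G, S
```

Although each G_k is very sparse here (a bidiagonal pattern, say), their product G accumulates fill, which is exactly the effect the recursive scheme exploits: a dense, high-quality preconditioner stored and applied as a few sparse factors.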

Cited by 4 publications (5 citation statements)
References 22 publications
“…The final G implicitly inherits a large number of nonzeroes, even though each G k is quite sparse. This idea was proposed by Kolotilina and Yeremin [1999] and later developed by Wang and Zhang [2003] and Bergamaschi and Martinez [2012]. It looks quite similar to multilevel preconditioning (see e.g., Saad and Suchomel [2002] and Janna et al [2009]).…”
Section: Recurrent FSAI Computation
confidence: 99%
“…It is based upon an algorithm that for each preconditioner row adaptively chooses the most significant terms to be retained. Another interesting strategy consists of building the preconditioner recursively, hence computing a dense and high-quality FSAI as the product of a few sparse factors [Bergamaschi and Martinez 2012].…”
Section: Introduction
confidence: 99%
“…The initial Newton vector is obtained after a small number of iterations of a conjugate gradient procedure for the minimization of the Rayleigh quotient (DACG, [16]). As the initial preconditioner we choose RFSAI [17], which is built by recursively applying the (factorized sparse approximate inverse) FSAI preconditioner developed in [18]. We elected a factorized approximate inverse preconditioner (AIP) since it is more naturally parallelizable than preconditioners based on ILU factorizations.…”
Section: Journal of Applied Mathematics
confidence: 99%
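The DACG step mentioned above — a few conjugate-gradient-type iterations minimizing the Rayleigh quotient to produce the initial Newton vector — can be illustrated with a simplified preconditioned gradient iteration. This is a stand-in, not the DACG algorithm of [16]: DACG additionally uses conjugate search directions, and the function name and loop structure here are my own.

```python
import numpy as np

def rayleigh_quotient_descent(A, M_inv, x0, iters=50):
    """Preconditioned gradient iteration on the Rayleigh quotient
    q(x) = x^T A x / x^T x, approximating the smallest eigenpair of SPD A.
    M_inv applies the preconditioner to a vector."""
    x = x0 / np.linalg.norm(x0)
    q = x @ (A @ x)
    for _ in range(iters):
        g = A @ x - q * x                 # gradient of q, up to a factor 2/x^T x
        d = M_inv(g)                      # preconditioned search direction
        d = d - (x @ d) * x               # orthogonalize against current iterate
        nd = np.linalg.norm(d)
        if nd < 1e-14:                    # stationary: converged
            break
        d /= nd
        V = np.stack([x, d], axis=1)      # orthonormal 2D trial subspace
        w, U = np.linalg.eigh(V.T @ A @ V)
        x = V @ U[:, 0]                   # Ritz vector of the smaller Ritz value
        x /= np.linalg.norm(x)
        q = x @ (A @ x)
    return q, x
```

Each step performs an exact line search via a 2x2 projected eigenproblem on span{x, d}, so the Rayleigh quotient decreases monotonically toward the smallest eigenvalue.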
“…Choice of the Initial Preconditioner. Following the developments in [17], we propose an implicit enlargement of the sparsity pattern using a banded target matrix B; the lower factor of the FSAI preconditioner is obtained by minimizing ‖B − GL‖_F over the set of matrices having a fixed sparsity pattern, L being the exact lower Cholesky factor of A. Denoting with G_out the result of this minimization, we compute explicitly the preconditioned matrix S = G_out A G_out^T and then evaluate a second FSAI factor G_in for S.…”
Section: BFGS Sequence of Preconditioners
confidence: 99%
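The row-wise least-squares problem behind the target-matrix FSAI can be made concrete. The sketch below forms the Cholesky factor L explicitly, which is purely for illustration — the point of the cited construction is precisely that the minimizer can be computed without ever forming L, since the normal-equations matrix reduces to a restriction of A itself. The function name `target_fsai` is my own.

```python
import numpy as np

def target_fsai(A, B, pattern):
    """Factor G minimizing ||B - G L||_F row by row over a fixed
    lower-triangular sparsity pattern, L = lower Cholesky factor of SPD A.
    Illustration only: L is formed explicitly here."""
    L = np.linalg.cholesky(A)
    n = A.shape[0]
    G = np.zeros((n, n))
    for i in range(n):
        J = pattern[i]                    # column indices j <= i, i included
        # Normal equations for row i: (L[J,:] L[J,:]^T) g = L[J,:] B[i,:],
        # and L[J,:] L[J,:]^T is exactly the restriction A[J,J].
        rhs = L[J, :] @ B[i, :]
        G[i, J] = np.linalg.solve(A[np.ix_(J, J)], rhs)
    return G
```

With B = I the right-hand side collapses to a multiple of a unit vector (L is lower triangular), recovering classical FSAI; a banded B instead couples each row to neighboring columns of L, which is the "implicit enlargement" of the pattern described in the excerpt.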