2002
DOI: 10.1137/s089547980139604x

Improved Symbolic and Numerical Factorization Algorithms for Unsymmetric Sparse Matrices

Abstract: We present algorithms for the symbolic and numerical factorization phases in the direct solution of sparse unsymmetric systems of linear equations. We have modified a classical symbolic factorization algorithm for unsymmetric matrices to inexpensively compute minimal elimination structures. We give an efficient algorithm to compute a near-minimal data-dependency graph that is valid irrespective of the amount of dynamic pivoting performed during numerical factorization. Finally, we describe an unsymmetric-pattern …

Cited by 35 publications (35 citation statements)
References 19 publications (11 reference statements)
“…A detailed comparison can be found in [12,13]. A "fail" indicates that the solver ran out of memory, e.g.…”
Section: Results (mentioning, confidence: 99%)
“…The best time is shown in boldface, the second best time is underlined, and the best operation count is indicated by . The last row shows the approximate smallest relative pivot threshold that yielded a residual norm close to machine precision after iterative refinement for each package ([12,13]). …”
Section: Parallel LU Algorithm with a Two-Level Scheduling (mentioning, confidence: 99%)
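The experiment quoted above pairs a relative pivot threshold with iterative refinement of the computed solution. The sketch below is not code from the cited packages; it only illustrates that combination using SciPy's SuperLU wrapper, and the matrix, right-hand side, threshold value, and iteration limit are all illustrative assumptions.

```python
# Minimal sketch: threshold partial pivoting plus iterative refinement
# (assumed setup, not the benchmark code from the cited papers).
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

def solve_with_refinement(A, b, pivot_threshold=0.1, max_iters=5):
    """Factor A with a relative pivot threshold, then iteratively refine x."""
    lu = spla.splu(A, diag_pivot_thresh=pivot_threshold)
    x = lu.solve(b)
    for _ in range(max_iters):
        r = b - A @ x                      # residual of the current iterate
        if np.linalg.norm(r) <= np.finfo(float).eps * np.linalg.norm(b):
            break                          # residual is near machine precision
        x += lu.solve(r)                   # correct x with the same LU factors
    return x, np.linalg.norm(b - A @ x) / np.linalg.norm(b)

# Illustrative use on a small random sparse system.
n = 200
A = sp.random(n, n, density=0.02, format="csc") + sp.eye(n, format="csc")
b = np.ones(n)
x, relres = solve_with_refinement(A, b)
```

Lowering the pivot threshold typically reduces fill-in and factorization cost at the expense of numerical stability; refinement can recover accuracy only as long as the factors are not too inaccurate, which is the trade-off the quoted table measures.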
“…Then the algorithm computes the structure of row i of U by combining the structures of earlier rows whose indices are the nonzeros in row i of L. In general, these minimal edags are often more expensive to compute than the symmetrically-pruned edags, due to the cost of transitively reducing each row. Gupta recently proposed a different algorithm for computing the minimal edags [41]. His algorithm computes the minimal structure of U by rows and of L by columns.…”
Section: Elimination DAGs (mentioning, confidence: 99%)
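The row-merge rule described in this excerpt — the structure of row i of U is obtained by combining the structures of the earlier U rows indexed by the nonzeros of row i of L — can be sketched directly. The Python below is a generic illustration of that rule without pivoting or pruning, not Gupta's algorithm or the minimal-edag construction; representing row structures as Python sets is an assumption made for clarity.

```python
import heapq

def symbolic_lu_rows(A_rows):
    """Row-wise symbolic LU without pivoting (illustrative sketch only).

    A_rows[i] is an iterable of column indices with nonzeros in row i of A.
    Returns L_rows[i] (columns < i) and U_rows[i] (columns >= i) as sets.
    """
    n = len(A_rows)
    L_rows = [set() for _ in range(n)]
    U_rows = [set() for _ in range(n)]
    for i in range(n):
        work = set(A_rows[i])                 # start from the structure of row i of A
        heap = [c for c in work if c < i]     # candidate nonzeros of L[i, :]
        heapq.heapify(heap)
        while heap:
            j = heapq.heappop(heap)           # process columns in increasing order
            if j in L_rows[i]:
                continue                      # row j of U already merged
            L_rows[i].add(j)
            for c in U_rows[j]:               # fill-in contributed by row j of U
                if c != j and c not in work:
                    work.add(c)
                    if c < i:
                        heapq.heappush(heap, c)
        U_rows[i] = {c for c in work if c >= i}
    return L_rows, U_rows

# Illustrative use on a small 4x4 pattern.
L_rows, U_rows = symbolic_lu_rows([{0, 3}, {1, 2}, {1, 2, 3}, {0, 2, 3}])
```

This naive merge touches every structure it unions, which is why, as the excerpt notes, computing minimal (transitively reduced) edags from such row structures can be more expensive than using symmetrically pruned ones.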
“…One such code, Davis's UMFPACK 4, uses the column elimination tree to represent control-flow dependences, and a biclique cover to represent data dependences [9]. Another code, Gupta's WSMP, uses conventional minimal edags to represent control-flow dependences, and specialized dags to represent data dependences [41]. More specifically, Gupta shows how to modify the minimal edags so they exactly represent data dependences in the unsymmetric multifrontal algorithm with no pivoting, and how to modify the edags to represent dependences in an unsymmetric multifrontal algorithm that employs delayed pivoting.…”
Section: Elimination Structures for the Unsymmetric Multifrontal Algorithm (mentioning, confidence: 99%)
“…Static pivoting allows more detailed planning of the scheduling of a parallel algorithm, because the row permutation is known before the numerical factorization begins. Finally, delayed-pivoting algorithms, such as [31], perform both row and column exchanges during the numerical factorization.…”
Section: Introduction (mentioning, confidence: 99%)