1981
DOI: 10.1145/355972.355980
Solving Large Full Sets of Linear Equations in a Paged Virtual Store

Abstract: The problem of solving large full sets of linear equations on a computer with a paged virtual memory is considered and a block column algorithm proposed. Details of software design are considered and results of experimental runs on five different computer systems are reported.
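The abstract's "block column algorithm" processes a panel of columns at a time so that each page of the matrix is reused while resident instead of being faulted in on every elimination step. As a minimal sketch of the idea (not the paper's exact scheme), here is a right-looking blocked LU factorization without pivoting, where `b` plays the role of the panel width; the function name and structure are illustrative assumptions:

```python
import numpy as np

def blocked_lu(a, b=2):
    """Illustrative blocked LU (no pivoting): factor a panel of b columns,
    then update the trailing submatrix with one matrix-matrix product.
    On a paged machine the trailing matrix is swept once per panel
    rather than once per column."""
    a = a.astype(float).copy()
    n = a.shape[0]
    for k0 in range(0, n, b):
        k1 = min(k0 + b, n)
        # unblocked elimination restricted to the current panel
        for k in range(k0, k1):
            a[k + 1:, k] /= a[k, k]
            a[k + 1:, k + 1:k1] -= np.outer(a[k + 1:, k], a[k, k + 1:k1])
        if k1 < n:
            # triangular solve for the block row of U: L11 * U12 = A12
            l11 = np.tril(a[k0:k1, k0:k1], -1) + np.eye(k1 - k0)
            a[k0:k1, k1:] = np.linalg.solve(l11, a[k0:k1, k1:])
            # rank-b update of the trailing submatrix (the Level-3 kernel)
            a[k1:, k1:] -= a[k1:, k0:k1] @ a[k0:k1, k1:]
    return a

# demo: factor a diagonally dominant matrix (safe without pivoting)
rng = np.random.default_rng(0)
A = rng.random((6, 6)) + 6 * np.eye(6)
F = blocked_lu(A, b=2)
L = np.tril(F, -1) + np.eye(6)
U = np.triu(F)
reconstruction_ok = np.allclose(L @ U, A)
```

The factored form overwrites the input, with L below the diagonal (unit diagonal implied) and U on and above it, so `L @ U` reconstructs the original matrix.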

Cited by 21 publications (7 citation statements) · References 6 publications
“…If the vectors and matrices involved are of order n, then the original BLAS (Level 1) includes operations that are of order O(n), the extended or Level 2 BLAS provides operations of order O(n²), and the latest BLAS provides operations of order O(n³) (hence the use of the term Level 3 BLAS). There is a long history of block algorithms: early algorithms utilized a small main memory, with tape or disk as secondary storage [12][13][14][15][16][17]. More recently, several researchers have demonstrated the effectiveness of block algorithms on a variety of modern computer architectures with vector-processing or parallel-processing capabilities [9,10,13,[17][18][19][20][21][22].…”
Section: Matrix-matrix Operations
confidence: 99%
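The three BLAS levels quoted above differ in how much arithmetic they perform per unit of data touched, which is why Level-3 (matrix-matrix) kernels are the natural building block for paged or hierarchical memories. A small NumPy illustration (the variable names are arbitrary):

```python
import numpy as np

n = 512
alpha = 2.0
x = np.ones(n)
y = np.ones(n)
A = np.ones((n, n))
B = np.ones((n, n))

# Level 1: vector-vector, O(n) work on O(n) data (axpy: y <- alpha*x + y)
y = alpha * x + y

# Level 2: matrix-vector, O(n^2) work on O(n^2) data (gemv-like: y2 <- A @ x)
y2 = A @ x

# Level 3: matrix-matrix, O(n^3) work on O(n^2) data (gemm-like: C <- A @ B)
C = A @ B
```

Only the Level-3 product performs more arithmetic (O(n³)) than the data it moves (O(n²)), so a block algorithm built on it can amortize each page transfer over many floating-point operations.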
“…There is a long history of block algorithms: early algorithms utilized a small main memory, with tape or disk as secondary storage [12][13][14][15][16][17]. More recently, several researchers have demonstrated the effectiveness of block algorithms on a variety of modern computer architectures with vector-processing or parallel-processing capabilities [9,10,13,[17][18][19][20][21][22]. Additionally, full blocks (and hence the multiplication of full matrices) might appear as a subproblem when handling large sparse systems of equations [14,[23][24][25].…”
Section: Matrix-matrix Operations
confidence: 99%
“…The I/O of the original algorithm is of order n³. It can easily be seen that the I/O is reduced by a factor q (cf. Du Croz et al. [7]) as opposed to the original algorithm, but it is still of order n³!…”
Section: Blocked Gaussian Elimination
confidence: 99%
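The factor-q reduction quoted above can be checked with a simple counting model. Assuming (as a simplification, in the spirit of the cited analysis) that each elimination step must sweep the whole trailing submatrix through memory, the unblocked algorithm sweeps it once per column while the blocked algorithm sweeps it once per panel of q columns:

```python
def io_unblocked(n):
    # each of the n column steps reads the (n-k) x (n-k) trailing block
    return sum((n - k) * (n - k) for k in range(n))

def io_blocked(n, q):
    # one sweep of the trailing block per panel of q columns
    return sum((n - k0) * (n - k0) for k0 in range(0, n, q))

n, q = 400, 8
ratio = io_unblocked(n) / io_blocked(n, q)  # approaches q for large n
```

Both counts remain of order n³ (roughly n³/3 versus n³/(3q)), matching the quote: blocking reduces the I/O by about a factor q without changing its order.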
“…In [7,14,15] one can find an analysis of blocked Gaussian elimination. To keep this kind of analysis simple, one has to make some simplifications.…”
Section: Blocked Gaussian Elimination Analysis
confidence: 99%
“…Calahan [149]. The block form of Version 4 was also considered in a virtual memory setting by Du Croz et al in [50] and used as a model of a block LU factorization in the BLAS3 standard proposal by Dongarra et al [39]. The curves in Fig.…”
confidence: 99%