PETSc Users Manual (Rev. 3.5)
2014
DOI: 10.2172/1178109

Cited by 387 publications (380 citation statements). References 0 publications.
“…The numerical experiments are performed on STAMPEDE [34], a Linux cluster hosted by the Texas Advanced Computing Center. The simulations are performed with PETIGA [13] and utilize the parallel MUMPS solver [2,3,4] with the parallel ScaLAPACK dense solver [11], the parallel SuperLU solver [32,33], and the parallel PaStiX solver [23], all available from the PETSc library [6,7,8]. In the experiments we utilized one core per cluster node in order to maximize the amount of memory available per node.…”
Section: Numerical Results
confidence: 99%
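
The excerpt above switches among several external direct solvers (MUMPS, SuperLU, PaStiX) through PETSc's uniform KSP/PC interface. As a rough illustration only, and not the cited papers' actual code, the C sketch below solves a stand-in tridiagonal system with an LU factorization delegated to MUMPS; it uses the PETSc 3.5-era call PCFactorSetMatSolverPackage() (renamed PCFactorSetMatSolverType() in later releases), and swapping MATSOLVERMUMPS for MATSOLVERSUPERLU_DIST or MATSOLVERPASTIX selects the other solvers.

    /* Minimal sketch (not the cited papers' code): solve Ax = b with an
     * external parallel direct solver selected through PETSc's KSP/PC
     * interface. The matrix is an illustrative stand-in, not an
     * isogeometric system. */
    #include <petscksp.h>

    int main(int argc, char **argv)
    {
      Mat      A;
      Vec      x, b;
      KSP      ksp;
      PC       pc;
      PetscInt i, n = 100, Istart, Iend;

      PetscInitialize(&argc, &argv, NULL, NULL);

      /* Assemble a stand-in tridiagonal matrix in parallel. */
      MatCreate(PETSC_COMM_WORLD, &A);
      MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, n, n);
      MatSetFromOptions(A);
      MatSetUp(A);
      MatGetOwnershipRange(A, &Istart, &Iend);
      for (i = Istart; i < Iend; i++) {
        if (i > 0)   MatSetValue(A, i, i-1, -1.0, INSERT_VALUES);
        if (i < n-1) MatSetValue(A, i, i+1, -1.0, INSERT_VALUES);
        MatSetValue(A, i, i, 2.0, INSERT_VALUES);
      }
      MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
      MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);

      VecCreate(PETSC_COMM_WORLD, &b);
      VecSetSizes(b, PETSC_DECIDE, n);
      VecSetFromOptions(b);
      VecSet(b, 1.0);
      VecDuplicate(b, &x);

      /* Pure direct solve: KSPPREONLY applies the preconditioner once;
       * the LU factorization itself is delegated to MUMPS. */
      KSPCreate(PETSC_COMM_WORLD, &ksp);
      KSPSetOperators(ksp, A, A);
      KSPSetType(ksp, KSPPREONLY);
      KSPGetPC(ksp, &pc);
      PCSetType(pc, PCLU);
      PCFactorSetMatSolverPackage(pc, MATSOLVERMUMPS);
      KSPSetFromOptions(ksp);  /* allow runtime overrides, see below */
      KSPSolve(ksp, b, x);

      KSPDestroy(&ksp);
      MatDestroy(&A);
      VecDestroy(&x);
      VecDestroy(&b);
      PetscFinalize();
      return 0;
    }

Because KSPSetFromOptions() is called, the same binary can also switch solvers at run time, e.g. "mpiexec -n 4 ./solve -ksp_type preonly -pc_type lu -pc_factor_mat_solver_package superlu_dist", which is presumably how solver comparisons like the one described above are scripted.
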
“…The theoretical estimates of the computational costs are compared with numerical experiments performed on the STAMPEDE Linux cluster at the Texas Advanced Computing Center [34]. The experiments are performed with the PETIGA toolkit [13], built on the PETSc library [6,7,8], and utilize the parallel MUMPS solver [2,3,4] with the parallel ScaLAPACK dense solver [11], the parallel SuperLU solver [32,33], and the parallel PaStiX solver [23].…”
Section: Introduction
confidence: 99%
“…Deal.II expands its functionality by providing interfaces to other packages such as Trilinos, PETSc [6], METIS [15] and others. Each package is added and used via a wrapper class in Deal.II.…”
Section: Implementation Details
confidence: 99%
“…PERMON extends PETSc [3] with support for quadratic programming (QP) and non-overlapping domain decomposition methods (DDM), namely of the FETI (Finite Element Tearing and Interconnecting) [13,12,5,24] type. This paper presents the process of solving contact problems using PERMON (Section 3).…”
Section: Introduction
confidence: 99%
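
PERMON follows PETSc's object style: per its published examples, a QP object carries the problem data and a QPS object solves it. The C sketch below sets up a small box-constrained QP in that style, with the lower bound standing in for a contact (non-penetration) condition. All PERMON calls here (PermonInitialize, QPCreate, QPSetOperator, QPSetRhs, QPSetBox, QPTFromOptions, QPSCreate, QPSSetQP, QPSSolve) are reproduced from memory of PERMON's examples and should be treated as assumptions; exact names and signatures vary between releases.

    /* Hedged sketch, not PERMON's canonical example: a small box-constrained
     * QP (min 1/2 x'Ax - b'x subject to x >= 0) assembled with plain PETSc
     * calls and handed to PERMON's QP/QPS objects. PERMON call names below
     * are assumptions based on its published examples. */
    #include <permonqps.h>   /* assumed PERMON header exposing QP and QPS */

    int main(int argc, char **argv)
    {
      Mat      A;
      Vec      b, lb, x;
      QP       qp;
      QPS      qps;
      PetscInt i, n = 100, Istart, Iend;

      PermonInitialize(&argc, &argv, NULL, NULL);  /* assumed; some releases use FllopInitialize */

      /* Tiny SPD Hessian (diagonal) and linear term, assembled with plain PETSc. */
      MatCreate(PETSC_COMM_WORLD, &A);
      MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, n, n);
      MatSetFromOptions(A);
      MatSetUp(A);
      MatGetOwnershipRange(A, &Istart, &Iend);
      for (i = Istart; i < Iend; i++) MatSetValue(A, i, i, 2.0, INSERT_VALUES);
      MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
      MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);

      VecCreate(PETSC_COMM_WORLD, &b);
      VecSetSizes(b, PETSC_DECIDE, n);
      VecSetFromOptions(b);
      VecSet(b, 1.0);
      VecDuplicate(b, &lb);
      VecSet(lb, 0.0);               /* x >= 0 stands in for non-penetration */

      QPCreate(PETSC_COMM_WORLD, &qp);
      QPSetOperator(qp, A);
      QPSetRhs(qp, b);
      QPSetBox(qp, NULL, lb, NULL);  /* assumed signature (qp, IS, lb, ub); older releases take (qp, lb, ub) */
      QPTFromOptions(qp);            /* enable QP transforms chosen on the command line */

      QPSCreate(PETSC_COMM_WORLD, &qps);
      QPSSetQP(qps, qp);
      QPSSetFromOptions(qps);
      QPSSolve(qps);
      QPGetSolutionVector(qp, &x);   /* borrowed reference; do not destroy */

      QPSDestroy(&qps);
      QPDestroy(&qp);
      MatDestroy(&A);
      VecDestroy(&b);
      VecDestroy(&lb);
      PermonFinalize();
      return 0;
    }

QPTFromOptions() is intended to let QP transforms, such as the dualization underlying FETI-type methods, be selected at run time; consult PERMON's documentation for the exact option names.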