2023
DOI: 10.1177/10943420231175462
End-to-end GPU acceleration of low-order-refined preconditioning for high-order finite element discretizations

Abstract: In this article, we present algorithms and implementations for the end-to-end GPU acceleration of matrix-free low-order-refined preconditioning of high-order finite element problems. The methods described here allow for the construction of effective preconditioners for high-order problems with optimal memory usage and computational complexity. The preconditioners are based on the construction of a spectrally equivalent low-order discretization on a refined mesh, which is then amenable to, for example, algebraic…
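To make the idea in the abstract concrete, below is a minimal 1D sketch of low-order-refined (LOR) preconditioning in Python (NumPy/SciPy). It is a toy under stated assumptions, not the paper's GPU implementation: a single spectral element on [-1, 1] with homogeneous Dirichlet conditions, a high-order stiffness matrix assembled on Gauss-Lobatto-Legendre (GLL) nodes, and a low-order operator assembled as the piecewise-linear stiffness matrix on the refined mesh whose vertices are those same GLL nodes. All helper names (gll_nodes, gll_weights, diff_matrix) are ours for illustration and do not come from the paper's code.

import numpy as np
from numpy.polynomial import legendre
from scipy.linalg import eigh


def gll_nodes(p):
    """Gauss-Lobatto-Legendre nodes on [-1, 1]: endpoints plus roots of P_p'."""
    c = np.zeros(p + 1)
    c[p] = 1.0                      # Legendre-basis coefficients of P_p
    return np.concatenate(([-1.0], legendre.legroots(legendre.legder(c)), [1.0]))


def gll_weights(p, x):
    """GLL quadrature weights: w_i = 2 / (p (p+1) P_p(x_i)^2)."""
    c = np.zeros(p + 1)
    c[p] = 1.0
    return 2.0 / (p * (p + 1) * legendre.legval(x, c) ** 2)


def diff_matrix(x):
    """Barycentric differentiation matrix of the Lagrange basis at nodes x."""
    n = len(x)
    w = np.array([1.0 / np.prod(x[i] - np.delete(x, i)) for i in range(n)])
    D = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            if i != j:
                D[i, j] = (w[j] / w[i]) / (x[i] - x[j])
        D[i, i] = -D[i].sum()       # rows of D annihilate constants
    return D


for p in (2, 4, 8, 16, 32):
    x = gll_nodes(p)
    wq = gll_weights(p, x)
    D = diff_matrix(x)

    # High-order stiffness matrix via GLL quadrature (exact here,
    # since the integrand l_i' l_j' has degree 2p - 2).
    A = D.T @ (wq[:, None] * D)

    # Low-order-refined operator: piecewise-linear stiffness matrix on
    # the mesh whose vertices are the GLL nodes of the high-order element.
    h = np.diff(x)
    B = np.zeros_like(A)
    for e in range(p):
        B[e:e + 2, e:e + 2] += (1.0 / h[e]) * np.array([[1.0, -1.0],
                                                        [-1.0, 1.0]])

    # Dirichlet BCs: keep interior DOFs, then measure spectral
    # equivalence via the generalized eigenvalues of A u = lambda B u.
    lam = eigh(A[1:-1, 1:-1], B[1:-1, 1:-1], eigvals_only=True)
    print(f"p = {p:2d}:  lambda in [{lam.min():.3f}, {lam.max():.3f}]")

Running this prints a generalized eigenvalue interval that stays bounded as p grows, which is what spectral equivalence means in practice: a solver built from the sparse low-order matrix (such as algebraic multigrid, as in the paper) remains a uniformly effective preconditioner for the high-order operator, while the low-order matrix needs only a fixed number of nonzeros per degree of freedom.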

Cited by 3 publications (1 citation statement)
References 43 publications
“…GPU accelerators have emerged as a powerful tool in many areas of science and engineering. They are used for accelerating not only traditional high-performance computing (HPC) applications (e.g., [1,2]) but also for the training and application of AI models (e.g., [3,4]). Many supercomputers in the top 500 list (https://www.top500.org (accessed on 15 November 2023)) obtain most of their computing power from these accelerators (e.g., see [5] for a recent review).…”
Section: Introduction
confidence: 99%