2009
DOI: 10.1007/s00466-009-0450-z

Cardiovascular flow simulation at extreme scale

Abstract: As cardiovascular models grow more sophisticated in terms of the geometry considered, more physiologically realistic boundary conditions are applied, and fluid flow is coupled to structural models, the computational complexity grows. Massively parallel adaptivity and flow solvers with extreme scalability enable cardiovascular simulations to reach an extreme scale while keeping the time-to-solution reasonable. In this paper, we discuss our efforts in this area and provide two demonstrations: one on an extre…

Cited by 39 publications (22 citation statements). References 37 publications (48 reference statements).
“…[27] This capability has been extended [28] to two-phase flows, where we use the level set method to track the boundary between two immiscible fluids (either compressible or incompressible) to model bubble coalescence [29] and two-phase turbulence [9,10]. PHASTA can use anisotropically adapted unstructured grids as well as regular grids, and its highly scalable performance on massively parallel computers has already been demonstrated: the code has shown good scaling out to 288 × 1024 IBM Blue Gene/P processors on JUGENE (Germany) [30] and, more recently, up to 768,000 cores on Mira at Argonne National Laboratory.…”
Section: Description of the Simulation, 2.1 DNS Solver Overview (mentioning)
confidence: 94%
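
The level set approach quoted above represents the fluid-fluid interface implicitly as the zero contour of a signed distance field that is advected with the flow. A minimal 1D sketch of that idea (illustrative only, not PHASTA code; the grid size, velocity, and first-order upwind scheme are all assumptions):

```python
import numpy as np

# Level set sketch: the interface between two fluids is the zero crossing
# of the signed distance field phi; advecting phi with the flow velocity
# moves the interface without tracking it explicitly.
nx = 200
dx = 1.0 / nx
x = np.linspace(0.0, 1.0, nx, endpoint=False)
u = 1.0                    # constant advection velocity (assumed)
dt = 0.5 * dx / abs(u)     # CFL-limited time step

phi = x - 0.3              # signed distance; interface starts at x = 0.3

for _ in range(100):
    # first-order upwind update of  d(phi)/dt + u * d(phi)/dx = 0  (u > 0)
    dphi = np.empty_like(phi)
    dphi[1:] = (phi[1:] - phi[:-1]) / dx
    dphi[0] = dphi[1]      # crude inflow extrapolation
    phi -= dt * u * dphi

# recover the interface position by linear interpolation of the zero crossing
k = np.where(np.diff(np.sign(phi)) > 0)[0][0]
x_if = x[k] - phi[k] * dx / (phi[k + 1] - phi[k])
print(f"interface near x = {x_if:.3f}")   # ~0.55 after advecting a distance 0.25
```

Production two-phase solvers add reinitialization to keep phi a signed distance and couple the field to discontinuous material properties; none of that is shown here.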
“…Since little computation is performed during mesh adaptation relative to the substantial increase in communication required as the given mesh is distributed to more processors, the scaling decreases at high core counts (note that a strong scaling study is performed and, therefore, the problem size is fixed). However, the analyses have been shown to scale strongly when a similar amount of work per processor is provided [21,41]. The fact that the mesh modification routines are able to scale to larger core counts, with more entities involved in communication, supports the statement that they would at a minimum provide equivalent scaling with more work on the same number of parts.…”
Section: Parallel Anisotropic Boundary Layer Adaptivity on a Heat Tra… (mentioning)
confidence: 63%
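
The distinction drawn in this statement is between strong scaling (fixed total problem size spread over more cores) and keeping the work per core constant. A small bookkeeping sketch of how strong-scaling efficiency is computed from wall-clock timings (the timings below are invented for illustration; real numbers come from runs like those reported in [21,41]):

```python
# Strong scaling: the problem size is fixed while the core count grows,
# so communication overhead eventually erodes parallel efficiency.
timings = {4096: 120.0, 8192: 61.5, 16384: 33.0, 32768: 19.8}  # cores -> seconds (hypothetical)

base_cores = min(timings)
base_time = timings[base_cores]
for cores in sorted(timings):
    speedup = base_time / timings[cores]
    efficiency = speedup * base_cores / cores
    print(f"{cores:6d} cores: speedup {speedup:5.2f}, efficiency {efficiency:6.1%}")
```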
“…Such meshes can only be solved using massively parallel computers [19-22]. To execute such simulations effectively, the mesh adaptation procedures must operate in parallel on the same computer as the flow solution, using the same form of parallel decomposition, which is commonly represented as a partitioned mesh [20,21,23].…”
Section: Introduction (mentioning)
confidence: 99%
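
A partitioned mesh in this sense assigns each element to one part (one processor), and inter-processor communication arises at the mesh entities shared by elements on different parts. A toy sketch of that bookkeeping follows (a tiny hand-built 1D mesh and a naive block partition; production codes use graph partitioners such as ParMETIS or Zoltan):

```python
from collections import defaultdict

# Toy 1D mesh: 6 elements, 7 nodes; element e connects nodes (e, e+1).
elements = [(e, e + 1) for e in range(6)]
n_parts = 2
part_of = [e * n_parts // len(elements) for e in range(len(elements))]  # naive block split

# A node touched by elements from more than one part lies on the part
# boundary and must be communicated between processors.
node_parts = defaultdict(set)
for e, nodes in enumerate(elements):
    for n in nodes:
        node_parts[n].add(part_of[e])

shared = sorted(n for n, parts in node_parts.items() if len(parts) > 1)
print("nodes on inter-part boundaries:", shared)   # -> [3]
```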
“…PHASTA [14] is a parallel, hierarchic, high-order accurate, adaptive, stabilized finite element solver for transient analysis of incompressible and compressible flows, using advanced anisotropic adaptive algorithms and numerical models of flow physics. Its highly scalable performance on massively parallel computers has already been demonstrated on a variety of supercomputers, such as Cray and IBM Blue Gene systems [15,16].…”
Section: Description of Our Approach (mentioning)
confidence: 97%
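
The "stabilized" in this description refers to adding a consistent term that suppresses the node-to-node oscillations the plain Galerkin method produces in advection-dominated flow. A minimal 1D SUPG example of that idea (a scalar model problem, not PHASTA's actual incompressible-flow formulation):

```python
import numpy as np

# SUPG-stabilized linear finite elements for the steady model problem
#   u * phi' = kappa * phi''  on (0, 1),  phi(0) = 0, phi(1) = 1.
n_el, u, kappa = 20, 1.0, 1e-3
h = 1.0 / n_el
Pe = u * h / (2.0 * kappa)                             # element Peclet number
tau = h / (2.0 * u) * (1.0 / np.tanh(Pe) - 1.0 / Pe)   # classic SUPG parameter

# 2x2 element matrices for linear elements: advection, diffusion, and the
# SUPG term (streamline diffusion scaled by tau).
adv = u / 2.0 * np.array([[-1.0, 1.0], [-1.0, 1.0]])
dif = kappa / h * np.array([[1.0, -1.0], [-1.0, 1.0]])
supg = tau * u * u / h * np.array([[1.0, -1.0], [-1.0, 1.0]])

K = np.zeros((n_el + 1, n_el + 1))
for e in range(n_el):
    K[e:e + 2, e:e + 2] += adv + dif + supg

# Dirichlet boundary conditions
f = np.zeros(n_el + 1)
K[0, :], K[0, 0] = 0.0, 1.0                  # phi(0) = 0
K[-1, :], K[-1, -1], f[-1] = 0.0, 1.0, 1.0   # phi(1) = 1

phi = np.linalg.solve(K, f)
print(np.round(phi, 4))   # monotone boundary-layer profile, no oscillations
```

Dropping the `supg` matrix from the assembly reproduces the oscillatory Galerkin solution at this element Peclet number, which is exactly the failure mode stabilization exists to fix.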