2020
DOI: 10.1111/cgf.14097
Latent Space Subdivision: Stable and Controllable Time Predictions for Fluid Flow

Abstract: We propose an end‐to‐end trained neural network architecture to robustly predict the complex dynamics of fluid flows with high temporal stability. We focus on single‐phase smoke simulations in 2D and 3D based on the incompressible Navier‐Stokes (NS) equations, which are relevant for a wide range of practical problems. To achieve stable predictions for long‐term flow sequences with linear execution times, a convolutional neural network (CNN) is trained for spatial compression in combination with a temporal pred…


Cited by 28 publications (34 citation statements); references 24 publications.
“…This work is complementary to the work presented in a companion paper that studied enforcing statistical constraints [36]. While precise constraints have previously been used as regularization in various fields (e.g., natural language processing [37], lake temperature modeling [38,39], general dynamical systems [25], fluid flow simulations [9,11,12,40], and more specifically in turbulent flow simulations and generation [41][42][43]), the effects, performances, and best practices of imposing imprecise constraints in generative models still need further investigations.…”
Section: Scope and Contributions of Present Work (citation type: mentioning; confidence: 87%)
“…In recent years, machine learning has been widely adopted in scientific applications, leading to an emerging field referred to as scientific machine learning. Example scientific applications of machine learning include augmenting or constructing data-driven turbulence models [6][7][8], generating realistic animations of flows [9][10][11][12], and discovering or solving differential equations [13][14][15][16][17][18][19].…”
Section: Physical Applications of GANs: Progress and Challenges (citation type: mentioning; confidence: 99%)
“…This leads to better accuracy. In (Wiewel et al., 2018; 2020), the authors were mainly interested in computational speed‐up and robust long‐term predictions for fluid flow simulations. These works thus demonstrated the capability of data‐driven approaches for modeling fluid dynamics: they implemented a CAE for spatial compression and stacked long short‐term memory (LSTM) layers to define their surrogate network, that is, a network that performs time propagation.…”
Section: Introduction (citation type: mentioning; confidence: 99%)
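The encode–propagate–decode pipeline described in the excerpt above (spatial compression with an autoencoder, time stepping in the reduced latent space) can be illustrated with a minimal numpy sketch. This is not the cited authors' implementation: the linear `W_enc`/`W_dec` matrices and the latent matrix `A` are toy stand-ins for the trained convolutional autoencoder and stacked LSTM layers.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins (assumption: the cited works use a trained CAE and LSTM,
# not fixed matrices; only the pipeline structure is illustrated here).
GRID, LATENT = 64, 8
W_enc = rng.standard_normal((LATENT, GRID)) / np.sqrt(GRID)  # "encoder"
W_dec = W_enc.T                                              # "decoder"
A = 0.9 * np.eye(LATENT)                                     # latent time stepper

def encode(field):
    """Spatial compression of a full-resolution state."""
    return W_enc @ field

def decode(z):
    """Map a latent state back to simulation space."""
    return W_dec @ z

def rollout(field0, steps):
    """Propagate entirely in latent space: one cheap latent update per
    step instead of a full-resolution solve, so cost is linear in steps."""
    z = encode(field0)
    frames = []
    for _ in range(steps):
        z = A @ z                 # surrogate time propagation
        frames.append(decode(z))  # decode only when a frame is needed
    return frames

frames = rollout(rng.standard_normal(GRID), steps=20)
```

The key design point mirrored here is that the expensive operation (full-resolution work) happens only at encode/decode time, while the per-step cost is a small latent update.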
“…A first ingredient of the approach we introduce in the present paper is to replace the (supposedly expensive) time integration of the model with an NN surrogate. Time‐stepping methods based on surrogates have already been explored by several authors (Lu et al., 2007), (Wiewel et al., 2018; 2020), (Maulik et al., 2021), (Vlachas et al., 2018), (Brajard et al., 2020), (Pawar and San, 2020). A key question that arises immediately is how to ensure the time stability of the resulting scheme (see (Haber and Ruthotto, 2017), (Haber et al., 2019) for an investigation within deep learning surrogate models) when the model is repeatedly called to propagate the state over several time steps.…”
Section: Introduction (citation type: mentioning; confidence: 99%)
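The stability question raised in the excerpt above, whether a surrogate stays bounded when called repeatedly over many time steps, is easiest to see in the linear case, where the spectral radius of the transition operator decides everything. The sketch below is illustrative only and does not reproduce any of the cited schemes.

```python
import numpy as np

def rollout_norms(A, z0, steps):
    """Norms of the state under repeated application of a linear
    surrogate z_{t+1} = A z_t; a simple proxy for long-horizon stability."""
    z, norms = z0, []
    for _ in range(steps):
        z = A @ z
        norms.append(np.linalg.norm(z))
    return norms

rng = np.random.default_rng(1)
z0 = rng.standard_normal(4)

stable   = rollout_norms(0.95 * np.eye(4), z0, 100)  # spectral radius < 1
unstable = rollout_norms(1.05 * np.eye(4), z0, 100)  # spectral radius > 1
```

With spectral radius below 1 the rollout decays toward zero; above 1 it blows up, which is why small per-step errors in a learned surrogate can compound catastrophically over long sequences.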