2019
DOI: 10.1111/cgf.13620

Latent Space Physics: Towards Learning the Temporal Evolution of Fluid Flow

Abstract: We propose a method for the data‐driven inference of temporal evolutions of physical functions with deep learning. More specifically, we target fluid flow problems, and we propose a novel LSTM‐based approach to predict the changes of the pressure field over time. The central challenge in this context is the high dimensionality of Eulerian space‐time data sets. We demonstrate for the first time that dense 3D+time functions of a physics system can be predicted within the latent spaces of neural networks, and we ar…
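The abstract outlines a two-stage idea: compress high-dimensional pressure fields into a low-dimensional latent code, then predict the temporal evolution of that code with an LSTM. The snippet below is a minimal PyTorch sketch of that setup under assumed dimensions (64³ volumes, a 1024-dimensional latent space); the layer sizes, the class names PressureAutoencoder and LatentLSTM, and the toy usage are illustrative assumptions, not the authors' architecture.

```python
# Sketch of the latent-space idea from the abstract: an autoencoder compresses
# 3D pressure fields, an LSTM predicts how the latent code evolves over time.
# All shapes and names are illustrative assumptions, not the paper's exact model.
import torch
import torch.nn as nn

class PressureAutoencoder(nn.Module):
    def __init__(self, latent_dim=1024):
        super().__init__()
        # Encoder: 3D pressure volume (1 x 64 x 64 x 64) -> latent vector
        self.encoder = nn.Sequential(
            nn.Conv3d(1, 16, 4, stride=2, padding=1), nn.ReLU(),   # 32^3
            nn.Conv3d(16, 32, 4, stride=2, padding=1), nn.ReLU(),  # 16^3
            nn.Conv3d(32, 64, 4, stride=2, padding=1), nn.ReLU(),  # 8^3
            nn.Flatten(),
            nn.Linear(64 * 8 * 8 * 8, latent_dim),
        )
        # Decoder: latent vector -> reconstructed pressure volume
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 64 * 8 * 8 * 8), nn.ReLU(),
            nn.Unflatten(1, (64, 8, 8, 8)),
            nn.ConvTranspose3d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose3d(32, 16, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose3d(16, 1, 4, stride=2, padding=1),
        )

    def forward(self, x):
        z = self.encoder(x)
        return self.decoder(z), z

class LatentLSTM(nn.Module):
    """Predicts the next latent code from a window of previous ones."""
    def __init__(self, latent_dim=1024, hidden=512):
        super().__init__()
        self.lstm = nn.LSTM(latent_dim, hidden, batch_first=True)
        self.head = nn.Linear(hidden, latent_dim)

    def forward(self, z_seq):                  # z_seq: (batch, steps, latent_dim)
        out, _ = self.lstm(z_seq)
        return self.head(out[:, -1])           # next latent code

# Toy usage: a window of 8 consecutive 64^3 pressure frames.
ae, lstm = PressureAutoencoder(), LatentLSTM()
frames = torch.randn(8, 1, 64, 64, 64)
with torch.no_grad():
    z = ae.encoder(frames).unsqueeze(0)        # (1, 8, latent_dim)
    z_next = lstm(z)                           # predicted next latent state
    p_next = ae.decoder(z_next)                # decoded next pressure field
```

Predicting in the latent space rather than on the full Eulerian grid is what keeps the recurrent model tractable for dense 3D+time data.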

Cited by 204 publications (174 citation statements)
References 51 publications
“…Complete replacement of equations with ML algorithms for gaining computational efficiency when generalization is not a necessity (e.g., we refer to [318]–[325] and references cited therein for speeding up numerical solvers for real-time applications) can be used in the context of a digital twin when the same process is to be monitored time and again. However, this approach will fail in unexpected situations because the model only learns to interpolate and not extrapolate.…”
Section: Nonintrusive Data-driven Modeling
mentioning
confidence: 99%
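As a concrete illustration of the quoted interpolation-versus-extrapolation caveat, the short sketch below (an assumption for illustration, not code from the cited works) fits a small scikit-learn regressor to sin(x) on [0, 2π] and then queries it outside that interval, where the error grows sharply.

```python
# Illustrative only: a data-driven surrogate performs well inside its training
# range (interpolation) but degrades once asked to extrapolate beyond it.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
x_train = rng.uniform(0.0, 2 * np.pi, size=(2000, 1))
y_train = np.sin(x_train).ravel()

model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=5000, random_state=0)
model.fit(x_train, y_train)

x_in = np.array([[1.0], [3.0]])        # inside the training interval
x_out = np.array([[8.0], [10.0]])      # outside the training interval
print("interpolation error:", np.abs(model.predict(x_in) - np.sin(x_in).ravel()))
print("extrapolation error:", np.abs(model.predict(x_out) - np.sin(x_out).ravel()))
```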
“…Given sufficient amounts of training data, such models, or regression neural networks, are capable of learning complex, non-linear mappings from a high-dimensional feature vector to a desired output. This property makes these models useful to solve novel kinds of problems also in fields such as physics and materials science [7][8][9][10][11][12][13], and related activities have very recently gained significant momentum [14].…”
mentioning
confidence: 99%
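A minimal example of the regression setting described in this excerpt: a feed-forward network mapping a high-dimensional feature vector to a scalar target. The data, layer widths, and training loop below are generic assumptions for illustration, not code from the cited works.

```python
# Generic regression neural network: high-dimensional features -> scalar output.
import torch
import torch.nn as nn

features, targets = torch.randn(512, 256), torch.randn(512, 1)  # toy data

model = nn.Sequential(
    nn.Linear(256, 128), nn.ReLU(),
    nn.Linear(128, 64), nn.ReLU(),
    nn.Linear(64, 1),                 # desired output (e.g. a material property)
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(200):              # plain supervised regression loop
    optimizer.zero_grad()
    loss = loss_fn(model(features), targets)
    loss.backward()
    optimizer.step()
```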
“…LSTM is a variant of recurrent neural networks capable of learning and predicting the temporal dependencies within a given data sequence based on the input information and previously acquired information. Recurrent neural networks have been used successfully in the ROM community to enhance standard projection ROMs [89] and build fully non-intrusive ROMs [90][91][92][93][94][95]. In the present study, we use LSTMs to augment the standard physics-informed ROM by introducing closure as well as super-resolution data-driven models.…”
Section: Long Short-term Memory Embedding
mentioning
confidence: 99%
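The closure idea mentioned in this excerpt can be sketched as an LSTM that observes a short history of reduced-order modal coefficients and returns a correction term for the ROM's right-hand side. Everything below (the mode count, the placeholder rom_rhs, the explicit Euler coupling) is an illustrative assumption rather than the cited studies' formulation.

```python
# Hedged sketch of an LSTM closure for a projection ROM: the network maps a
# history of modal coefficients to a correction added to the ROM dynamics.
import torch
import torch.nn as nn

class LSTMClosure(nn.Module):
    def __init__(self, n_modes=8, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(n_modes, hidden, batch_first=True)
        self.out = nn.Linear(hidden, n_modes)

    def forward(self, a_hist):            # a_hist: (batch, steps, n_modes)
        h, _ = self.lstm(a_hist)
        return self.out(h[:, -1])         # closure term for the latest state

def rom_rhs(a):
    """Placeholder Galerkin ROM right-hand side (assumed quadratic form)."""
    return -0.1 * a - 0.01 * a * a.roll(1, dims=-1)

closure = LSTMClosure()
a_hist = torch.randn(1, 10, 8)            # last 10 states of 8 modal coefficients
a_now = a_hist[:, -1]
dt = 1e-2
# One explicit Euler step of the augmented ROM: physics term plus learned closure.
a_next = a_now + dt * (rom_rhs(a_now) + closure(a_hist))
```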