2020
DOI: 10.1063/5.0020526

Deep neural networks for nonlinear model order reduction of unsteady flows

Abstract: Unsteady fluid systems are nonlinear high-dimensional dynamical systems that may exhibit multiple complex phenomena in both time and space. Reduced Order Modeling (ROM) of fluid flows has been an active research topic over the past decade, with the primary goal of decomposing complex flows into a set of features most important for future state prediction and control, typically using a dimensionality reduction technique. In this work, a novel data-driven technique based on the power of deep neural networks for ROM…

Cited by 127 publications (51 citation statements); references 60 publications.

“…However, applying sequence models to predict high-dimensional systems remains a challenge due to their high memory overhead. Dimensionality reduction techniques, such as CNN autoencoders [33,32,26,22,29,16,11,27], POD [44,48,5,31,18,8,47,10], or Koopman operators [24,9,14], can be used to construct a low-dimensional latent space. The auto-regressive sequence model then operates on these linear (POD modes) or nonlinear (CNNs) latents.…”
Section: Related Work (mentioning)
confidence: 99%
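
The POD route this excerpt mentions can be sketched in a few lines of numpy: the snapshot matrix is factored by SVD, and the leading left singular vectors give the linear latent basis on which an auto-regressive sequence model would then be trained. This is a minimal sketch; the matrix shapes, the rank r, and the random snapshot data are illustrative assumptions, not values from the cited works.

```python
import numpy as np

def pod_basis(snapshots, r):
    """Leading-r POD modes of a snapshot matrix (one flow snapshot per column).

    Returns (modes, coeffs): modes (n_dof, r) and latent trajectories (r, n_time).
    """
    mean = snapshots.mean(axis=1, keepdims=True)
    U, s, Vt = np.linalg.svd(snapshots - mean, full_matrices=False)
    modes = U[:, :r]                       # linear latent basis
    coeffs = modes.T @ (snapshots - mean)  # low-dimensional trajectories
    return modes, coeffs

X = np.random.rand(10_000, 200)  # hypothetical snapshot matrix (n_dof, n_time)
modes, a = pod_basis(X, r=16)    # a sequence model would be trained on a, not X
```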
“…Hence, we turn our attention to recurrent neural networks that have been shown to be successful for a variety of Natural Language Processing tasks. Recurrent architectures such as LSTM [60] provide a back-propagation mechanism acting through a sequence of inputs (such as time steps of a simulation), thereby allowing the network to learn temporal dynamics [34,61].…”
Section: B. Long Short-Term Memory (LSTM) (mentioning)
confidence: 99%
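
As a rough illustration of this idea, the sketch below (assuming PyTorch, with illustrative layer sizes) fits an LSTM to next-step prediction on latent trajectories; calling backward() propagates gradients through the whole input sequence, which is the back-propagation mechanism acting through a sequence of time steps that the excerpt refers to.

```python
import torch
import torch.nn as nn

class LatentLSTM(nn.Module):
    """Recurrent model advancing latent ROM coordinates one step at a time."""
    def __init__(self, latent_dim=16, hidden_dim=64):
        super().__init__()
        self.lstm = nn.LSTM(latent_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, latent_dim)

    def forward(self, seq):
        out, _ = self.lstm(seq)  # seq: (batch, time, latent_dim)
        return self.head(out)    # predicted latents at the next time step

model = LatentLSTM()
seq = torch.randn(8, 32, 16)  # hypothetical latent sequences
loss = nn.functional.mse_loss(model(seq[:, :-1]), seq[:, 1:])
loss.backward()               # back-propagation through time
```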
“…Alternatively, nonlinear dimension reduction techniques such as kernel POD [14] or deep learning-based approaches like autoencoders [15,16] have also been used for extracting a reduced basis. Combining autoencoder-generated bases with various specialized machine learning algorithms for time series modeling results in fully non-intrusive reduced order models [17,18,19]. Hybrid methods [20,21] can also be obtained by combining a nonlinear manifold learning technique like an autoencoder for discovering the latent space with an intrusive method for the temporal dynamics.…”
Section: Introduction (mentioning)
confidence: 99%
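
The autoencoder route for extracting a nonlinear reduced basis can be sketched as below, assuming PyTorch; the architecture, the 64x64 single-channel field resolution, and latent_dim=16 are assumptions for illustration, not the cited papers' designs. The encoder output z plays the role of the latent coordinates that a separate time-series model (non-intrusive) or an intrusive projection scheme (hybrid) would then advance in time.

```python
import torch
import torch.nn as nn

class ConvAutoencoder(nn.Module):
    """Nonlinear reduced basis: encode flow snapshots to a small latent vector."""
    def __init__(self, latent_dim=16):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 8, 3, stride=2, padding=1), nn.ReLU(),   # 64 -> 32
            nn.Conv2d(8, 16, 3, stride=2, padding=1), nn.ReLU(),  # 32 -> 16
            nn.Flatten(),
            nn.Linear(16 * 16 * 16, latent_dim),
        )
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 16 * 16 * 16), nn.ReLU(),
            nn.Unflatten(1, (16, 16, 16)),
            nn.ConvTranspose2d(16, 8, 4, stride=2, padding=1), nn.ReLU(),  # 16 -> 32
            nn.ConvTranspose2d(8, 1, 4, stride=2, padding=1),              # 32 -> 64
        )

    def forward(self, x):
        z = self.encoder(x)          # nonlinear latent coordinates
        return self.decoder(z), z

model = ConvAutoencoder()
snap = torch.randn(4, 1, 64, 64)     # hypothetical flow snapshots
recon, z = model(snap)
loss = nn.functional.mse_loss(recon, snap)  # reconstruction objective
```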