2020
DOI: 10.48550/arxiv.2011.10277
Preprint

Model order reduction with neural networks: Application to laminar and turbulent flows

Kai Fukami, Kazuto Hasegawa, Taichi Nakamura, et al.

Abstract: We investigate the capability of neural network-based model order reduction, i.e., the autoencoder (AE), for fluid flows. As an example model, an AE which comprises a convolutional neural network and multi-layer perceptrons is considered in this study. The AE model is assessed with four canonical fluid flows, namely: (1) two-dimensional cylinder wake, (2) its transient process, (3) NOAA sea surface temperature, and (4) y–z sectional field of turbulent channel flow, in terms of a number of latent modes, a choi…
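The abstract specifies the building blocks (a convolutional network plus multi-layer perceptrons) but not their sizes. A minimal PyTorch sketch of such a CNN-plus-MLP autoencoder, with every layer dimension chosen arbitrarily for illustration rather than taken from the paper, could look like this:

```python
# Hypothetical sketch of a CNN + MLP autoencoder for 2D flow snapshots.
# All layer sizes are illustrative assumptions, not the paper's architecture.
import torch
import torch.nn as nn

class FlowAutoencoder(nn.Module):
    def __init__(self, n_latent: int = 2):
        super().__init__()
        # Encoder: convolutions compress a 1-channel 64x64 snapshot...
        self.conv_enc = nn.Sequential(
            nn.Conv2d(1, 8, 3, stride=2, padding=1), nn.ReLU(),   # 64 -> 32
            nn.Conv2d(8, 16, 3, stride=2, padding=1), nn.ReLU(),  # 32 -> 16
        )
        # ...then multi-layer perceptrons map the features to the latent modes.
        self.mlp_enc = nn.Sequential(nn.Linear(16 * 16 * 16, 128), nn.ReLU(),
                                     nn.Linear(128, n_latent))
        # Decoder mirrors the encoder.
        self.mlp_dec = nn.Sequential(nn.Linear(n_latent, 128), nn.ReLU(),
                                     nn.Linear(128, 16 * 16 * 16), nn.ReLU())
        self.conv_dec = nn.Sequential(
            nn.ConvTranspose2d(16, 8, 4, stride=2, padding=1), nn.ReLU(),  # 16 -> 32
            nn.ConvTranspose2d(8, 1, 4, stride=2, padding=1),              # 32 -> 64
        )

    def forward(self, x):
        z = self.mlp_enc(self.conv_enc(x).flatten(1))   # latent representation
        h = self.mlp_dec(z).view(-1, 16, 16, 16)
        return self.conv_dec(h), z

model = FlowAutoencoder(n_latent=2)
snapshot = torch.randn(4, 1, 64, 64)            # batch of 4 flow-field snapshots
recon, latent = model(snapshot)
loss = nn.functional.mse_loss(recon, snapshot)  # reconstruction objective
```

Here the bottleneck width n_latent stands in for the "number of latent modes" the abstract says is varied; training minimizes the reconstruction error between input and output snapshots.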



Cited by 5 publications (9 citation statements)
References 67 publications
“…One of them is an application to turbulence, although this is extremely challenging as compared to the present demonstration with the square cylinder wake. To that end, we may be able to capitalize on the idea of autoencoder-based low dimensionalization [49], in addition to the present adaptive sampling. But we should caution that there may be a trade-off relationship between the compressibility and explainability of low-dimensionalized representation.…”
Section: Discussion
confidence: 99%
“…The filters, trainable parameters inside the CNN, are able to handle high-dimensional data efficiently and extract key features. Thanks to its unique capability in handling high-dimensional data, the use of CNNs has also spread in the fluid dynamics field in recent years [33,34,35,36,37,38,39,40,41,42].…”
Section: Convolutional Neural Network
confidence: 99%
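As a toy illustration of the feature-extraction claim (not code from any of the cited works): a single 3×3 filter slides across a 2D field, so its parameter count is fixed at nine regardless of the field's resolution, which is what makes convolutions efficient on high-dimensional snapshots. The Sobel-like kernel below is hand-picked; in a CNN its entries would be learned from data.

```python
import torch
import torch.nn.functional as F

# Toy illustration: one 3x3 convolution filter applied to a 2D scalar field.
field = torch.randn(1, 1, 128, 128)            # (batch, channel, H, W) snapshot
kernel = torch.tensor([[[[-1., -2., -1.],
                         [ 0.,  0.,  0.],
                         [ 1.,  2.,  1.]]]])   # hand-picked vertical-gradient detector
features = F.conv2d(field, kernel, padding=1)  # same-size feature map
print(features.shape)  # torch.Size([1, 1, 128, 128]); 9 parameters regardless of H, W
```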
“…By renormalizing the initial graph to include self-loops (i.e. adding identity to the adjacency and degree matrices), they proposed the graph convolutional network (GCN), which obeys the propagation rule (5): x_{ℓ+1} = σ(P x_ℓ W_ℓ), where x_ℓ ∈ ℝ^{|V|×n_ℓ} is a signal on the graph with n_ℓ channels, W_ℓ ∈ ℝ^{n_ℓ×n_{ℓ+1}} is a (potentially nonsquare) weight matrix containing the learnable parameters and…”
Section: A Graph Convolutional Autoencoder for ROM
confidence: 99%
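Read literally, the quoted rule is one dense propagation step. A NumPy sketch for a toy four-node graph, assuming the standard symmetric normalization P = D̂^{-1/2} Â D̂^{-1/2} for the propagation matrix (the quote only names P) and tanh for σ:

```python
import numpy as np

# One GCN layer implementing x_{l+1} = sigma(P x_l W_l) on a toy 4-node graph.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 1],
              [0, 1, 0, 1],
              [0, 1, 1, 0]], dtype=float)       # adjacency matrix
A_hat = A + np.eye(4)                            # renormalization: add self-loops
d_hat = A_hat.sum(axis=1)                        # degrees including self-loops
D_inv_sqrt = np.diag(1.0 / np.sqrt(d_hat))
P = D_inv_sqrt @ A_hat @ D_inv_sqrt              # normalized propagation matrix

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 3))                  # signal: |V| = 4 nodes, n_l = 3 channels
W = rng.standard_normal((3, 5))                  # learnable weights, n_l x n_{l+1}
x_next = np.tanh(P @ x @ W)                      # sigma chosen as tanh for illustration
print(x_next.shape)                              # (4, 5): |V| x n_{l+1}
```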
“…The work of Lee and Carlberg in [4] demonstrated that deep convolutional autoencoders (CAEs) can overcome the Kolmogorov width barrier for advection-dominated systems which limits linear ROM methods, leading to a variety of similar ROMs based on this architecture, seen in e.g. [5,6,7,8]. Moreover, works such as [6,7] have experimented with entirely data-driven ROMs based on deep CAEs, and seen success using either fully connected or recurrent long short-term memory networks to simulate the reduced low-dimensional dynamics.…”
Section: Introduction
confidence: 99%
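The "entirely data-driven" pattern described here has two stages: compress snapshots with a CAE, then learn a surrogate that advances the latent state in time. A hedged sketch of the second stage, with a small fully connected time-stepper standing in for either option and all names and sizes invented for illustration:

```python
import torch
import torch.nn as nn

# Sketch of a data-driven latent time-stepper: given latent codes z_t produced
# by a trained CAE encoder, learn z_{t+1} ~ f(z_t) with a small MLP.
# (An LSTM over latent sequences is the recurrent alternative mentioned above.)
n_latent = 8
stepper = nn.Sequential(nn.Linear(n_latent, 64), nn.Tanh(),
                        nn.Linear(64, n_latent))

z = torch.randn(500, n_latent)        # assumed latent trajectory from the encoder
z_in, z_out = z[:-1], z[1:]           # one-step-ahead training pairs
opt = torch.optim.Adam(stepper.parameters(), lr=1e-3)
for _ in range(100):                  # illustrative training loop
    opt.zero_grad()
    loss = nn.functional.mse_loss(stepper(z_in), z_out)
    loss.backward()
    opt.step()

# Rollout: advance the reduced dynamics without the full-order model; each
# state would then be decoded back to a flow field with the CAE decoder.
with torch.no_grad():
    state = z[:1]
    rollout = [state]
    for _ in range(50):
        state = stepper(state)
        rollout.append(state)
```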