2021
DOI: 10.3390/en14165096

Generalization Capability of Convolutional Neural Networks for Progress Variable Variance and Reaction Rate Subgrid-Scale Modeling

Abstract: Deep learning has recently emerged as a successful approach to produce accurate subgrid-scale (SGS) models for Large Eddy Simulations (LES) in combustion. However, the ability of these models to generalize to configurations far from their training distribution is still mainly unexplored, thus impeding their application to practical configurations. In this work, a convolutional neural network (CNN) model for the progress-variable SGS variance field is trained on a canonical premixed turbulent flame and evaluate…
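The CNN described in the abstract is trained a priori against the exact SGS variance of the progress variable, obtained by filtering DNS data. A minimal sketch of how that regression target is computed, assuming a top-hat filter on a periodic field (the filter choice and function names are illustrative, not taken from the paper):

```python
import numpy as np
from scipy.ndimage import uniform_filter

def sgs_variance(c, width):
    """Exact SGS variance of the progress variable c at filter width `width`.

    Var_sgs(c) = bar(c^2) - bar(c)^2, where bar(.) is the LES filter.
    This is the quantity a CNN SGS model is trained to predict in
    a-priori studies of this kind.
    """
    # Top-hat (box) LES filter; periodic boundaries assumed via mode="wrap".
    c_bar = uniform_filter(c, size=width, mode="wrap")
    c2_bar = uniform_filter(c * c, size=width, mode="wrap")
    return c2_bar - c_bar**2

# Illustrative usage on a synthetic 3D field:
c = np.random.default_rng(0).random((16, 16, 16))
var_c = sgs_variance(c, width=4)
```

Because the box filter uses positive weights summing to one, the resulting variance field is non-negative (up to floating-point error), which is a useful sanity check on any filtered-DNS pipeline.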

Cited by 7 publications (2 citation statements)
References 54 publications
“…Among the four configurations, two different ratios are essentially conserved: ∆/η ≈ 8 and ∆/δL ≈ 1.45. These might therefore represent ratios fundamental to achieving extrapolation capabilities, as was previously found when predicting the progress variable variance [19]. The relative importance of these two parameters on the predictive capability is then analyzed in the following sections.…”
Section: Generalization At Different Reynolds Numbers
confidence: 97%
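The two conserved quantities in the excerpt above are nondimensional ratios of the filter width ∆ to the Kolmogorov scale η and the laminar flame thickness δL. A trivial sketch of how they are formed; the absolute values below are hypothetical, chosen only so the ratios match those quoted:

```python
# All values are illustrative placeholders, not data from the cited study;
# only the ratios Delta/eta and Delta/delta_L matter for generalization.
delta_les = 8.0e-4   # LES filter width, Delta [m] (assumed)
eta       = 1.0e-4   # Kolmogorov length scale, eta [m] (assumed)
delta_l   = 5.5e-4   # laminar flame thickness, delta_L [m] (assumed)

ratio_eta   = delta_les / eta      # Delta/eta   ~ 8
ratio_flame = delta_les / delta_l  # Delta/delta_L ~ 1.45
```

Matching these ratios between training and target configurations, rather than the dimensional scales themselves, is what the citing work identifies as key to extrapolation.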
“…This might cause convergence issues in a-posteriori applications, as suggested by Lapeyre et al. [17]. Attili et al. [18] and Xing et al. [19], using similar architectures, performed preliminary a-priori generalization studies. Their work demonstrated that CNNs can be trained on canonical, simple cases and applied to practical configurations, indicating good extrapolation capabilities.…”
Section: Introduction
confidence: 99%