2021
DOI: 10.4208/cicp.oa-2020-0106
Enforcing Imprecise Constraints on Generative Adversarial Networks for Emulating Physical Systems

Abstract: Generative adversarial networks (GANs) were initially proposed to generate images by learning from a large number of samples. Recently, GANs have been used to emulate complex physical systems such as turbulent flows. However, a critical question must be answered before GANs can be considered trusted emulators for physical systems: do GAN-generated samples conform to the various physical constraints? These include both deterministic constraints (e.g., conservation laws) and statistical constraints (e.g., energ…

Cited by 7 publications
(2 citation statements)
References 51 publications
“…As a result, introducing physical constraints into the generator loss function provides an alternative way to address the vanishing-gradient problem in GAN training. Moreover, adding a physics penalty gives the generator an extra advantage that accelerates its convergence and helps the discriminator reach equilibrium [149]. Figure 10 shows the block diagram of the PI-GAN.…”
Section: Resolving Data Insufficiency or Class-Imbalance Problem
confidence: 99%
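The statement above describes augmenting a GAN generator's loss with a physics penalty. A minimal sketch of that idea is below, assuming an illustrative conservation-style constraint (each generated sample sums to zero) and an arbitrary penalty weight `lam`; neither the constraint nor the weight comes from the cited paper.

```python
import numpy as np

def physics_penalty(samples):
    """Mean squared violation of a hypothetical conservation constraint:
    each generated sample (row) should sum to zero."""
    totals = samples.sum(axis=1)
    return float(np.mean(totals ** 2))

def generator_loss(disc_scores, samples, lam=10.0):
    """Non-saturating adversarial loss plus a weighted physics penalty.

    disc_scores: discriminator outputs in (0, 1) for the generated samples.
    samples:     generated samples, shape (batch, features).
    lam:         illustrative penalty weight (an assumption, not tuned).
    """
    adversarial = float(np.mean(-np.log(disc_scores + 1e-8)))
    return adversarial + lam * physics_penalty(samples)

# Toy usage: samples that satisfy the constraint incur (almost) no penalty,
# so the loss reduces to the adversarial term.
rng = np.random.default_rng(0)
fake = rng.normal(size=(4, 8))
fake -= fake.mean(axis=1, keepdims=True)   # enforce zero-sum exactly
scores = np.full(4, 0.5)
print(round(generator_loss(scores, fake), 4))  # → 0.6931 (just -log 0.5)
```

In training, the penalty term adds a gradient signal to the generator that does not vanish when the discriminator saturates, which is the mechanism the statement credits for faster convergence.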
“…Our approach leverages known equations for computing a coarse-grid prior, which is complementary to using known equations as soft [109,74,142,137,144,139] or hard constraints [50,89,8,39,7,61], as these methods can still be used to constrain the learned parametrization. In terms of symmetry, our approach exploits translational equivariance via Fourier transformations [76], but can be extended to other frameworks that exploit in- or equivariance of PDEs [95] to rotational [44,124], Galilean [136,105], scale [9], translational [123], reflectional [37] or permutational [145] transformations.…”
Section: Related Work
confidence: 99%
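The statement above invokes translational equivariance of Fourier-based layers [76]. A small sketch of that property, under the assumption of a pointwise spectral filter (a spectral convolution) with arbitrary illustrative weights: shifting the input circularly and then applying the layer gives the same result as applying the layer and then shifting the output.

```python
import numpy as np

def spectral_layer(x, weights):
    """Multiply the signal's Fourier modes by per-mode weights (a
    spectral convolution), then transform back to physical space."""
    return np.fft.ifft(np.fft.fft(x) * weights).real

rng = np.random.default_rng(1)
x = rng.normal(size=32)   # toy 1-D field
w = rng.normal(size=32)   # arbitrary per-mode filter weights

shift = 5
out_then_shift = np.roll(spectral_layer(x, w), shift)
shift_then_out = spectral_layer(np.roll(x, shift), w)

# Equivariance: the layer commutes with circular shifts.
print(np.allclose(out_then_shift, shift_then_out))  # True
```

This holds because a circular shift only multiplies each Fourier mode by a phase factor, which commutes with the pointwise filter; it is the reason Fourier-based parametrizations are translation-equivariant by construction.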