2021
DOI: 10.48550/arxiv.2112.14038
Preprint

DAS: A deep adaptive sampling method for solving partial differential equations

Abstract: In this work we propose a deep adaptive sampling (DAS) method for solving partial differential equations (PDEs), where deep neural networks are utilized to approximate the solutions of PDEs and deep generative models are employed to generate new collocation points that refine the training set. The overall procedure of DAS consists of two components: solving the PDEs by minimizing the residual loss on the collocation points in the training set and generating a new training set to further improve the accuracy of…
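The abstract describes DAS as alternating between two components: minimizing a residual loss on the current collocation points and generating a refined training set with a deep generative model. The snippet below is a minimal sketch of the first component only (the residual-loss minimization), using PyTorch and a 1D Poisson problem -u''(x) = f(x) with zero Dirichlet boundary conditions as illustrative choices; the network architecture, optimizer settings, and the manufactured right-hand side are assumptions, not details from the paper.

```python
# Sketch of the "solve" component of a DAS-style loop: a neural network u_theta
# approximates the PDE solution and is trained by minimizing the squared PDE
# residual on a fixed set of collocation points. Illustrative toy problem, not
# one of the paper's test cases.
import math
import torch

u_theta = torch.nn.Sequential(
    torch.nn.Linear(1, 64), torch.nn.Tanh(),
    torch.nn.Linear(64, 64), torch.nn.Tanh(),
    torch.nn.Linear(64, 1),
)
opt = torch.optim.Adam(u_theta.parameters(), lr=1e-3)

def f(x):
    # Manufactured right-hand side so that u(x) = sin(pi x) solves -u'' = f.
    return (math.pi ** 2) * torch.sin(math.pi * x)

def residual(x):
    # PDE residual r(x) = -u''(x) - f(x), with derivatives from autograd.
    x = x.detach().requires_grad_(True)
    u = u_theta(x)
    du = torch.autograd.grad(u, x, torch.ones_like(u), create_graph=True)[0]
    d2u = torch.autograd.grad(du, x, torch.ones_like(du), create_graph=True)[0]
    return -d2u - f(x)

collocation = torch.rand(1000, 1)        # current training set of collocation points
boundary = torch.tensor([[0.0], [1.0]])  # Dirichlet boundary, u(0) = u(1) = 0

for step in range(2000):
    opt.zero_grad()
    loss = residual(collocation).pow(2).mean() + u_theta(boundary).pow(2).mean()
    loss.backward()
    opt.step()
```

The second component, where a deep generative model concentrates new collocation points in regions of large residual, is not sketched here; a simple residual-proportional resampling stand-in appears after the first citation statement below.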

Cited by 11 publications (17 citation statements)
References 32 publications
“…However, for high-dimensional problems, we need to use other methods, such as generative adversarial networks (GANs) [46], as was done in Ref. [41]. Moreover, the probability of sampling a point x is only considered as p(x) ∝ ε^k(x)/E[ε^k(x)] + c. While this probability works very well in this study, it is possible that there exists another better choice.…”
Section: Discussion (mentioning, confidence: 99%)
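The sampling density quoted above, p(x) ∝ ε^k(x)/E[ε^k(x)] + c, can be realized in practice by self-normalized resampling over a finite candidate pool. The sketch below assumes the `residual` function from the snippet after the abstract; the pool size and the values of k and c are illustrative hyperparameters, not taken from the cited papers.

```python
# Residual-proportional resampling: draw a uniform candidate pool and pick new
# collocation points with probability proportional to eps^k(x)/E[eps^k(x)] + c.
import torch

def resample_by_residual(residual_fn, n_new=500, n_candidates=10_000, k=1.0, c=1.0):
    pool = torch.rand(n_candidates, 1)                      # uniform candidate pool
    eps = residual_fn(pool).detach().abs().squeeze(-1) ** k  # pointwise error indicator
    weights = eps / eps.mean() + c                           # unnormalized p(x)
    idx = torch.multinomial(weights / weights.sum(), n_new, replacement=True)
    return pool[idx]

# Example: refresh the training set with points concentrated where the residual is large.
# collocation = resample_by_residual(residual)
```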
“…During the preparation of this paper, a few new studies appeared [38,39,40,41,42,43,44] that also proposed modified versions of RAR or PDF-based resampling. Most of these methods are special cases of the proposed RAD and RAR-D, and our methods can achieve better performance.…”
Section: Related Work and Our Contributions (mentioning, confidence: 99%)
“…[23] found it challenging for the vanilla PINN to solve the Allen-Cahn and Cahn-Hilliard equations and improved results after adopting a similar RAR strategy. Tang et al. [38] introduced an additional neural network acting as an error indicator to guide the refinement, which is aimed at avoiding the large variance in uniform random sampling. Hanna et al. [39] estimated probabilistic density functions to adaptively generate additional nodes based on the PDE residuals.…”
Section: Introduction (mentioning, confidence: 99%)
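The RAR strategy mentioned in the quote above adds new points greedily where the residual is largest, rather than sampling them from a density. A minimal sketch follows, again reusing the `residual` function from the first snippet; the candidate-pool size and the number of added points are illustrative choices.

```python
# RAR-style refinement: evaluate the PDE residual on a dense random candidate
# set and append the points with the largest absolute residual to the training set.
import torch

def rar_refine(residual_fn, train_points, n_candidates=10_000, n_add=100):
    pool = torch.rand(n_candidates, 1)
    eps = residual_fn(pool).detach().abs().squeeze(-1)
    worst = torch.topk(eps, n_add).indices      # indices of the largest residuals
    return torch.cat([train_points, pool[worst]], dim=0)

# Example: grow the training set where the current network is least accurate.
# collocation = rar_refine(residual, collocation)
```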