2021
DOI: 10.48550/arxiv.2106.05587
Preprint

A Discontinuity Capturing Shallow Neural Network for Elliptic Interface Problems

Abstract: In this paper, a new Discontinuity Capturing Shallow Neural Network (DCSNN) for approximating d-dimensional piecewise continuous functions and for solving elliptic interface problems is developed. There are three novel features in the present network: (i) jump discontinuities are captured sharply, (ii) it is completely shallow, consisting of only one hidden layer, and (iii) it is completely mesh-free for solving partial differential equations (PDEs). We first continuously extend the d-dimensional piecewise continuous…
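To make the abstract's second feature concrete, below is a minimal sketch of a "completely shallow" network that takes the spatial coordinates together with one augmented coordinate as input, assuming d = 2 spatial dimensions. The class name, layer width, and sigmoid activation are illustrative choices, not taken verbatim from the paper.

```python
# A hedged sketch: one hidden layer over the augmented input (x_1, ..., x_d, z).
import torch
import torch.nn as nn

class ShallowAugmentedNet(nn.Module):
    def __init__(self, spatial_dim: int = 2, width: int = 30):
        super().__init__()
        # d spatial coordinates plus one augmented coordinate labeling the subdomain
        self.hidden = nn.Linear(spatial_dim + 1, width)
        self.out = nn.Linear(width, 1)

    def forward(self, xz: torch.Tensor) -> torch.Tensor:
        return self.out(torch.sigmoid(self.hidden(xz)))

net = ShallowAugmentedNet()
xz = torch.rand(8, 3)      # 8 sample points, each (x1, x2, z)
print(net(xz).shape)       # torch.Size([8, 1])
```

Because there is only a single hidden layer, the trainable parameters reduce to two small weight matrices, which is what makes the approach "shallow" compared with deep PINN-type architectures.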


Cited by 5 publications (5 citation statements)
References 20 publications
“…Moreover, we can extend this to any function which has a finite number of discontinuities. This is done using the discontinuity capturing shallow neural network (DCSNN) technique described in [11]. The key idea is that we can extend a d-dimensional piecewise continuous function to a (d + 1)-dimensional continuous function by adding a new variable to parameterize the domain.…”
Section: Universal Approximation Theorem (mentioning)
confidence: 99%
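The statement above is the core of the augmentation idea. Below is a minimal sketch, assuming a 1-D piecewise continuous target with a single jump at x = 0.5; the particular blend used for the extension is one illustrative choice, not the network learned in [11].

```python
# A hedged sketch of extending a d-dimensional piecewise continuous function to a
# (d + 1)-dimensional continuous one by adding an augmented coordinate z.
import numpy as np

def u_piecewise(x):
    """d = 1 target: sin(x) left of the jump at x = 0.5, cos(x) + 2 to the right."""
    return np.where(x < 0.5, np.sin(x), np.cos(x) + 2.0)

def u_extended(x, z):
    """A function continuous in (x, z) that reproduces the left branch on the
    slice z = -1 and the right branch on the slice z = +1."""
    return 0.5 * (1.0 - z) * np.sin(x) + 0.5 * (1.0 + z) * (np.cos(x) + 2.0)

x = np.linspace(0.0, 1.0, 11)
z = np.where(x < 0.5, -1.0, 1.0)     # augmented coordinate assigned per subdomain
assert np.allclose(u_piecewise(x), u_extended(x, z))
```

Restricted to the slices z = -1 and z = +1, the extended function agrees with the original piecewise one, but as a function of (x, z) it has no discontinuity, so a single continuous network can approximate it.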
“…As discussed earlier in the paper, the DCSNN technique described in [11] is utilized to deal with the discontinuous constant coefficients across the two domains. Using the protocol described in equations 11 and 12, an extra feature is introduced to extend the discontinuous function in d dimensions to a continuous function in d + 1 dimensions.…”
Section: Algorithm Formulation (mentioning)
confidence: 99%
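The same extra feature lets piecewise-constant coefficients be evaluated without any mesh or interface bookkeeping at collocation time. The sketch below is a hedged illustration only; the values beta_minus/beta_plus, the z = ±1 convention, and the function name are assumptions, not the specific protocol of the cited equations.

```python
# A minimal sketch: read a discontinuous constant coefficient off the augmented
# coordinate z instead of locating the point relative to the interface.
import numpy as np

BETA_MINUS, BETA_PLUS = 1.0, 10.0     # constant coefficients inside / outside

def beta_from_label(z):
    """Coefficient as a function of the augmented coordinate only."""
    return np.where(z < 0.0, BETA_MINUS, BETA_PLUS)

z = np.array([-1.0, -1.0, 1.0, 1.0])  # labels carried by four collocation points
print(beta_from_label(z))             # [ 1.  1. 10. 10.]
```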
“…However, the above immersed interface method becomes more difficult and tedious to implement when the space dimension increases. Recently, the authors of this paper proposed a machine learning method called the discontinuity capturing shallow neural network (DCSNN) [17] to solve more general elliptic interface problems than the one in Eq. (3) and successfully obtained results comparable with the IIM.…”
Section: Elliptic Problems With Singular Sources On The Interface (mentioning)
confidence: 99%
“…We note that the construction of such a set of functions using neural network approximation requires additional effort. Here, inspired by DCSNN [17], where an augmented variable is introduced to precisely categorize the variables into each subdomain, we also introduce an augmented variable and require the function to be continuous throughout the whole domain. More precisely, consider a level set function φ(x) such that the position of the interface Γ is given by the zero level set, i.e.,…”
Section: Level Set Function Augmentation (mentioning)
confidence: 99%
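In practice the augmented coordinate can be assigned directly from the sign of the level set function. Below is a minimal sketch assuming a circular interface of radius 0.5 in 2-D; the particular φ and radius are illustrative, not the geometry used in the cited works.

```python
# A hedged sketch: z = sign(phi(x)) categorizes each point into its subdomain.
import numpy as np

def phi(x):
    """Level set function: negative inside the circle, positive outside,
    zero exactly on the interface Gamma."""
    return np.linalg.norm(x, axis=-1) - 0.5

def augmented_label(x):
    return np.sign(phi(x))

pts = np.array([[0.1, 0.1], [0.7, 0.0], [0.3, 0.3]])
print(augmented_label(pts))           # [-1.  1. -1.]
```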
“…They reformulated the interface problem as a least-squares problem and solved it by stochastic gradient descent. [22] proposed the discontinuity capturing shallow neural network (DCSNN) to approximate piecewise continuous functions and solved elliptic interface problems by minimizing a mean squared error loss built from the residual of the equation together with the boundary and interface jump conditions.…”
mentioning
confidence: 99%
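The composite loss described in this last statement can be sketched as three mean-squared terms. The Poisson-type model problem, the data f, g_bc, g_jump, and all helper names below are illustrative assumptions made for the sketch, not the exact formulation of [22]; the network is assumed to take a concatenated (x, z) input as in the earlier sketch.

```python
# A hedged sketch of an MSE loss: PDE residual + boundary condition + interface jump.
import torch

def laplacian(net, x, z):
    """Sum of second derivatives of u = net([x, z]) w.r.t. the spatial coordinates."""
    x = x.requires_grad_(True)
    u = net(torch.cat([x, z], dim=-1))
    grad = torch.autograd.grad(u.sum(), x, create_graph=True)[0]
    lap = 0.0
    for i in range(x.shape[1]):
        lap = lap + torch.autograd.grad(grad[:, i].sum(), x,
                                        create_graph=True)[0][:, i:i + 1]
    return lap, u

def total_loss(net, x_in, z_in, f, x_bc, z_bc, g_bc, x_if, g_jump):
    lap, _ = laplacian(net, x_in, z_in)
    r_pde = -lap - f                                        # residual of -Δu = f
    r_bc = net(torch.cat([x_bc, z_bc], dim=-1)) - g_bc      # boundary condition
    ones = torch.ones(x_if.shape[0], 1)
    # Jump [u] = u(+ side) - u(- side): same interface point on both z-slices.
    r_jump = (net(torch.cat([x_if, ones], dim=-1))
              - net(torch.cat([x_if, -ones], dim=-1)) - g_jump)
    return (r_pde ** 2).mean() + (r_bc ** 2).mean() + (r_jump ** 2).mean()

# Usage (with the hypothetical ShallowAugmentedNet sketched earlier):
#   loss = total_loss(net, x_in, z_in, f, x_bc, z_bc, g_bc, x_if, g_jump)
#   loss.backward()
```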