2020
DOI: 10.1007/s00521-020-05340-5
The neural network collocation method for solving partial differential equations

Cited by 25 publications (9 citation statements) · References 24 publications
“…a range [0.5, 3], [0.5, 4.5], [1, 2.5], [1, 4], [2.5, 6.5] for the first, second, third, fourth and fifth layers, respectively. The minimum RMSEs for the S6 and S7 tuples (S6* and S7*) are 0.0017 and 0.0009, respectively.…”
Section: Declarations
confidence: 99%
See 1 more Smart Citation
“…a range [0.5, 3], [0.5, 4.5], [1, 2.5], [1,4], [2.5, 6.5] for first, second, third, fourth and fifth layer respectively. The minimum RMSE for S6 and S7 tuples (S6 * and S7 * ) are 0.0017 and 0.0009 respectively.…”
Section: Declarationsmentioning
confidence: 99%
“…Recent developments in the field are due to Jafarian and Baleanu [17], Weinan et al. [11], Raissi et al. [21], Brink et al. [4], Moseley and Markham [1], Dwivedi and Srinivasan [10], and Samaniego et al. [31]. Moseley and Markham [1] defined the extended version of NNs called physics-informed neural networks (PINNs), and Dwivedi and Srinivasan [10] discussed the physics-informed extreme learning machine (PIELM).…”
Section: Introduction
confidence: 99%
“…In general, the activation function is chosen to be nonlinear, such as sigmoid or tanh. Here, we choose tanh as the activation function 22,66, i.e.,…”
Section: Neural Network and Data Sampling; PINN with Residual Units
confidence: 99%
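The choice of tanh as the hidden-layer activation, as in the excerpt above, can be illustrated with a minimal sketch. This is not the cited authors' code: the layer sizes, initialization, and input grid are illustrative assumptions; only the use of tanh on hidden layers with a linear output layer reflects the quoted statement.

```python
import numpy as np

rng = np.random.default_rng(0)

def init_params(sizes):
    """Random weights (scaled by fan-in) and zero biases for each layer."""
    return [(rng.standard_normal((m, n)) / np.sqrt(m), np.zeros(n))
            for m, n in zip(sizes[:-1], sizes[1:])]

def forward(params, x):
    """Forward pass: tanh on hidden layers, linear output layer."""
    for W, b in params[:-1]:
        x = np.tanh(x @ W + b)
    W, b = params[-1]
    return x @ W + b

# Evaluate the network on a few collocation-style points in [0, 1].
params = init_params([1, 16, 16, 1])
u = forward(params, np.linspace(0.0, 1.0, 5).reshape(-1, 1))
print(u.shape)  # (5, 1)
```

Because tanh is smooth, such a network has well-defined derivatives of all orders, which is why it is a common choice when the network output must be differentiated to form a PDE residual.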
“…These neurons, which are simple processing units, each have weights that produce weighted input signals and an output signal obtained by applying an activation function. The MLP reduces error via optimisation algorithms such as backpropagation [10,25,47].…”
Section: Multi-Layer Perceptron (MLP)
confidence: 99%
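The error-reduction process described in the excerpt above can be sketched as follows. This is a minimal illustration, not the cited authors' implementation: the toy target function, layer sizes, learning rate, and iteration count are all assumptions; it shows a one-hidden-layer MLP whose weights are adjusted by gradient-descent backpropagation so that a squared error decreases.

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(-1.0, 1.0, 64).reshape(-1, 1)
y = np.sin(np.pi * x)                      # toy regression target

# One hidden layer of 8 tanh units, linear output.
W1 = rng.standard_normal((1, 8)) * 0.5; b1 = np.zeros(8)
W2 = rng.standard_normal((8, 1)) * 0.5; b2 = np.zeros(1)
lr = 0.1

def mse():
    h = np.tanh(x @ W1 + b1)
    return float(np.mean((h @ W2 + b2 - y) ** 2))

err0 = mse()
for _ in range(500):
    # Forward pass.
    h = np.tanh(x @ W1 + b1)
    pred = h @ W2 + b2
    # Backward pass: gradients of the mean squared error.
    g = 2.0 * (pred - y) / len(x)
    gW2 = h.T @ g; gb2 = g.sum(0)
    gh = (g @ W2.T) * (1.0 - h ** 2)       # tanh'(z) = 1 - tanh(z)^2
    gW1 = x.T @ gh; gb1 = gh.sum(0)
    # Gradient-descent weight update.
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

print(err0 > mse())  # training reduced the error
```

The same loop structure applies regardless of the optimiser; replacing the plain gradient step with Adam or L-BFGS changes only the update lines, not the backpropagation of gradients.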