2021
DOI: 10.1016/j.neucom.2020.09.006

Distributed learning machines for solving forward and inverse problems in partial differential equations

Cited by 62 publications (21 citation statements)
References 20 publications

“…Raissi et al. (2019) and Raissi & Karniadakis (2018) introduced the physics-informed neural network (PINN) framework, which constrains neural networks with PDE derivatives computed using automatic differentiation (AD) (Baydin et al., 2018). In the past couple of years, the PINN framework has been extended to solve complicated PDEs representing complex physics (Jin et al., 2021; Mao et al., 2020; Rao et al., 2020; Wu et al., 2018; Qian et al., 2020; Dwivedi et al., 2021; Nabian et al., 2021; Kharazmi et al., 2021; Cai et al., 2021a; Bode et al., 2021; Taghizadeh et al., 2021; Lu et al., 2021c; Shukla et al., 2021; Hennigh et al., 2020; Li et al., 2021). More recently, alternative approaches that use discretization techniques with higher-order derivatives and specialized numerical schemes to compute derivatives have been shown to provide better regularization for faster convergence (Ranade et al., 2021b; Gao et al., 2021; Wandel et al., 2020; He & Pathak, 2020).…”
Section: Related Work (mentioning)
confidence: 99%
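
The PINN mechanism described in the excerpt above can be illustrated with a short sketch. The following Python/JAX code is a minimal illustration, not the code of the cited works: a small fully connected network u(params, x) is trained so that the residual of an assumed 1D Poisson problem u''(x) = f(x) on [0, 1] with u(0) = u(1) = 0, evaluated with nested automatic differentiation, vanishes at collocation points. The layer sizes, source term f, and collocation grid are all illustrative assumptions.

# Minimal PINN sketch in JAX (illustrative assumptions: 1D Poisson problem,
# small tanh network, 64 collocation points; not the cited authors' code).
import jax
import jax.numpy as jnp

def init_params(key, sizes=(1, 32, 32, 1)):
    # Random initialisation of a small fully connected network.
    params = []
    for m, n in zip(sizes[:-1], sizes[1:]):
        key, sub = jax.random.split(key)
        params.append((jax.random.normal(sub, (m, n)) / jnp.sqrt(m), jnp.zeros(n)))
    return params

def u(params, x):
    # Network approximation u_theta(x) for a scalar coordinate x.
    h = jnp.array([x])
    for W, b in params[:-1]:
        h = jnp.tanh(h @ W + b)
    W, b = params[-1]
    return (h @ W + b)[0]

def f(x):
    # Assumed source term; the exact solution is then u(x) = sin(pi * x).
    return -(jnp.pi ** 2) * jnp.sin(jnp.pi * x)

def loss(params, x_col):
    # PDE residual u''(x) - f(x) via nested automatic differentiation,
    # plus a penalty enforcing the Dirichlet conditions u(0) = u(1) = 0.
    u_xx = jax.vmap(jax.grad(jax.grad(u, argnums=1), argnums=1),
                    in_axes=(None, 0))(params, x_col)
    pde = jnp.mean((u_xx - f(x_col)) ** 2)
    bc = u(params, 0.0) ** 2 + u(params, 1.0) ** 2
    return pde + bc

key = jax.random.PRNGKey(0)
params = init_params(key)
x_col = jnp.linspace(0.0, 1.0, 64)
value, grads = jax.value_and_grad(loss)(params, x_col)  # feed grads to any optimiser

The gradients returned at the end can be passed to any gradient-based optimiser; the key point is that the only supervision comes from collocation points and boundary values, with the PDE residual itself acting as the loss.
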
“…The most distinctive property of the ELM is that its input-layer weights are set to random values and kept fixed throughout the training process, while the output-layer weights are the trainable parameters. Inspired by ELM and PINN, Dwivedi et al. [26,27] developed an ELM-based PDE solver called the physics-informed extreme learning machine (PIELM), a rapid variant of PINN that solves linear PDE problems efficiently. The PDE problem is transformed into a linear least-squares problem by incorporating the physical laws into the ELM cost function.…”
Section: Introduction (mentioning)
confidence: 99%
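
For contrast with the iteratively trained PINN above, the sketch below illustrates the ELM-style formulation described in this excerpt, again on the assumed 1D Poisson problem and with random frozen tanh features chosen for illustration rather than taken from Dwivedi et al. Because the PDE is linear and only the output-layer weights are unknown, the collocation residuals and boundary conditions stack into one linear system that is fitted by a single least-squares solve, with no iterative training.

# Minimal PIELM-style sketch in JAX (same assumed 1D Poisson problem; random
# frozen tanh features are an illustrative choice, not the authors' setup).
import jax
import jax.numpy as jnp

n_hidden, n_col = 50, 64
k1, k2 = jax.random.split(jax.random.PRNGKey(1))
W = jax.random.normal(k1, (n_hidden,))   # fixed random input weights
b = jax.random.normal(k2, (n_hidden,))   # fixed random biases

def phi(x):
    # Hidden-layer features for a scalar x; the ansatz is u(x) = phi(x) @ beta.
    return jnp.tanh(W * x + b)

def phi_xx(x):
    # Second derivative of every feature w.r.t. x, via automatic differentiation.
    return jax.jacfwd(jax.jacfwd(phi))(x)

def f(x):
    # Same assumed source term as above (exact solution u(x) = sin(pi * x)).
    return -(jnp.pi ** 2) * jnp.sin(jnp.pi * x)

x_col = jnp.linspace(0.0, 1.0, n_col)
# Rows: PDE residual at each collocation point, then the two boundary conditions.
A = jnp.vstack([jax.vmap(phi_xx)(x_col), phi(0.0)[None, :], phi(1.0)[None, :]])
y = jnp.concatenate([f(x_col), jnp.zeros(2)])
beta, *_ = jnp.linalg.lstsq(A, y)        # single least-squares solve, no iterations
u_pred = jax.vmap(phi)(x_col) @ beta     # approximate solution at the collocation points
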
“…Deng et al. (2019) proposed a generative-adversarial-network-based artificial intelligence framework to enhance the spatial resolution of the complicated wake flow behind two side-by-side cylinders, whereas Liu et al. (2020) proposed a multi-time-path CNN. To reduce the increase in visual complexity of a low-resolution (LR) input caused by data-driven upsampling approaches, various efforts (Raissi et al., 2019; Gao et al., 2020; Dwivedi et al., 2021) have been proposed. Despite their advantages, the success of these DL models relies mainly on a large quantity of offline high-resolution (HR) data as labels.…”
Section: Introduction (mentioning)
confidence: 99%