2015
DOI: 10.1016/j.amc.2015.05.122
Modeling of complex dynamic systems using differential neural networks with the incorporation of a priori knowledge

Cited by 12 publications (4 citation statements)
References 37 publications
“…In addition, in the constrained backpropagation training (Di Muro & Ferrari, 2008; Ferrari & Jensenius, 2008; He, Reif, & Unbehauen, 2000; Rudd, Muro, & Ferrari, 2014), the prior knowledge of boundary conditions was explicitly embedded as equality constraints and imposed on the weights of neural networks. Moreover, the prior knowledge of model forms and boundary conditions can be embedded as regularization terms in the objective function of a neural network to solve ODEs (Bellamine, Almansoori, & Elkamel, 2015; Malek & Shekari Beidokhti, 2006). The prior knowledge of model forms and boundary conditions can also be embedded as regularization terms in the objective function of a neural network by transforming the original PDEs into their weighted residual forms (Dissanayake & Phan-Thien, 1994).…”
Section: Physics-Constrained Machine Learning
Confidence: 99%
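The regularization-term embedding described in the excerpt above can be sketched in a few lines. This is a hypothetical, minimal illustration (all names are assumptions, not from the cited works): a quadratic surrogate u(x) = w0 + w1·x + w2·x² stands in for the neural network, and the objective penalizes both the ODE residual of u'(x) = -u(x) and the boundary condition u(0) = 1.

```python
# Minimal sketch (hypothetical names): the ODE residual and the boundary
# condition are both embedded as penalty terms in one training objective,
# in the spirit of the physics-constrained approaches cited above.
# A quadratic surrogate stands in for the neural network.

def penalized_loss(w, xs, lam=10.0):
    """Objective for u'(x) = -u(x), u(0) = 1, sampled at the points xs."""
    u  = lambda x: w[0] + w[1] * x + w[2] * x ** 2   # surrogate model
    du = lambda x: w[1] + 2 * w[2] * x               # its exact derivative
    residual = sum((du(x) + u(x)) ** 2 for x in xs) / len(xs)  # ODE term
    boundary = (u(0.0) - 1.0) ** 2                   # boundary-condition term
    return residual + lam * boundary                 # regularized objective

xs = [i / 10 for i in range(11)]
# Taylor coefficients of exp(-x) yield a small loss; the zero model does not.
good = penalized_loss([1.0, -1.0, 0.5], xs)
bad  = penalized_loss([0.0, 0.0, 0.0], xs)
```

In a real implementation the weights w would be trained by gradient descent, with the penalty weight lam trading off data fit against constraint satisfaction, exactly the multi-objective structure the excerpt describes.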
“…The neural network is designed to perform the monitoring tasks defined by input sequences and sequences of desired values corresponding to the output neurons (reference model) [22][23][24][25]. The NARX dynamic neural network structure proposed in this work is shown in Figure 6.…”
Section: Vibrations Modeling Using Dynamic Neural Network
Confidence: 99%
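The NARX structure mentioned in the excerpt above predicts the next output from delayed outputs and delayed exogenous inputs, y(k) = f(y(k-1), …, y(k-na), u(k-1), …, u(k-nb)). A minimal sketch of how the regressor vector feeding such a network is assembled (lag orders na, nb and all names are assumptions for illustration):

```python
# Minimal sketch (assumed lag orders): the regressor vector of a NARX
# model, built from delayed outputs (feedback) and delayed inputs, which
# is then fed to the static network f at each time step k.

def narx_regressor(y, u, k, na=2, nb=2):
    """Lagged outputs and inputs that feed the network at step k."""
    past_y = [y[k - i] for i in range(1, na + 1)]  # delayed outputs
    past_u = [u[k - i] for i in range(1, nb + 1)]  # delayed inputs
    return past_y + past_u

y = [0.0, 0.1, 0.3, 0.6]   # measured output sequence
u = [1.0, 1.0, 0.5, 0.5]   # exogenous input sequence
vec = narx_regressor(y, u, k=3)   # [y[2], y[1], u[2], u[1]]
```

During training the measured outputs fill the feedback taps (series-parallel mode); in deployment the network's own predictions are fed back instead.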
“…However, the quality of the prediction can be greatly improved by using customized multi-objective loss functions during the network training phase. Although at the early stages of NN analysis custom loss functions were not commonly used, the idea of a multi-objective loss function has attracted considerable interest, particularly in physics-based NN modeling [38][39][40]. This aspect will be discussed in the following sections.…”
Section: Introduction
Confidence: 99%