2014
DOI: 10.1016/j.ins.2014.05.036

Constructing ordinary sum differential equations using polynomial networks


Cited by 21 publications (10 citation statements, 2015–2023) · References 21 publications
“…Only some of the potential fraction terms (neurons) may be included in a PDE solution; their optimal combination can be searched for by the binary Particle Swarm Optimization (PSO). The Gradient Steepest Descent (GSD) method can update the block and neuron polynomial parameters (Zjavka and Snášel, 2014). The root mean square error (RMSE) was applied for the training, test and…”
Section: Appendix B · mentioning (confidence: 99%)
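The statement above pairs a combinatorial search (binary PSO) with continuous parameter tuning (GSD) scored by RMSE. As a rough sketch of the GSD half only: one steepest-descent step on a polynomial parameter vector, with the RMSE gradient estimated numerically. The predict callable, learning rate and finite-difference scheme are assumptions for illustration, not the authors' implementation.

import numpy as np

def rmse(pred, target):
    # Root mean square error, the training/test criterion named above
    return np.sqrt(np.mean((pred - target) ** 2))

def gsd_step(params, X, t, predict, lr=0.01, eps=1e-6):
    # One Gradient Steepest Descent step on block/neuron polynomial
    # parameters; the RMSE gradient is estimated by forward differences.
    base = rmse(predict(params, X), t)
    grad = np.zeros_like(params)
    for i in range(params.size):
        shifted = params.copy()
        shifted[i] += eps
        grad[i] = (rmse(predict(shifted, X), t) - base) / eps
    return params - lr * grad

A real implementation would more likely use analytic derivatives of the polynomial terms; the numerical gradient just keeps the sketch self-contained.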
“…The binary PSO uses binary operators and the corresponding coefficient settings in the standard PSO (velocity) equations to form new individuals [15]. Parameters of polynomials and PDE term weights are represented by real numbers, randomly initialized from the interval [0.5, 1.5] and adjusted by means of the gradient steepest descent method [17] combined with a differential evolution algorithm (EA) [16], performed simultaneously with the best-fit neuron combination search [18]. The approximation ability of the D-PNN (and GMDH) for complicated periodic functions can be improved by a sigmoidal transformation (sig) of the squared power items, together with their parameters, in both the neuron and block output polynomials (30).…”
Section: Fig. 4, 3-variable multi-layer D-PNN with 2-variable combination blocks · mentioning (confidence: 99%)
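The binary PSO cited here [15] typically follows Kennedy and Eberhart's discrete variant: velocities obey the standard real-valued PSO update, and each position bit is then resampled with probability sigmoid(velocity). A minimal sketch under that assumption; the fitness callback (e.g. validation RMSE of the selected neuron combination) and all hyperparameters are illustrative, not taken from the paper.

import numpy as np

rng = np.random.default_rng(0)

def binary_pso(fitness, n_bits, n_particles=20, iters=100,
               w=0.7, c1=1.5, c2=1.5):
    # Each particle is a 0/1 mask selecting which neurons (PDE terms)
    # enter the model; lower fitness is better.
    pos = rng.integers(0, 2, (n_particles, n_bits))
    vel = np.zeros((n_particles, n_bits))
    pbest = pos.copy()
    pbest_fit = np.array([fitness(p) for p in pos])
    gbest = pbest[pbest_fit.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, n_bits))
        # Standard PSO velocity update: inertia plus pulls toward the
        # personal-best and global-best masks
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        # Binary step: set each bit with probability sigmoid(velocity)
        prob = 1.0 / (1.0 + np.exp(-vel))
        pos = (rng.random((n_particles, n_bits)) < prob).astype(int)
        fit = np.array([fitness(p) for p in pos])
        better = fit < pbest_fit
        pbest[better], pbest_fit[better] = pos[better], fit[better]
        gbest = pbest[pbest_fit.argmin()].copy()
    return gbest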
“…While using 2 input variables, an equivalent 2nd-order PDE (3) may be expressed in the form (18), whose derivative variables in the PDE terms correspond exactly to all the GMDH polynomial (2) variables. The 2-variable block neurons form and substitute for all the relevant partial derivative terms, so each block includes 5 simple neurons formed with respect to the 2 single linear x1, x2 (19), the 2 squared x1², x2² (20) and the 1 combination x1x2 (21) derivative variables of the 2nd-order PDE substitution (18) in a searched 2-variable u function model.…”
mentioning (confidence: 99%)
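The numbered equations (2), (3) and (18)–(21) are not reproduced in this excerpt. A plausible reconstruction from the GMDH and D-PNN literature, showing how the five block neurons line up with the five derivative variables, would be (LaTeX; treat as an assumption, not a verbatim copy):

y = a_0 + a_1 x_1 + a_2 x_2 + a_3 x_1 x_2 + a_4 x_1^2 + a_5 x_2^2   % GMDH base polynomial (2)

a + b\,u + \sum_{i=1}^{n} c_i \frac{\partial u}{\partial x_i}
  + \sum_{i,j=1}^{n} d_{ij} \frac{\partial^2 u}{\partial x_i\,\partial x_j} = 0   % general 2nd-order PDE (3)

For n = 2 the five derivative variables are \partial u/\partial x_1 and \partial u/\partial x_2 (the 2 linear terms), \partial^2 u/\partial x_1^2 and \partial^2 u/\partial x_2^2 (the 2 squared terms), and \partial^2 u/(\partial x_1\,\partial x_2) (the 1 combination term), matching the five neurons per block.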
“…Differential equations can describe physical or natural systems that are difficult to model by unique exact functions; the solutions can apply power series [8] or wave series [9], genetic programming [10] and neural networks [11]. Polynomial networks may be extended to apply some mathematical principles (Section 2) to form and substitute for general differential equations [12].…”
Section: Introduction · mentioning (confidence: 99%)
“…Multi-layer differential polynomial neural network: a multi-layer PNN forms composite polynomial functions (Fig. 2); the previous layers form internal functions y_i (12), which substitute for the input variables of the neuron and block polynomials in the next hidden layer to form external functions f(y) (13). Composite PDE terms, i.e.…”
Section: Introduction · mentioning (confidence: 99%)
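To make the internal/external composition concrete, a minimal sketch follows; the 2-variable block polynomial form and all names are assumed for illustration, not taken from the paper.

def block_poly(x1, x2, a):
    # Assumed 2-variable GMDH-style block output polynomial
    return (a[0] + a[1] * x1 + a[2] * x2 + a[3] * x1 * x2
            + a[4] * x1 ** 2 + a[5] * x2 ** 2)

def composite_output(x1, x2, x3, a12, a13, a_ext):
    # Previous layer: internal functions y_i (12) formed from pairs
    # of input variables
    y1 = block_poly(x1, x2, a12)
    y2 = block_poly(x1, x3, a13)
    # Next hidden layer: the y_i substitute for its input variables,
    # forming the external (composite) function f(y) (13)
    return block_poly(y1, y2, a_ext)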