2016
DOI: 10.4236/apm.2016.66033
A Back Propagation-Type Neural Network Architecture for Solving the Complete n × n Nonlinear Algebraic System of Equations

Abstract: The objective of this research is the presentation of a neural network capable of solving complete nonlinear algebraic systems of n equations with n unknowns. The proposed neural solver uses the classical back propagation algorithm with the identity function as the output function, and supports the feature of the adaptive learning rate for the neurons of the second hidden layer. The paper presents the fundamental theory associated with this approach as well as a set of experimental results that evaluate the pe…

Cited by 5 publications (1 citation statement)
References 29 publications
“…The new updating rule dynamically adjusts the step size during training, and rotates after a set number of iterations between the Manhattan updating rule and the Adam updating rule to efficiently handle problems with sparse and vanishing gradients. Also expanding upon the work of Margaris et al. [174], Goulianas et al. introduced the General Back-propagation with Adaptive Learning Rate (GBALR) algorithm, which allows the activation functions of the second hidden layer to be any function, including non-algebraic functions [176].…”
Section: Neural Network-based Optimization
confidence: 99%
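The rotation scheme described in the citing statement — alternating after a fixed number of iterations between the Manhattan rule (which steps by the sign of the gradient, ignoring its magnitude) and the Adam rule (which uses bias-corrected moment estimates) — can be sketched as follows. This is a minimal illustration, not the cited algorithm itself: the function name, the `switch_every` schedule, and the hyperparameter defaults are all assumptions for the sake of the example.

```python
import numpy as np

def alternating_update(grads, state, t, lr=0.01, switch_every=50,
                       beta1=0.9, beta2=0.999, eps=1e-8):
    """Return a parameter delta, rotating between the Manhattan and
    Adam updating rules every `switch_every` iterations (illustrative).

    `state` holds Adam's moment estimates and its own step counter:
    {"m": zeros_like(params), "v": zeros_like(params), "k": 0}.
    """
    if (t // switch_every) % 2 == 0:
        # Manhattan rule: fixed-size step in the direction opposing
        # the gradient sign; robust to vanishing gradient magnitudes.
        return -lr * np.sign(grads)
    # Adam rule: exponential moving averages of the gradient and its
    # square, with bias correction based on Adam's own step count.
    state["k"] += 1
    state["m"] = beta1 * state["m"] + (1 - beta1) * grads
    state["v"] = beta2 * state["v"] + (1 - beta2) * grads ** 2
    m_hat = state["m"] / (1 - beta1 ** state["k"])
    v_hat = state["v"] / (1 - beta2 ** state["k"])
    return -lr * m_hat / (np.sqrt(v_hat) + eps)
```

A usage sketch: minimizing f(w) = ||w||² with gradient 2w, the rotation drives the parameters toward zero while each phase uses its own rule.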