2021
DOI: 10.2478/jaiscr-2021-0017
A Novel Fast Feedforward Neural Networks Training Algorithm

Abstract: In this paper a new neural network training algorithm is presented. The algorithm originates from the Recursive Least Squares (RLS) method commonly used in adaptive filtering. It uses the QR decomposition in conjunction with Givens rotations to solve the normal equation resulting from minimization of the loss function. An important parameter in neural networks is training time. Many commonly used algorithms require a large number of iterations to achieve a satisfactory outcome, while other algor…
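The abstract mentions solving the normal equation via QR decomposition with Givens rotations. As a rough illustration only — this is my own sketch of the generic Givens-QR least-squares step, not the authors' training algorithm, and `givens_qr_solve` is a hypothetical name — a small least-squares system can be triangularized by rotating adjacent rows of the augmented matrix:

```python
import numpy as np

def givens_qr_solve(A, b):
    """Solve min ||Ax - b|| by triangularizing [A | b] with Givens
    rotations, then back-substituting (generic sketch, not the paper's
    RLS-based method)."""
    R = np.column_stack([A.astype(float), b.astype(float)])
    m, n = A.shape
    for j in range(n):
        # Zero the entries below the diagonal in column j, bottom-up,
        # each time rotating row i into row i-1.
        for i in range(m - 1, j, -1):
            a, g = R[i - 1, j], R[i, j]
            r = np.hypot(a, g)
            if r == 0.0:
                continue
            c, s = a / r, g / r
            G = np.array([[c, s], [-s, c]])  # 2x2 Givens rotation
            R[[i - 1, i], j:] = G @ R[[i - 1, i], j:]
    # Back-substitution on the upper-triangular n x n block.
    x = np.zeros(n)
    for k in range(n - 1, -1, -1):
        x[k] = (R[k, n] - R[k, k + 1:n] @ x[k + 1:]) / R[k, k]
    return x
```

Each rotation zeroes one sub-diagonal entry while preserving the residual norm, which is why the same mechanism fits a recursive (row-by-row) update as in RLS.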

Cited by 15 publications (2 citation statements). References 31 publications.
“…However, it requires different data, where the proportion between correct and abnormal data will be more even. The algorithm can also be implemented using other mechanisms of computational intelligence, for example other artificial neural network structures [22], using parallel computing mechanisms or statistical methods.…”
Section: Discussion
confidence: 99%
“…Such an approximation of the maximum function, called a smooth maximum function, is used in this article to construct a novel type-reduction method for interval type-2 fuzzy logic. The properties of the smooth maximum function make it suitable for optimizing interval type-2 fuzzy systems with techniques such as gradient descent [2], conjugate gradient [3], or second-order gradient methods [1], smoothly optimizing the interval type-2 fuzzy logic system as new data becomes available. Consequently, in this paper, we derive a new adaptive structure of the interval type-2 fuzzy logic system equipped with smooth type-reduction.…”
Section: Introduction
confidence: 99%
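The excerpt above does not reproduce the smooth maximum formula itself. One widely used differentiable approximation of max is the softmax-weighted (Boltzmann) mean; the sketch below assumes that form — the citing article may use a different approximation — to show why such a function admits gradient-based optimization:

```python
import numpy as np

def smooth_max(x, y, alpha=50.0):
    """Softmax-weighted (Boltzmann) approximation of max(x, y).
    Differentiable everywhere, unlike max; approaches max(x, y)
    as alpha grows. Illustrative only; not necessarily the form
    used in the cited article."""
    m = np.maximum(alpha * x, alpha * y)  # shift exponents for stability
    wx = np.exp(alpha * x - m)
    wy = np.exp(alpha * y - m)
    return (x * wx + y * wy) / (wx + wy)
```

Because the weights vary smoothly with the inputs, gradients of a loss through `smooth_max` are well defined at the crossover point x = y, which is exactly where the hard max is non-differentiable.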