2023
DOI: 10.1016/j.neucom.2023.02.012
Formal convergence analysis on deterministic ℓ1-regularization based mini-batch learning for RBF networks

Cited by 4 publications (2 citation statements)
References 54 publications
“…For more detailed information about the advantages and efficiency of B-spline artificial neural networks, the interested reader is referred to the book [32]. The convergence properties of some gradient-based algorithms commonly utilized for training of some classes of artificial neural networks as used in this work can be examined in [33,34]. In this sense, the expected performance of the ANN depends on the correct delimitation of the training algorithm considering typical behavior of the system under analysis, starting with typical steady state conditions.…”
Section: Sliding-mode Differential-flatness Control
confidence: 99%
“…The predictive ability of robust RBF networks has been compared only on a very small number of datasets with outliers [47] so that robust RBF networks started to penetrate to real applications only slowly. Moreover, the literature is void of regularized versions of robust RBF networks, although various regularization types have been often exploited for the plain (non-robust) RBF networks [27]. This paper is interested in highly robust regularized versions of RBF networks.…”
Section: Introduction
confidence: 99%
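The cited paper's topic, ℓ1-regularized mini-batch learning for RBF networks, can be illustrated with a minimal sketch. This is not the authors' algorithm or proof setting; it is a toy assumption: Gaussian RBF features with fixed centers, mini-batch gradient steps on the squared error, and a soft-thresholding (proximal) step for the ℓ1 penalty. All function names and parameters here are illustrative.

```python
import numpy as np

def rbf_features(X, centers, sigma):
    # Gaussian RBF activations: phi[i, j] = exp(-||x_i - c_j||^2 / (2 sigma^2))
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def train_rbf_l1(X, y, centers, sigma=1.0, lam=0.01, lr=0.1,
                 batch_size=8, epochs=300, seed=0):
    # Mini-batch proximal gradient descent on
    #   (1/2b) ||Phi w - y||^2 + lam * ||w||_1
    rng = np.random.default_rng(seed)
    w = np.zeros(centers.shape[0])
    n = X.shape[0]
    for _ in range(epochs):
        idx = rng.permutation(n)
        for start in range(0, n, batch_size):
            b = idx[start:start + batch_size]
            phi = rbf_features(X[b], centers, sigma)
            err = phi @ w - y[b]
            # gradient step on the squared-error term
            w -= lr * (phi.T @ err) / len(b)
            # soft-thresholding handles the (nonsmooth) l1 penalty
            w = np.sign(w) * np.maximum(np.abs(w) - lr * lam, 0.0)
    return w

# Toy usage: fit sin(x) on [-2, 2] with 10 fixed centers
X = np.linspace(-2, 2, 40)[:, None]
y = np.sin(X[:, 0])
centers = np.linspace(-2, 2, 10)[:, None]
w = train_rbf_l1(X, y, centers, sigma=0.5, lam=0.001, lr=0.2)
mse = np.mean((rbf_features(X, centers, 0.5) @ w - y) ** 2)
```

The soft-thresholding step is what drives some output weights exactly to zero, which is the usual motivation for ℓ1 over ℓ2 regularization in this setting.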