2011
DOI: 10.1109/tnn.2011.2109736

Reduced HyperBF Networks: Regularization by Explicit Complexity Reduction and Scaled Rprop-Based Training

Abstract: Hyper basis function (HyperBF) networks are generalized radial basis function neural networks in which the activation function is a radial function of a weighted distance. This generalization gives HyperBF networks a high capacity to learn complex functions, which in turn makes them susceptible to overfitting and poor generalization. Moreover, training a HyperBF network requires the weights, centers, and local scaling factors to be optimized simultaneously. In the case of a relatively large dataset with a l…
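For readers unfamiliar with the model, here is a minimal sketch of a HyperBF unit and the network output, assuming a Gaussian radial function and a per-unit scaling matrix parameterized through a factor R (so that Sigma = RᵀR is positive semi-definite by construction). The function names, the Gaussian choice, and the parameterization are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def hyperbf_unit(x, center, R):
    """Activation of one HyperBF unit: a radial function of a weighted
    distance. R is a factor of the scaling matrix (Sigma = R.T @ R), so
    the squared weighted distance (x - c)^T Sigma (x - c) is nonnegative
    by construction. The Gaussian is chosen here for illustration."""
    d = R @ (x - center)      # weighted residual
    return np.exp(-d @ d)     # Gaussian of the squared weighted distance

def hyperbf_output(x, centers, factors, weights, bias=0.0):
    """Network output: a weighted sum of unit activations plus a bias.
    Training must optimize weights, centers, and scaling factors
    simultaneously, which is what motivates the paper's Rprop-based scheme."""
    phis = np.array([hyperbf_unit(x, c, R) for c, R in zip(centers, factors)])
    return weights @ phis + bias
```

The training scheme named in the title builds on Rprop (resilient backpropagation). For reference, the sketch below shows a standard Rprop- update (sign-based, per-parameter adaptive step sizes, no weight backtracking) with the conventional default hyperparameters; the paper's scaled variant modifies this scheme, and those details are not reproduced here.

```python
import numpy as np

def rprop_minus_step(params, grads, steps, prev_grads,
                     eta_plus=1.2, eta_minus=0.5,
                     step_min=1e-6, step_max=50.0):
    """One Rprop- update. Step sizes grow where the gradient sign persists
    and shrink where it flips; gradient magnitudes are never used.
    Initialize steps to a small constant (e.g. 0.1) and prev_grads to zero."""
    sign_change = grads * prev_grads
    steps = np.where(sign_change > 0, np.minimum(steps * eta_plus, step_max), steps)
    steps = np.where(sign_change < 0, np.maximum(steps * eta_minus, step_min), steps)
    params = params - np.sign(grads) * steps   # move against the gradient sign
    return params, steps, grads                # grads become prev_grads next call
```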

Cited by 13 publications (7 citation statements; citing years 2013–2023). References 42 publications.
“…When classification is made by machine learning, the SVM method is considered as the most successful classification method due to its high accuracy and fast speed. The classification accuracy of HBFNNs based on regularization optimization can be comparable to SVM (Mahdi & Rouchka, 2011).…”
Section: Regularization Methods (citation type: mentioning; confidence: 98%)
“…where Σ_j is a positive-definite weighted matrix. According to different requirements, the matrix Σ_j can be divided into different forms (Mahdi & Rouchka, 2011) as follows:…”
Section: An Extension of RBFNNs - HBFNNs (citation type: mentioning; confidence: 99%)
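The citing text truncates the list of forms. As general background from the HyperBF literature (not reproduced from either paper, and using illustrative notation), the scaling matrix in the weighted norm is commonly specialized as follows:

```latex
% Weighted squared distance of HyperBF unit j:
%   \|x - c_j\|_{\Sigma_j}^2 = (x - c_j)^\top \Sigma_j (x - c_j)
% Common special cases of the positive-definite matrix \Sigma_j:
\Sigma_j = \tfrac{1}{\sigma^2} I    % one global spherical scale: classical RBF network
\Sigma_j = \tfrac{1}{\sigma_j^2} I  % per-unit spherical scale
\Sigma_j = \operatorname{diag}\!\left(\tfrac{1}{\sigma_{j1}^2}, \dots, \tfrac{1}{\sigma_{jd}^2}\right)  % axis-aligned elliptical scales
\Sigma_j \succ 0 \text{ (full)}     % general HyperBF: rotated ellipsoidal receptive fields
```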