1991
DOI: 10.1109/72.80341

Orthogonal least squares learning algorithm for radial basis function networks

Abstract: The radial basis function network offers a viable alternative to the two-layer neural network in many applications of signal processing. A common learning algorithm for radial basis function networks is based on first choosing randomly some data points as radial basis function centers and then using singular-value decomposition to solve for the weights of the network. Such a procedure has several drawbacks, and, in particular, an arbitrary selection of centers is clearly unsatisfactory. The authors propose an …
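The abstract contrasts random center selection with the authors' orthogonal least squares (OLS) approach, which greedily picks centers by how much each candidate reduces the residual error after orthogonalization against the centers already chosen. A minimal sketch of that idea, assuming Gaussian basis functions, candidate centers taken from the training points themselves, and a fixed width (function names and parameters here are illustrative, not the paper's notation):

```python
import numpy as np

def rbf(X, centers, width):
    """Gaussian radial basis design matrix: one column per center."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * width ** 2))

def ols_select(X, y, width, n_centers):
    """Greedy forward selection of RBF centers by error-reduction ratio,
    sketching the orthogonal least squares idea: orthogonalize each
    candidate column against the chosen ones, score its explained energy,
    and keep the best."""
    P = rbf(X, X, width)          # candidate regressors, one per data point
    selected, Q = [], []          # chosen indices and orthogonalized columns
    for _ in range(n_centers):
        best_j, best_err, best_q = None, -np.inf, None
        for j in range(P.shape[1]):
            if j in selected:
                continue
            q = P[:, j].copy()
            for qk in Q:          # Gram-Schmidt against already-chosen columns
                q -= (qk @ P[:, j]) / (qk @ qk) * qk
            nq = q @ q
            if nq < 1e-12:        # numerically dependent column, skip
                continue
            err = (q @ y) ** 2 / nq   # energy this candidate would explain
            if err > best_err:
                best_j, best_err, best_q = j, err, q
        selected.append(best_j)
        Q.append(best_q)
    centers = X[selected]
    # final linear weights by least squares on the selected basis
    w, *_ = np.linalg.lstsq(rbf(X, centers, width), y, rcond=None)
    return centers, w
```

This mirrors the selection criterion only; the paper's actual algorithm maintains the orthogonal decomposition incrementally rather than re-orthogonalizing each candidate from scratch as done here for clarity.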

Cited by 2,967 publications (1,326 citation statements) · References 18 publications
“…Very similar methods that also construct SLFNs sequentially had been reported earlier ( [3], [16], [12]). They all find the optimal linear weights of the output layer by solving the same linear system.…”
Section: Introduction (mentioning)
confidence: 70%
“…This method, which has been referred to as SV-SFNNs [13] or SAOCIF with Input strategy [12], is essentially equivalent to the Orthogonal Least Squares Learning algorithm [3] and to the Kernel Matching Pursuit with pre-fitting method [16]. In order to assess the relative performance of both approaches (EM-ELMs vs. SV-SFNNs) in a fair manner, an empirical study has been realized on twenty benchmark data sets, 10 for classification and 10 for regression, under the same conditions and using the same software.…”
Section: Discussion (mentioning)
confidence: 99%
“…However, their training involves learning not only the weights, but also the number of radial basis functions, the position of their centres and their width. To accomplish this task, we applied the orthogonal least squares algorithm [34]. The optimum spread and number of radial basis functions were experimentally determined.…”
Section: Radial Basis Function (mentioning)
confidence: 99%