1994
DOI: 10.1109/72.265957

Evolving space-filling curves to distribute radial basis functions over an input space

Abstract: An evolutionary neural network training algorithm is proposed for radial basis function (RBF) networks. The locations of basis function centers are not directly encoded in a genetic string, but are governed by space-filling curves whose parameters evolve genetically. This encoding causes each group of codetermined basis functions to evolve to fit a region of the input space. A network produced from this encoding is evaluated by training its output connections only. Networks produced by this evolutionary algori…
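The encoding the abstract describes can be illustrated with a hedged sketch: a genome of scalar positions along a space-filling curve is decoded into 2-D RBF centers, and only the output weights are fit, by linear least squares, to score the network. The Hilbert-curve mapping, the Gaussian width, and all other values below are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

rng = np.random.default_rng(0)

def hilbert_d2xy(order, d):
    """Map a 1-D index d along a Hilbert curve of the given order to (x, y)
    grid coordinates on a 2**order by 2**order grid (standard iterative
    algorithm)."""
    n = 2 ** order
    x = y = 0
    t = d
    s = 1
    while s < n:
        rx = 1 & (t // 2)
        ry = 1 & (t ^ rx)
        if ry == 0:           # rotate the quadrant when needed
            if rx == 1:
                x = s - 1 - x
                y = s - 1 - y
            x, y = y, x
        x += s * rx
        y += s * ry
        t //= 4
        s *= 2
    return x, y

def decode_centers(genome, order):
    """Decode a genome of scalars in [0, 1) into RBF centers in [0, 1]^2
    by walking along the Hilbert curve (a stand-in for the paper's evolved
    space-filling curves)."""
    n = 2 ** order
    total = n * n
    pts = [hilbert_d2xy(order, int(t * total) % total) for t in genome]
    return np.array(pts, dtype=float) / (n - 1)

def fitness(genome, X, y, order=4, sigma=0.15):
    """Evaluate a genome: place centers, then train ONLY the output
    weights (linear least squares), as the abstract describes."""
    centers = decode_centers(genome, order)
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    Phi = np.exp(-d2 / (2 * sigma ** 2))    # Gaussian design matrix
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return np.mean((Phi @ w - y) ** 2)      # MSE as the fitness signal
```

A genetic algorithm would then mutate and recombine the genomes (the curve positions), re-scoring each candidate with `fitness`; because the output layer is linear in the weights, each evaluation stays cheap.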

Cited by 97 publications
(32 citation statements)
References 24 publications
“…A very complete state of the art of the different approaches and characteristics of a wide range of EA and RBFNN combinations is given in [15]. For example, RBFNN design has been approached by using evolutionary clustering techniques with local search [16], hybrid algorithms [17], [18], or multiobjective algorithms [19], or by evolving only the basis functions [20], [21] or space-filling curves to generate the RBF centers [22].…”
Section: Related Work
confidence: 99%
“…Alternatively, all the parameters of the RBF network, the RBF centres, variances or covariance matrices, and weights, can be learnt together via a non-linear optimisation [24] . The optimisation process associated with this nonlinear learning approach, however, is highly complex and non-convex, and the genetic algorithm (GA) has been suggested to solve this type of nonlinear learning problems [25] , at the cost of an increased computational complexity.…”
Section: Introduction
confidence: 99%
“…The parameters of the RBF network, which include the center vectors and variances or covariance matrices of its hidden nodes, as well as the weights that connect the RBF nodes to the network output, can be trained together via nonlinear optimization using gradient based algorithms [27]- [31], the expectation-maximization (EM) algorithm [32], [33], or various evolutionary algorithms [34]- [38]. Generally speaking, learning based on such a nonlinear approach is computationally expensive and may encounter the problem of local minima.…”
confidence: 99%
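The joint nonlinear training that this excerpt contrasts with the evolutionary approach can be sketched as plain gradient descent on all RBF parameters at once: centers, widths, and output weights. The learning rate, network size, and widths below are illustrative assumptions, and the local-minima and cost issues the excerpt mentions are exactly what such a sketch is exposed to.

```python
import numpy as np

rng = np.random.default_rng(1)

def joint_train(X, y, m=8, lr=0.05, steps=1000):
    """Gradient descent on the MSE with respect to centers C, log-widths
    log_s, and output weights w of a Gaussian RBF network, all at once."""
    C = rng.random((m, X.shape[1]))          # random initial centers
    log_s = np.full(m, np.log(0.3))          # widths kept positive via log
    w = np.zeros(m)
    N = len(y)
    for _ in range(steps):
        s2 = np.exp(2 * log_s)                                # widths^2, (m,)
        d2 = ((X[:, None, :] - C[None, :, :]) ** 2).sum(-1)   # (N, m)
        Phi = np.exp(-d2 / (2 * s2))
        err = Phi @ w - y                                     # (N,)
        E = (err[:, None] * Phi) * w[None, :] / N             # shared factor
        gw = Phi.T @ err / N                                  # dMSE/dw
        gC = (E[:, :, None] * (X[:, None, :] - C[None, :, :])).sum(0) / s2[:, None]
        gs = (E * d2).sum(0) / s2                             # dMSE/dlog_s
        w -= lr * gw
        C -= lr * gC
        log_s -= lr * gs
    return C, np.exp(log_s), w

def mse(X, y, C, s, w):
    d2 = ((X[:, None, :] - C[None, :, :]) ** 2).sum(-1)
    Phi = np.exp(-d2 / (2 * s ** 2))
    return np.mean((Phi @ w - y) ** 2)
```

Every parameter interacts with every other through the basis functions, so the loss surface is non-convex; this is why the cited works turn to the EM algorithm or evolutionary search, trading computational cost for robustness to local minima.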