2001
DOI: 10.20965/jaciii.2001.p0229

Evolving Basis Function Networks for System Identification

Abstract: This paper is concerned with learning and optimization of different basis function networks with respect to structure adaptation and parameter tuning. Basis function networks include the Volterra polynomial, Gaussian radial, B-spline, fuzzy, recurrent fuzzy, and local Gaussian basis function networks. Based on the creation and evolution of the type-constrained sparse tree, a unified framework is constructed in which structure adaptation and parameter adjustment of different basis function networks are addressed …
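As a point of reference for the Gaussian radial basis function case mentioned in the abstract, the following is a minimal sketch of how such a network maps inputs to outputs. The formulation (weighted sum of Gaussian basis functions) is the standard one; the function name and the example centres, widths, and weights are illustrative assumptions, not values from the paper.

```python
import numpy as np

def gaussian_rbf_network(x, centers, widths, weights):
    """Evaluate y(x) = sum_i w_i * exp(-||x - c_i||^2 / (2 * s_i^2)).

    x       : (d,)   input vector
    centers : (m, d) basis function centres c_i
    widths  : (m,)   basis function widths s_i
    weights : (m,)   output-layer weights w_i
    """
    sq_dist = np.sum((centers - x) ** 2, axis=1)   # ||x - c_i||^2 for each centre
    phi = np.exp(-sq_dist / (2.0 * widths ** 2))   # Gaussian basis activations
    return float(weights @ phi)                    # weighted sum at the output

# Example: a 2-input network with 3 Gaussian basis functions (illustrative values)
centers = np.array([[0.0, 0.0], [1.0, 1.0], [-1.0, 0.5]])
widths  = np.array([0.5, 1.0, 0.8])
weights = np.array([1.2, -0.7, 0.3])
print(gaussian_rbf_network(np.array([0.2, -0.1]), centers, widths, weights))
```

Structure adaptation in the paper's framework then amounts to changing the number and type of such basis terms, while parameter tuning adjusts the centres, widths, and weights.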

Cited by 13 publications (16 citation statements) · References: 0 publications
“…I1 is the operator set, where F = {*, /, sin, cos, exp, rlog} and T = {x, R} are the function and terminal sets. *, /, sin, cos, exp, rlog, x, and R denote multiplication, protected division (for all x, y ∈ R: when y = 0, x/0 = 1), sine, cosine, exponent, protected logarithm (for all x ∈ R, x ≠ 0: rlog(x) = log(abs(x)), and rlog(0) = 0), system inputs, and a random constant number, taking 2, 2, 1, 1, 1, 1, 0 and 0 arguments, respectively [18]. Depending on the specific problem, more complex functions can be added to the operator set I1, for example I1 = {*, /, sin, cos, rlog, tan, …, *3, *4, …}…”
Section: Representation Of Additive Expression Tree Model
confidence: 99%
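To make the protected operators concrete, here is a minimal Python sketch of the function and terminal sets described in the passage above, including protected division (x/0 = 1) and protected logarithm (rlog(x) = log(|x|), rlog(0) = 0). The helper names and the dictionary layout are my own; only the operator definitions and arities come from the quoted text.

```python
import math

def pdiv(x, y):
    """Protected division: for any x, division by zero is defined as 1."""
    return x / y if y != 0 else 1.0

def rlog(x):
    """Protected logarithm: rlog(x) = log(|x|) for x != 0, and rlog(0) = 0."""
    return math.log(abs(x)) if x != 0 else 0.0

# Function set F with arities, and terminal set T = {x, R}
FUNCTION_SET = {
    "*":    (lambda a, b: a * b, 2),
    "/":    (pdiv,               2),
    "sin":  (math.sin,           1),
    "cos":  (math.cos,           1),
    "exp":  (math.exp,           1),
    "rlog": (rlog,               1),
}
TERMINAL_SET = ("x", "R")   # system input and random constant, both of arity 0
```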
“…We have proposed a new representation scheme of the additive tree models for system identification, especially the identification of linear/nonlinear ODE systems [13,18]. Compared with genetic programming (GP) and gene expression programming (GEP), this model was more powerful than the GEP and GP models in terms of both accuracy and runtime.…”
Section: Introduction
confidence: 99%
“…The additive models, which resulted from our previously published work, are proposed for system identification, especially the reconstruction of polynomials and the identification of linear/nonlinear systems [10]. Thus we encode the right-hand side of linear and nonlinear mathematical models into an additive tree individual (please refer to Figure. …”
Section: Representation Of Additive Tree Model
confidence: 99%
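As an illustration of what encoding a right-hand side into a tree individual can look like, here is a hedged sketch of a recursive expression-tree evaluator over operators of the kind listed earlier. The Node class, its field names, and the small operator table are assumptions for illustration, not the authors' data structure.

```python
import math

# Minimal operator table, following the protected-operator conventions quoted above
FUNCTION_SET = {
    "*":   (lambda a, b: a * b, 2),
    "/":   (lambda a, b: a / b if b != 0 else 1.0, 2),
    "sin": (math.sin, 1),
    "cos": (math.cos, 1),
}

class Node:
    """One node of an expression tree: an operator symbol plus child subtrees,
    or a terminal ('x' for the system input, a float for a random constant R)."""
    def __init__(self, symbol, children=()):
        self.symbol = symbol
        self.children = list(children)

def evaluate(node, x):
    """Recursively evaluate the subtree rooted at `node` for input value x."""
    if node.symbol == "x":
        return x
    if isinstance(node.symbol, float):       # random constant R
        return node.symbol
    func, arity = FUNCTION_SET[node.symbol]
    args = [evaluate(child, x) for child in node.children]
    assert len(args) == arity
    return func(*args)

# Example: encode a right-hand side  f(x) = sin(x) * 0.5  as a small tree
rhs = Node("*", [Node("sin", [Node("x")]), Node(0.5)])
print(evaluate(rhs, 1.2))   # ~0.466
```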
“…+N, *, /, sin, cos, exp, rlog, x, and R denote addition, multiplication, protected division, sine, cosine, exponent, protected logarithm, system inputs, and a random constant number, respectively, and take N, 2, 2, 1, 1, 1, 1, 0 and 0 arguments [24]. N is an integer (the maximum number of ODE terms), I0 is the instruction set of the root node, and the instructions of the other nodes are selected from the instruction set I1 [20].…”
Section: B. Additive Tree Model
confidence: 99%
“…N is an integer (the maximum number of ODE terms), I0 is the instruction set of the root node, and the instructions of the other nodes are selected from the instruction set I1 [20]. It is worth mentioning that if the right-hand side of the ODEs is a polynomial, the instruction set I1 can be defined as I1 = {*2, *3, ..., *n, x1, x2, ..., xn, R} [24].…”
Section: B. Additive Tree Model
confidence: 99%
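Putting the two statements above together, the +N root sums the N term subtrees, and in the polynomial case each term is built from k-ary products (*2, *3, ...), state variables x1..xn, and random constants R. The sketch below illustrates this; the (symbol, children) tuple representation and the function names are assumptions for illustration, assuming *k denotes k-ary multiplication as in the quoted instruction set.

```python
import operator
from functools import reduce

def k_mul(*args):
    """k-ary multiplication, i.e. the *2, *3, ..., *n instructions."""
    return reduce(operator.mul, args, 1.0)

def eval_term(term, state):
    """Evaluate one term subtree, given state = {'x1': ..., 'x2': ...}."""
    symbol, children = term
    if isinstance(symbol, float):              # random constant R
        return symbol
    if symbol in state:                        # a state variable x_i
        return state[symbol]
    if symbol.startswith("*"):                 # *2, *3, ... : k-ary product
        return k_mul(*(eval_term(c, state) for c in children))
    raise ValueError(f"unknown symbol {symbol}")

def eval_additive_tree(terms, state):
    """Root node +N: the ODE right-hand side is the sum of its N term subtrees."""
    return sum(eval_term(t, state) for t in terms)

# Example: dx1/dt = 0.5*x1*x2 - 1.0*x1  encoded as two terms under a +2 root
terms = [
    ("*3", [(0.5, []), ("x1", []), ("x2", [])]),
    ("*2", [(-1.0, []), ("x1", [])]),
]
print(eval_additive_tree(terms, {"x1": 2.0, "x2": 3.0}))   # 0.5*2*3 - 1*2 = 1.0
```

In this reading, structure search chooses N and the shape of each term subtree, while parameter tuning adjusts the constants R attached to the terms.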