2008
DOI: 10.1504/ijmic.2008.020543
Model structure selection using an integrated forward orthogonal search algorithm assisted by squared correlation and mutual information

Abstract: Model structure selection plays a key role in nonlinear system identification. The first step in nonlinear system identification is to determine which model terms should be included in the model. Once significant model terms have been determined, a model selection criterion can then be applied to select a suitable model subset. The well-known orthogonal least squares type algorithms are one of the most efficient and commonly used techniques for mode…
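The forward orthogonal search the abstract refers to can be illustrated with a minimal sketch of OLS-type forward selection, where candidate terms are ranked by their squared correlation with the output (the error reduction ratio, ERR) and the remaining candidates are orthogonalised against each selected term. This is an illustrative reconstruction, not the paper's exact algorithm; all function and variable names here are assumptions.

```python
import numpy as np

def forward_orthogonal_select(X, y, n_terms):
    """Greedy forward selection of model terms (columns of X) ranked by the
    error reduction ratio (squared correlation with y), with a classical
    Gram-Schmidt orthogonalisation step after each selection."""
    X = X.astype(float)
    y = y.astype(float)
    selected = []
    residual = X.copy()          # candidates with selected directions removed
    sigma = y @ y                # output energy, normalises the ERR
    for _ in range(n_terms):
        num = (residual.T @ y) ** 2
        den = np.einsum('ij,ij->j', residual, residual) * sigma
        err = np.where(den > 1e-12, num / np.maximum(den, 1e-12), 0.0)
        err[selected] = -np.inf  # never reselect a chosen term
        k = int(np.argmax(err))
        selected.append(k)
        w = residual[:, k].copy()
        # remove the new orthogonal direction from all remaining candidates
        proj = (residual.T @ w) / (w @ w)
        residual = residual - np.outer(w, proj)
        residual[:, k] = 0.0
    return selected
```

At each step the term with the largest ERR explains the most remaining output variance, which is what makes the search greedy yet effective when candidate regressors are highly correlated.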

Cited by 90 publications (78 citation statements)
References 82 publications (87 reference statements)
“…The well-known orthogonal least squares (OLS) type of algorithms (Aguirre and Billings, 1995; Zhu and Billings, 1996; Wei and Billings, 2008) have been proven to be very effective for multiple dynamical regression problems, which involve a great number of candidate model terms or regressors that may be highly correlated. In the present study, the OLS algorithm given in , is used to solve the regression equation (4).…”
Section: Model Identification and Parameter Estimation
confidence: 99%
“…Similar to other constructive algorithms, models produced by the OPP algorithm may, however, be highly redundant. To remove or reduce redundancy, a forward orthogonal regression (FOR) learning algorithm [Billings & Wei, 2007a; Wei & Billings, 2007], implemented using a mutual information estimation method, is applied to refine and improve the model initially generated by the OPP algorithm.…”
Section: Constructing the GCNN Model
confidence: 99%
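The mutual information estimate that drives the FOR refinement step can be sketched with a simple histogram-based estimator: terms whose estimated mutual information with the output is highest are retained, and near-redundant terms are pruned. This is a generic stand-in for the estimator used in the cited FOR algorithm; the function name and bin count are assumptions.

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """Histogram-based estimate of the mutual information I(x; y) in nats,
    computed from the joint and marginal bin probabilities."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    p_xy = joint / joint.sum()
    p_x = p_xy.sum(axis=1, keepdims=True)   # marginal of x
    p_y = p_xy.sum(axis=0, keepdims=True)   # marginal of y
    mask = p_xy > 0                          # avoid log(0) terms
    return float(np.sum(p_xy[mask] * np.log(p_xy[mask] / (p_x @ p_y)[mask])))
```

A term strongly related to the output scores high even when the dependence is nonlinear, which is the advantage of mutual information over squared correlation for ranking candidate terms.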
“…However, like the conventional projection pursuit regression algorithm, the OPP algorithm may produce redundant models. To refine and improve the OPP-produced network models, the forward orthogonal regression (FOR) learning algorithm, assisted by a mutual information method [Billings & Wei, 2007a; Wei & Billings, 2007], is then applied to remove any severe redundancy.…”
Section: The OPP Algorithm for Coarse Model Identification
confidence: 99%