2018
DOI: 10.1080/21642583.2018.1496042
Nonlinear predictive model selection and model averaging using information criteria

Abstract: This paper is concerned with the model selection and model averaging problems in system identification and data-driven modelling for nonlinear systems. Given a set of data, the objective of model selection is to evaluate a series of candidate models and determine which one best presents the data. Three commonly used criteria, namely, Akaike information criterion, Bayesian information criterion and an adjustable prediction error sum of squares (APRESS) are investigated and their performance in model selection a…
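The three criteria named in the abstract all trade goodness of fit against model complexity. A minimal Python sketch of how they can be applied to polynomial model-size selection is shown below; the helper names are hypothetical, and the APRESS penalty form MSE/(1 − αk/N)² follows the commonly cited adjustable-PRESS definition (with α = 1 recovering generalized cross-validation) and should be checked against the paper:

```python
import numpy as np

def aic(mse, k, n):
    # Akaike information criterion: n*ln(MSE) + 2k
    return n * np.log(mse) + 2 * k

def bic(mse, k, n):
    # Bayesian information criterion: n*ln(MSE) + k*ln(n)
    return n * np.log(mse) + k * np.log(n)

def apress(mse, k, n, alpha=2.0):
    # adjustable PRESS: MSE scaled by (1/(1 - alpha*k/n))^2;
    # alpha = 1 recovers generalized cross-validation (GCV)
    return mse / (1.0 - alpha * k / n) ** 2

# toy data: cubic signal plus noise
rng = np.random.default_rng(0)
x = np.linspace(-1, 1, 100)
y = 1.0 + 2.0 * x - 1.5 * x**3 + 0.05 * rng.standard_normal(x.size)

n = x.size
results = {}
for k in range(1, 9):                 # k = number of model terms
    coeffs = np.polyfit(x, y, k - 1)  # degree k-1 polynomial, k terms
    mse = np.mean((np.polyval(coeffs, x) - y) ** 2)
    results[k] = (aic(mse, k, n), bic(mse, k, n), apress(mse, k, n))

best_aic = min(results, key=lambda k: results[k][0])
best_bic = min(results, key=lambda k: results[k][1])
print(best_aic, best_bic)  # minimum adequate polynomial has 4 coefficients (degree 3)
```

Because the penalties differ (2 per term for AIC versus ln N per term for BIC), AIC tends to select larger models than BIC on the same data.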

Cited by 27 publications (9 citation statements) | References 56 publications
“…MSE_root indicates the error value in the correlation analysis. AIC indicates the suitability of the equation for determining the release kinetics. The results of DDSolver processing are presented in Table and Figure .…”
Section: Results
confidence: 99%
“…Rsqr_adj shows the correlation between dissolution time and release of 3CH2Cl. MSE_root determines the correction of the correlation analysis, while the Akaike Information Criterion (AIC) demonstrates the suitability of the equation for determining the release kinetics model. The results of the DDSolver analysis are shown in Table and Figure (for details see Supporting Information Figures S1–S4).…”
Section: Results
confidence: 99%
“…Assume that at the (m−1)th step, a subset D_{m−1}, consisting of (m−1) significant bases, has been selected. The model residual r_n will be used to form a criterion for model selection, and the search procedure will be terminated when the norm of the residual becomes sufficiently small. Finally, a mean square error (MSE) based algorithm, e.g. Akaike's information criterion (AIC), the Bayesian information criterion, generalized cross-validation (GCV) or the adjustable prediction error sum of squares (APRESS), can be used to determine the model size [40].…”
Section: A1 The Forward Orthogonal Regression Procedure
confidence: 99%
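The forward orthogonal regression procedure quoted above can be sketched as a greedy search: at each step the candidate basis that most reduces the residual sum of squares is orthogonalised against the already-selected bases and added, and an MSE-based criterion (AIC here) then fixes the model size. This is a simplified illustration under stated assumptions, not the cited paper's exact procedure; the function name is hypothetical, and SSE reduction stands in for the error-reduction-ratio ranking usually used in this literature:

```python
import numpy as np

def forward_selection_aic(X, y):
    """Greedy forward orthogonal regression with an AIC stopping rule
    (simplified sketch, not the paper's exact algorithm)."""
    n, p = X.shape
    selected, remaining = [], list(range(p))
    residual = y.astype(float).copy()
    Q = np.empty((n, 0))              # orthonormal bases selected so far
    aic_path = []
    for _ in range(p):
        best_j, best_sse, best_q = None, np.inf, None
        for j in remaining:
            # orthogonalise candidate against already-selected bases
            q = X[:, j] - Q @ (Q.T @ X[:, j])
            norm = np.linalg.norm(q)
            if norm < 1e-10:          # candidate lies in current subspace
                continue
            q = q / norm
            sse = np.sum((residual - q * (q @ residual)) ** 2)
            if sse < best_sse:
                best_j, best_sse, best_q = j, sse, q
        if best_j is None:
            break
        selected.append(best_j)
        remaining.remove(best_j)
        Q = np.column_stack([Q, best_q])
        residual = residual - best_q * (best_q @ residual)
        k = len(selected)
        aic_path.append(n * np.log(best_sse / n) + 2 * k)  # AIC at size k
    best_size = int(np.argmin(aic_path)) + 1
    return selected[:best_size], aic_path

# toy demo: y depends only on the x and x^2 columns of a polynomial dictionary
rng = np.random.default_rng(1)
x = np.linspace(-1, 1, 100)
X = np.column_stack([x**i for i in range(6)])
y = 3 * x**2 - x + 0.05 * rng.standard_normal(x.size)
sel, path = forward_selection_aic(X, y)
print(sel)  # expected to start with the x^2 and x columns
```

The orthogonalisation makes each step's SSE reduction independent of the terms already selected, which is what lets the search proceed greedily one basis at a time.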