UKACC International Conference on Control (CONTROL '98) 1998
DOI: 10.1049/cp:19980312
Support vector machines for system identification

Cited by 80 publications (41 citation statements)
References 0 publications
“…Some possible issues with applying SVR to system identification have been discussed (Drezet and Harrison 1998). Although the ε-insensitive loss function gives SVR a means of sparsity control, the final SVR model may still be large despite its use (Drezet and Harrison 1998; Lee and Billings 2002).…”
Section: Support Vector Regression
confidence: 99%
“…Although the ε-insensitive loss function gives SVR a means of sparsity control, the final SVR model may still be large despite its use (Drezet and Harrison 1998; Lee and Billings 2002). There is ongoing research into improving the sparsity of the SVM (Burges 1996; Downs, Gates, and Masters 2001).…”
Section: Support Vector Regression
confidence: 99%
“…In the constrained minimization, kernels corresponding to data points that lie within the error bounds are removed. The support vector regression (SVR) model is formed by the retained kernels [18], and the data points associated with those kernels are referred to as the support vectors (SVs). Since the kernels of the SVR resemble the basis functions of a radial basis function (RBF) network with scatter partitioning, it is shown here that the SVR can be reformulated as an RBF network whose basis functions are normalized so that they form a partition of unity [19].…”
Section: Introduction
confidence: 99%
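The normalization mentioned in that statement can be sketched in a few lines (the centres and width below are illustrative assumptions, not values from the cited work): dividing each Gaussian basis function by the sum of all basis functions at the same input makes the responses sum to one everywhere, i.e. a partition of unity.

```python
# Sketch of normalized RBF basis functions forming a partition of unity.
# Centres and width are assumed for illustration only.
import numpy as np

def normalized_rbf(x, centres, width):
    """Evaluate Gaussian basis functions at each x, then normalize
    across centres so the responses at every input sum to one."""
    phi = np.exp(-((x[:, None] - centres[None, :]) ** 2) / (2.0 * width ** 2))
    return phi / phi.sum(axis=1, keepdims=True)

x = np.linspace(-1.0, 1.0, 50)
centres = np.array([-0.5, 0.0, 0.5])
basis = normalized_rbf(x, centres, width=0.3)

# Each row sums to 1, so the normalized basis is a partition of unity.
print(np.allclose(basis.sum(axis=1), 1.0))  # True
```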
“…The theoretical basis of the SVM is the structural risk minimization principle, which gives it excellent generalization properties. However, it has been shown that the standard SVM technique does not always construct parsimonious models in system identification (Drezet and Harrison, 1998). This shortcoming motivates the exploration of new methods for building parsimonious models within the framework of both the SVM and KM.…”
Section: Introduction
confidence: 99%