2002
DOI: 10.1007/3-540-46084-5_179

Support Vector Method for ARMA System Identification: A Robust Cost Interpretation

Abstract: This paper deals with the application of the Support Vector Method (SVM) methodology to the Auto Regressive and Moving Average (ARMA) linear-system identification problem. The SVM-ARMA algorithm for a single-input single-output transfer function is formulated. The relationship between the SVM coefficients and the residuals, together with the embedded estimation of the autocorrelation function, is presented. Also, the effect of the numerical regularization is used to highlight the robust cost character of this…
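
The abstract describes identifying a single-input single-output transfer function with support vector regression. As a rough illustration of the idea (not the paper's exact SVM-ARMA formulation), the sketch below fits lagged outputs and inputs of a simulated system with a linear ε-insensitive SVR; the "true" system coefficients, lag orders, and hyperparameters are assumptions made only for the example.

```python
import numpy as np
from sklearn.svm import SVR

# Illustrative sketch only: identify a SISO linear system by support vector
# regression on lagged outputs and inputs (an ARX-style surrogate for the
# SVM-ARMA formulation discussed in the paper). The "true" system is assumed.
rng = np.random.default_rng(0)
N = 500
u = rng.standard_normal(N)                       # input sequence
y = np.zeros(N)
for n in range(2, N):
    y[n] = (0.6 * y[n - 1] - 0.3 * y[n - 2]
            + 0.8 * u[n - 1] + 0.2 * u[n - 2]
            + 0.05 * rng.standard_normal())

# Regression matrix of lagged outputs and inputs; the target is the current output.
idx = range(2, N)
X = np.array([[y[n - 1], y[n - 2], u[n - 1], u[n - 2]] for n in idx])
t = np.array([y[n] for n in idx])

# Linear epsilon-insensitive SVR: its weight vector plays the role of the
# robustly estimated model parameters.
model = SVR(kernel="linear", C=10.0, epsilon=0.01).fit(X, t)
print("estimated [a1, a2, b1, b2]:", model.coef_.ravel())
```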

Cited by 43 publications (80 citation statements)
References 5 publications

“…Figure 18 displays the convergence behavior of the state trajectory of the CRNN-GLAD algorithm. We compare the CRNN-GLAD algorithm with the CRNN-LAD algorithm, the ILS algorithm (Zheng 1999), and the SVM algorithm (Rojo-Alvarez et al 2004). The computed results were obtained by averaging 100 independent Monte Carlo simulations.…”
Section: Examples
mentioning
confidence: 99%
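
The quoted experiment averages results over 100 independent Monte Carlo simulations. A generic sketch of that protocol follows; run_trial is a placeholder for whichever identification algorithm is being evaluated and is not taken from the cited works.

```python
import numpy as np

# Generic Monte Carlo protocol as in the quoted experiment: average an error
# metric over 100 independent runs. `run_trial` is a placeholder, not an
# implementation of any of the compared algorithms.
def run_trial(rng):
    # generate data, run the estimator, return an error metric (dummy here)
    return rng.standard_normal() ** 2

def monte_carlo_average(n_runs=100, seed=0):
    rng = np.random.default_rng(seed)
    errors = np.array([run_trial(rng) for _ in range(n_runs)])
    return errors.mean(), errors.std(ddof=1) / np.sqrt(n_runs)

mean_err, sem = monte_carlo_average()
print(f"mean error over 100 runs: {mean_err:.3f} (s.e. {sem:.3f})")
```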
“…In this setting, given observations $y_n$ at time instants $t_n$, $n = 1, \ldots, N$, we map these time instants to a higher-dimensional (possibly infinite-dimensional) feature space by using a nonlinear transformation $\boldsymbol{\phi}(\cdot)$; that is, we consider that $\boldsymbol{\phi}$ maps $t_n \mapsto \boldsymbol{\phi}(t_n)$, where a linear approximation to the data can properly fit the observations as follows: $y_n = \mathbf{w}^\top \boldsymbol{\phi}(t_n) + e_n$ (20), for $n = 1, \ldots, N$. The optimization criterion is in this case given by (21). Note that in this case the regularization term does not refer to the amplitude of the base functions of the model as in (11), but rather to the regression vector $\mathbf{w}$ in the RKHS.…”
Section: Dual Signal Model Algorithm
mentioning
confidence: 99%
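
The passage describes the dual signal model: the time instants themselves are mapped into an RKHS and a linear model is fitted there, so the resulting predictor is a kernel expansion over (time) support vectors. A minimal sketch of this idea follows, using an RBF kernel and hyperparameters chosen only for illustration, not the cited paper's equations (20)-(21).

```python
import numpy as np
from sklearn.svm import SVR

# Dual-signal-model flavour of SVM regression: the time instants t_n are the
# inputs, a kernel implicitly maps them into a feature space, and a linear
# model is fitted there, so the prediction is a kernel expansion over the
# support (time) vectors. Kernel and parameters are illustrative choices.
rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 200)[:, None]            # time instants
y = np.sin(2 * np.pi * 3 * t).ravel() + 0.1 * rng.standard_normal(200)

svr = SVR(kernel="rbf", gamma=50.0, C=10.0, epsilon=0.05).fit(t, y)
y_hat = svr.predict(t)                             # sum_k alpha_k K(t, t_k) + b
print("support vectors retained:", svr.support_.size, "of", t.shape[0])
```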
“…SVMs were originally stated for classification and regression problems. However, the consideration of different signal models (the equation that relates the observations and the data according to a given signal structure for both) has made it possible to extend the formulation of SVM algorithms to a number of digital signal processing problems that are, in essence, very different from a classification or regression model structure [20]-[22]. SVM algorithms exploit the structural risk minimization (SRM) principle to regularize the model, and use the long-established kernel trick to easily build nonlinear models from linear ones [23].…”
mentioning
confidence: 99%
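
The quoted passage invokes the kernel trick to build nonlinear models from linear ones. The short check below (our own illustration, not from the cited papers) verifies numerically that a degree-2 homogeneous polynomial kernel equals the inner product after an explicit quadratic feature map, which is why a linear algorithm that only uses inner products never needs the mapped vectors.

```python
import numpy as np

# Kernel trick in miniature (own illustration): the degree-2 homogeneous
# polynomial kernel k(x, z) = (x.z)^2 equals the ordinary inner product after
# the explicit feature map phi(x) = (x1^2, sqrt(2)*x1*x2, x2^2), so a linear
# method that only needs inner products can work in the mapped space while
# evaluating nothing but k(x, z).
def phi(x):
    return np.array([x[0] ** 2, np.sqrt(2) * x[0] * x[1], x[1] ** 2])

x = np.array([1.0, 2.0])
z = np.array([0.5, -1.5])

k_xz = (x @ z) ** 2          # kernel evaluation, no explicit mapping
dot_phi = phi(x) @ phi(z)    # inner product in the explicit feature space
assert np.isclose(k_xz, dot_phi)
print(k_xz, dot_phi)         # both 6.25
```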
“…In addition, several signal-processing problems have been specifically formulated from the SVM framework, such as regression [7], non-parametric spectral analysis [10], and auto-regressive moving average (ARMA) system identification [11]. The SVM allows us to control the robustness of time-series modelling when outliers are present, when few data samples are available, or when the assumed model does not accurately match the underlying system.…”
Section: Introduction
mentioning
confidence: 99%
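
The passage emphasizes robustness to outliers. As a toy comparison (the data, outlier model, and hyperparameters are assumptions, not from the cited works), the sketch below estimates an AR(1) coefficient by least squares and by linear SVR on a contaminated series; the ε-insensitive cost grows only linearly beyond ε and so typically limits the pull of the gross errors.

```python
import numpy as np
from sklearn.svm import SVR

# Toy robustness comparison (assumed data and parameters): estimate the AR(1)
# coefficient of a series contaminated with a few impulsive outliers, once by
# least squares and once by linear epsilon-insensitive SVR.
rng = np.random.default_rng(2)
N, a_true = 300, 0.7
y = np.zeros(N)
for n in range(1, N):
    y[n] = a_true * y[n - 1] + 0.1 * rng.standard_normal()
y[rng.choice(N, size=5, replace=False)] += 10.0    # inject gross outliers

X, t = y[:-1, None], y[1:]
a_ls = np.linalg.lstsq(X, t, rcond=None)[0][0]
a_svr = SVR(kernel="linear", C=1.0, epsilon=0.01).fit(X, t).coef_[0, 0]
print(f"true a = {a_true}, least squares = {a_ls:.3f}, SVR = {a_svr:.3f}")
```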