2021
DOI: 10.1155/2021/6675218
An Efficient Feature Weighting Method for Support Vector Regression

Abstract: Support vector regression (SVR) is a powerful kernel-based method that has been successfully applied to regression problems. Feature-weighted SVR algorithms take each feature's contribution to the model output into account. However, model performance depends on the feature weights, and training is time-consuming. In this paper, an efficient feature-weighted SVR is proposed. Firstly, the value constraint of each weight is obtained according to the maximal information coefficient whi…
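The core idea of feature weighting in kernel methods can be illustrated with a per-feature weighted RBF kernel, where each squared feature difference is scaled by its weight before being passed to the exponential. This is a minimal sketch of the general technique, not the paper's specific algorithm; the function name and parameters are illustrative assumptions.

```python
import math

def weighted_rbf_kernel(x, z, weights, gamma=1.0):
    """Hypothetical feature-weighted RBF kernel (illustrative sketch):
    k(x, z) = exp(-gamma * sum_j w_j * (x_j - z_j)^2).
    A larger w_j makes feature j more influential; w_j = 0
    removes that feature's influence entirely."""
    sq_dist = sum(w * (a - b) ** 2 for w, a, b in zip(weights, x, z))
    return math.exp(-gamma * sq_dist)

# Identical points give a kernel value of 1 regardless of the weights.
print(weighted_rbf_kernel([1.0, 2.0], [1.0, 2.0], [0.5, 2.0]))  # 1.0

# With weight 0 on the second feature, differences there are ignored,
# so these two points still look identical to the kernel.
print(weighted_rbf_kernel([1.0, 2.0], [1.0, 9.0], [1.0, 0.0]))  # 1.0
```

In a feature-weighted SVR, a kernel of this form replaces the standard kernel, and the weights are fixed by some criterion (here, per the abstract, constrained via the maximal information coefficient) before or during training.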

Cited by 1 publication (1 citation statement). References 18 publications.
“…Support Vector Regression (SVR) is a generalization of Support Vector Machine (SVM) that incorporates regression functions into SVM to solve regression problems [ 39 , 40 ]. As a supervised machine learning algorithm, SVR has a high capability in regression modeling [ 41 , 42 ]. SVR is a kernel-based technique in which the kernel function projects the input data into higher-dimensional feature space to find the hyperplane with the lowest error margin and the best fit to the regression line [ 43 , 44 ].…”
Section: Introduction
confidence: 99%
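The epsilon-insensitive loss mentioned in the citing passage — the mechanism by which SVR tolerates small deviations around the fitted function — can be sketched in a few lines. This is a generic illustration of standard SVR loss, not code from either paper; the function name and the default epsilon are assumptions.

```python
def epsilon_insensitive_loss(y_true, y_pred, epsilon=0.1):
    """SVR's epsilon-insensitive loss: deviations that fall inside the
    epsilon tube around the regression function incur zero penalty;
    only the excess beyond epsilon is penalized linearly."""
    return max(0.0, abs(y_true - y_pred) - epsilon)

# A prediction within the tube costs nothing.
print(epsilon_insensitive_loss(1.0, 1.05))  # 0.0

# A prediction outside the tube is penalized only for the excess.
print(epsilon_insensitive_loss(1.0, 1.30))  # roughly 0.2
```

Minimizing this loss (plus a regularization term) over a kernel feature space is what yields the flat hyperplane with the lowest error margin that the passage describes.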