Quadratic kernel-free non-linear support vector machine (2007)
DOI: 10.1007/s10898-007-9162-0

Cited by 75 publications (43 citation statements) | References 9 publications
“…In this section, we investigate the credit scoring performance of the proposed fuzzy non-kernel QSSVM model on some real-world credit data sets. For comparison, we test three groups of credit scoring methods. The first group includes commonly used non-SVM methods: logistic regression (denoted "LOG REG") and feed-forward backpropagation neural networks (denoted "FFBP NN"). The second group includes commonly used kernel-based SVM methods: the soft SVM model with Gaussian kernel ("SVM GausKer"), the weighted 2-norm SVM model with Gaussian or quadratic kernel [27] ("W2NSVM GausKer" or "W2NSVM QuadKer"), the well-known FSVM model with within-class scatter and Gaussian kernel [1] ("FSVMWCS GausKer"), and the clustered SVM method [11] ("Clu SVM"). The third group includes two non-kernel SVM models: Dagher's QSVM model [6] and the soft QSSVM model [17]. For all SVM models, a grid search is used to find the best penalty parameter η as follows: log₂ η ∈ {2, 3, .…”
Section: 4
confidence: 99%
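The excerpt above describes tuning the penalty parameter η over powers of two, log₂ η ∈ {2, 3, …}. A minimal sketch of that grid-search step, assuming a hypothetical `train_and_score` stand-in for fitting any of the compared SVM variants and returning validation accuracy (the peaked placeholder objective is purely for illustration, not from the cited work):

```python
import math

def train_and_score(eta):
    # Hypothetical placeholder for "fit an SVM with penalty eta and
    # return validation accuracy"; it peaks at eta = 2**5 purely so
    # the sketch has a well-defined winner.
    return 1.0 / (1.0 + abs(math.log2(eta) - 5))

def grid_search_penalty(exponents, score_fn):
    """Return (best_eta, best_score) over eta = 2**k for k in exponents."""
    best_eta, best_score = None, float("-inf")
    for k in exponents:
        eta = 2.0 ** k
        score = score_fn(eta)
        if score > best_score:
            best_eta, best_score = eta, score
    return best_eta, best_score

# Sweep log2(eta) over {2, 3, ..., 10}, as in the excerpt's notation.
best_eta, best_score = grid_search_penalty(range(2, 11), train_and_score)
```

In practice the score function would wrap cross-validated training of each model in the comparison; only the sweep structure is taken from the excerpt.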
“…Generally, the SVM separates the extracted feature sets into two classes by finding an optimal hyperplane. A study by Dagher [42] presented a quadratic kernel-free non-linear SVM, which was used in this research. The quadratic function is utilised to split the feature sets non-linearly, as described in [42].…”
Section: Support Vector Machine
confidence: 99%
“…Furthermore, various output codes are examined to solve the multiclass categorisation problem [43].…”
Section: Support Vector Machine
confidence: 99%
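The output-code approach mentioned above reduces a multiclass problem to several binary classifiers: each class gets a binary codeword, one classifier is trained per code bit, and a sample is assigned to the class whose codeword is nearest in Hamming distance to the classifier outputs. A minimal sketch of the decoding step, with an illustrative codebook not taken from the cited work:

```python
# Hypothetical 4-bit codebook: one codeword per class, one binary
# classifier per bit position. Values are illustrative only.
CODEBOOK = {
    "class_a": (0, 0, 1, 1),
    "class_b": (0, 1, 0, 1),
    "class_c": (1, 0, 0, 1),
}

def decode(bits):
    """Assign the class whose codeword is nearest (Hamming) to bits."""
    def hamming(codeword):
        return sum(p != q for p, q in zip(codeword, bits))
    return min(CODEBOOK, key=lambda cls: hamming(CODEBOOK[cls]))

# An exact match decodes directly; a corrupted bit vector still decodes
# to the closest codeword, which is what gives output codes their
# error-correcting behaviour.
label_exact = decode((0, 1, 0, 1))
label_noisy = decode((1, 0, 0, 0))
```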
“…To take advantage of the idea of SVM while avoiding the challenges of the nonlinear kernel trick, a kernel-free quadratic surface SVM (QSSVM) model was proposed by Dagher [7]. Luo et al. recently developed its extension, the so-called soft margin quadratic surface SVM (SQSSVM), which accounts for noise and outliers [21].…”
confidence: 99%
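The kernel-free idea in the excerpts above is that, instead of mapping points through a kernel, the classifier learns a quadratic decision function f(x) = ½ xᵀAx + bᵀx + c directly and labels a point by sign(f(x)). A minimal sketch of evaluating such a surface, assuming illustrative (not fitted) parameters A, b, c:

```python
import numpy as np

def quadratic_decision(x, A, b, c):
    """Evaluate f(x) = 0.5 * x^T A x + b^T x + c for one sample x."""
    x = np.asarray(x, dtype=float)
    return 0.5 * x @ A @ x + b @ x + c

# Illustrative quadratic surface: a circle of radius 1 around the
# origin separates the classes (f > 0 outside, f < 0 inside).
A = np.eye(2)      # quadratic term
b = np.zeros(2)    # linear term
c = -0.5           # offset, so the boundary is ||x||^2 = 1

inside = quadratic_decision([0.2, 0.1], A, b, c)   # negative: inside the circle
outside = quadratic_decision([2.0, 0.0], A, b, c)  # positive: outside the circle
```

Fitting A, b, c (e.g. by the optimization problems in Dagher's QSSVM or Luo et al.'s soft-margin SQSSVM) is omitted; only the shape of the decision function is sketched here.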