2019
DOI: 10.1016/j.neucom.2018.11.099
Cost-sensitive support vector machines

Cited by 157 publications (60 citation statements)
References 20 publications
“…Precisely, cost-sensitive learning methods introduce a misclassification cost so that classification errors in the different classes are weighted differently [78]. In particular, in this paper we applied the cost-sensitive SVM, also known as the "biased penalties" SVM [81]. More in detail, within the SVM optimization problem, two regularization parameters, θ+ and θ−, were introduced to adjust the misclassification cost of negative and positive samples.…”
Section: kNN Classifiers and Support Vector Machines for Prediction
confidence: 99%
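The biased-penalties idea quoted above can be sketched as follows. This is a minimal illustration, assuming scikit-learn's `SVC`, whose `class_weight` argument rescales the per-class penalty C and thus plays the role of the two regularization parameters θ+ and θ−; the toy data and the weight values are invented for the example.

```python
# Cost-sensitive ("biased penalties") SVM sketch: a heavier penalty
# on minority-class errors via per-class weights on C.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Imbalanced toy data: 90 negative samples, 10 positive samples.
X_neg = rng.normal(loc=-1.0, size=(90, 2))
X_pos = rng.normal(loc=+1.0, size=(10, 2))
X = np.vstack([X_neg, X_pos])
y = np.array([0] * 90 + [1] * 10)

theta_minus, theta_plus = 1.0, 9.0  # heavier penalty on positive-class errors
clf = SVC(kernel="linear", C=1.0,
          class_weight={0: theta_minus, 1: theta_plus})
clf.fit(X, y)
print(clf.score(X, y))
```

Increasing θ+ relative to θ− pushes the separating surface away from the minority class, trading majority-class accuracy for minority-class recall.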
“…c) the data unbalancing. To cope with this problem we test two different techniques: the first is the training-set subsampling described in Subsection b), while the second is the use of cost-sensitive support vector machines [81] (see Subsection III.A.3). d) the choice of the number of neighbors to be considered.…”
Section: Missing Data Imputation
confidence: 99%
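The first of the two re-balancing techniques mentioned above, training-set subsampling, might look like the following minimal sketch; the function name, labels, and sizes are illustrative assumptions, not the cited authors' implementation.

```python
# Random majority-class subsampling: drop majority samples until the
# two classes have equal size.
import numpy as np

def subsample_majority(X, y, majority_label, seed=None):
    """Return a balanced (X, y) by randomly keeping only as many
    majority-class samples as there are minority-class samples."""
    rng = np.random.default_rng(seed)
    maj_idx = np.flatnonzero(y == majority_label)
    min_idx = np.flatnonzero(y != majority_label)
    keep = rng.choice(maj_idx, size=min_idx.size, replace=False)
    sel = np.concatenate([keep, min_idx])
    return X[sel], y[sel]

X = np.arange(200).reshape(100, 2)
y = np.array([0] * 90 + [1] * 10)
Xb, yb = subsample_majority(X, y, majority_label=0, seed=0)
print(np.bincount(yb))  # -> [10 10]
```

Unlike the cost-sensitive SVM, subsampling discards majority-class information, which is why the cited work tests both techniques.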
“…Machine learning (ML) has been widely applied to solve supervised and unsupervised problems. ML deploys different algorithms, such as online learning, multi-task learning, and supervised algorithms, including rule-based [8,9], function-based [10,11], lazy [12], and bootstrap [13]. Some of them are used to transform data (a typical example being dimension reduction for optimization), some to build classifiers, as supervised algorithms do, and others for prediction, such as regression.…”
Section: Machine Learning: Challenges and Drawbacks
confidence: 99%
“…Equation (6) is the Euclidean distance between samples. Equation (7) represents the sum of Euclidean distances from one sample to the other samples, and (8) and (9) represent the first and the second initial cluster center, respectively. Specifically, (10) is a recursive formula for the initial cluster centers, where k represents the number of clusters.…”
Section: Optimize Cluster Center Selection
confidence: 99%
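The distance-based initialization described above might be sketched as follows. Since the recursion in (10) is not reproduced in the citation statement, this assumes one common reading: the first center is the sample minimizing the total Euclidean distance to all samples, and each subsequent center is the sample farthest from those already chosen (a farthest-point heuristic); the function and data are hypothetical.

```python
# Distance-based initial cluster-center selection (assumed reading of
# Eqs. (6)-(10)): central first center, then farthest-point picks.
import numpy as np

def initial_centers(X, k):
    # Pairwise Euclidean distances between samples, as in Eq. (6).
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    # First center: sample with minimal total distance, cf. Eqs. (7)-(8).
    centers = [int(np.argmin(d.sum(axis=1)))]
    # Remaining centers: sample farthest from the chosen set, cf. Eq. (10).
    while len(centers) < k:
        centers.append(int(np.argmax(d[:, centers].min(axis=1))))
    return X[centers]

X = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [5.2, 5.1]])
print(initial_centers(X, 2))
```

On this toy data the two chosen centers fall in the two well-separated groups, which is the behavior such initializations aim for.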
“…In addition to the resampling method, some scholars have proposed effective methods at the algorithm level. In [9], a cost-sensitive learning algorithm, CS-SVM, was proposed; it integrates the idea of cost-sensitive learning into the SVM classifier and optimizes the maximum-margin separating surface of the SVM for the different classes. The samples are assigned different penalty factors, which improves the SVM's ability to classify unbalanced data.…”
Section: Introduction
confidence: 99%
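One common way to derive the per-class penalty factors mentioned above is the "balanced" heuristic C_j = n / (k · n_j), which gives the minority class a proportionally heavier penalty; this sketch is an illustration of that heuristic, not the scheme of [9].

```python
# Derive per-class penalty factors inversely proportional to class
# frequency: weight_j = n_samples / (n_classes * count_j).
import numpy as np

def penalty_factors(y):
    classes, counts = np.unique(y, return_counts=True)
    weights = y.size / (classes.size * counts)
    return dict(zip(classes.tolist(), weights.tolist()))

y = np.array([0] * 90 + [1] * 10)
print(penalty_factors(y))  # -> {0: 0.555..., 1: 5.0}
```

The resulting dictionary can be passed directly as the per-class weights of a cost-sensitive classifier, e.g. as the `class_weight` of scikit-learn's `SVC`.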