2010 IEEE International Conference on Data Mining Workshops
DOI: 10.1109/icdmw.2010.57
From Convex to Nonconvex: A Loss Function Analysis for Binary Classification

Abstract: Problems of data classification can be studied in the framework of regularization theory as ill-posed problems. In this framework, loss functions play an important role in the application of regularization theory to classification. In this paper, we review some important convex loss functions, including hinge loss, square loss, modified square loss, exponential loss, logistic regression loss, as well as some non-convex loss functions, such as sigmoid loss, ϕ-loss, ramp loss, normalized sigmoid loss, and the lo…
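The losses named in the abstract are all functions of the margin m = y·f(x). As a minimal sketch, the standard textbook definitions of several of them can be written as follows; these are the usual formulations, not formulas taken from the paper itself:

```python
import math

# Margin-based losses, m = y * f(x). Standard definitions; the paper's
# own parameterizations may differ.
def hinge(m):            # convex: max(0, 1 - m)
    return max(0.0, 1.0 - m)

def square(m):           # convex: (1 - m)^2
    return (1.0 - m) ** 2

def modified_square(m):  # convex: max(0, 1 - m)^2
    return max(0.0, 1.0 - m) ** 2

def exponential(m):      # convex: e^{-m}
    return math.exp(-m)

def logistic(m):         # convex: log(1 + e^{-m})
    return math.log(1.0 + math.exp(-m))

def sigmoid_loss(m):     # nonconvex: 1 / (1 + e^{m})
    return 1.0 / (1.0 + math.exp(m))

def ramp(m, s=-1.0):     # nonconvex: hinge clipped at 1 - s
    return min(hinge(m), 1.0 - s)

# Convex losses grow without bound on badly misclassified points
# (m << 0), while the nonconvex ramp and sigmoid losses saturate:
for m in (-2.0, 0.0, 2.0):
    print(m, hinge(m), sigmoid_loss(m), ramp(m))
```

The saturation of the nonconvex losses is what makes them more robust to outliers, at the cost of a harder (nonconvex) optimization problem.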

Cited by 36 publications (30 citation statements)
References 19 publications
“…It is possible to first design a nice Fisher consistent loss function and then construct the corresponding classifier [14]. [13] constructed a Fisher consistent smoothed 0-1 loss function for binary classification, built the corresponding classification algorithms, and the experimental results show some good properties of the new algorithm. Following the same line, we will extend this work to multiclass classification problems.…”
mentioning
confidence: 95%
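The smoothed 0-1 loss mentioned in [13] can be illustrated with a sigmoid approximation of the step function; the steepness parameter k below is a hypothetical choice for illustration, not the construction used in the cited work:

```python
import math

# One common way to smooth the 0-1 loss 1[m <= 0]: a steep sigmoid
# in the margin m = y * f(x). The steepness k is illustrative only;
# [13] uses its own construction.
def smoothed_01(m, k=10.0):
    return 1.0 / (1.0 + math.exp(k * m))

# As k grows, the approximation sharpens toward the true 0-1 loss:
# far on the wrong side of the boundary it approaches 1, far on the
# right side it approaches 0, and it equals 0.5 exactly at m = 0.
```

Because this surrogate is smooth and bounded, gradient-based nonconvex solvers can be applied to it directly, which is what motivates pairing such losses with specialized optimizers.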
“…It has been demonstrated that the Quasi Secant Method (QSM) [18] outperforms some traditional local optimization methods for the binary classification problem [13]. In this algorithm, QSM is adopted as the optimization solver.…”
Section: Single Machine Approach for Multiclass Data Classification
mentioning
confidence: 99%