2006 · DOI: 10.1109/tnn.2006.873281

A geometric approach to Support Vector Machine (SVM) classification

Abstract: The geometric framework for the support vector machine (SVM) classification problem provides an intuitive ground for the understanding and the application of geometric optimization algorithms, leading to practical solutions of real-world classification problems. In this work, the notion of "reduced convex hull" is employed and supported by a set of new theoretical results. These results allow existing geometric algorithms to be directly and practically applied to solve not only separable, but also non…
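
To make the geometric framework summarised in the abstract concrete, the sketch below finds the nearest points of the two classes' reduced convex hulls with a generic solver; the maximum-margin hyperplane is then the perpendicular bisector of the segment joining them. This is only an illustration under assumed function names, solver choice, and toy data, not the algorithm proposed in the paper.

```python
# Minimal sketch (assumed helper names and toy data, not the authors' algorithm):
# the maximum-margin separator is the perpendicular bisector of the segment
# joining the nearest points of the two classes' reduced convex hulls.
import numpy as np
from scipy.optimize import minimize

def nearest_rch_points(Xp, Xn, mu=1.0):
    """Nearest points of the reduced convex hulls of Xp and Xn.

    RCH(X, mu) = { sum_i a_i x_i : sum_i a_i = 1, 0 <= a_i <= mu };
    mu = 1 gives the ordinary convex hulls (separable case).
    """
    n_pos, n_neg = len(Xp), len(Xn)

    def objective(z):
        a, b = z[:n_pos], z[n_pos:]
        diff = Xp.T @ a - Xn.T @ b
        return diff @ diff                      # squared distance between hull points

    cons = [{"type": "eq", "fun": lambda z: z[:n_pos].sum() - 1.0},
            {"type": "eq", "fun": lambda z: z[n_pos:].sum() - 1.0}]
    bounds = [(0.0, mu)] * (n_pos + n_neg)
    z0 = np.concatenate([np.full(n_pos, 1.0 / n_pos), np.full(n_neg, 1.0 / n_neg)])
    res = minimize(objective, z0, method="SLSQP", bounds=bounds, constraints=cons)
    a, b = res.x[:n_pos], res.x[n_pos:]
    return Xp.T @ a, Xn.T @ b

# Toy data: the hyperplane w.x + b = 0 bisects the segment between the nearest points.
Xp = np.array([[2.0, 2.0], [3.0, 3.0], [2.5, 1.5]])
Xn = np.array([[0.0, 0.0], [1.0, 0.5], [0.5, 1.0]])
p_plus, p_minus = nearest_rch_points(Xp, Xn, mu=1.0)
w = p_plus - p_minus
b = -w @ (p_plus + p_minus) / 2.0
```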

Cited by 323 publications (159 citation statements; citing works published 2009–2023) · References 16 publications

“…Constructing arrangements is, in turn, an important building block for a variety of algorithms in the field of computational geometry and the optimality of the corresponding algorithm "relies heavily on a nontrivial combinatorial fact" [45,46]. We believe that the derivations provided in this chapter will stimulate the collaboration between both the field of machine learning and the field of computational geometry with respect to these problems (similar to the results [9,12,41,102,150] for standard support vector machines). Note that a comparable research direction exists for the k-means algorithm, which aims at obtaining good candidate solutions for its associated combinatorial optimization task (2.8).…”
Section: Discussion (mentioning)
confidence: 60%
“…An interesting future research direction is the question of whether one can obtain exact solutions with infinite accuracy in polynomial time. Note that for hard-margin unsupervised support vector machines, this is the case due to a connection between hard-margin support vector machines and convex hulls [12,70,102,116]. For the soft-margin case, a similar connection to the field of computational geometry is given via the concept of reduced convex hulls [102].…”
Section: Exact Solutions in Less Time (mentioning)
confidence: 99%
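
The "exact solutions" remark above can be illustrated in two dimensions: for linearly separable point sets, the distance between the convex hulls is attained at a vertex of one hull and a point on an edge of the other, so a finite search over point-segment pairs recovers it exactly. The brute-force sketch below uses assumed helper names and toy data.

```python
# Brute-force 2-D sketch of the convex-hull connection (illustrative names and data):
# for separable point sets the hull distance is attained at a vertex of one hull
# and a point on an edge of the other, so a finite search over point-segment
# pairs finds it exactly; the hard margin is half this distance.
import itertools
import numpy as np

def point_segment_dist(p, a, b):
    """Distance from point p to the segment [a, b] (degenerate segments allowed)."""
    ab = b - a
    denom = ab @ ab
    t = 0.0 if denom == 0.0 else float(np.clip((p - a) @ ab / denom, 0.0, 1.0))
    return float(np.linalg.norm(p - (a + t * ab)))

def hull_distance(A, B):
    """Distance between conv(A) and conv(B) for linearly separable 2-D sets."""
    best = np.inf
    # every hull vertex/edge is a point/segment of the raw set, so this covers the optimum
    for P, Q in ((A, B), (B, A)):
        for p in P:
            for a, b in itertools.combinations_with_replacement(Q, 2):
                best = min(best, point_segment_dist(p, a, b))
    return best

A = np.array([[2.0, 2.0], [3.0, 3.0], [2.5, 1.5]])
B = np.array([[0.0, 0.0], [1.0, 0.5], [0.5, 1.0]])
print(hull_distance(A, B) / 2.0)   # maximum achievable (hard) geometric margin
```
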
“…The reduction factor can be set to different sizes with the constraint that it is less than 1 [15]. It can be seen that the smaller the reduction factor, the smaller the size of the SCH S(X, ·). It is proved that no matter how the reduction factor changes, the SCH has the same geometric shape as the original convex hull, which is why it is called the scaled convex hull.…”
Section: Theory, 2.1 SCH (mentioning)
confidence: 99%
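
The shrinking behaviour described in this excerpt is easy to visualise: each point is moved toward the class centre by the reduction factor, so the hull keeps its shape while its size decreases. The sketch below assumes the class mean as the centre and uses illustrative names.

```python
# Minimal sketch of the scaling described above; using the class mean as the
# centre is an assumption made for illustration.
import numpy as np

def scaled_hull_points(X, lam):
    """Shrink each point of X toward the centroid by a factor lam (0 < lam <= 1)."""
    centre = X.mean(axis=0)
    return lam * X + (1.0 - lam) * centre

X = np.array([[0.0, 0.0], [2.0, 0.0], [1.0, 2.0]])
for lam in (1.0, 0.5, 0.1):
    # smaller lam -> smaller hull with the same shape, shrunk toward the centre
    print(lam, scaled_hull_points(X, lam))
```
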
“…In linear SVM, the optimal hyper-plane is the one that minimizes the classification error and maximizes the geometric margin. The geometric margin represents the minimum distance of the training samples of both classes from the separating hyper-plane [5]. • Neural Network Classifiers: a neural network (NN) classifier consists of units arranged in layers.…”
Section: Introduction (mentioning)
confidence: 99%
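
The geometric margin referred to in this excerpt is the smallest distance of any training sample from the separating hyperplane, i.e. min_i y_i(w·x_i + b)/||w||. A minimal sketch with illustrative names and toy data:

```python
# Minimal sketch of the geometric margin (illustrative names and toy data):
# the smallest signed distance y_i * (w.x_i + b) / ||w|| over the training set.
import numpy as np

def geometric_margin(w, b, X, y):
    """Minimum of y_i * (w @ x_i + b) / ||w|| over all training samples."""
    return float(np.min(y * (X @ w + b)) / np.linalg.norm(w))

X = np.array([[2.0, 2.0], [3.0, 3.0], [0.0, 0.0], [1.0, 0.5]])
y = np.array([1, 1, -1, -1])
w, b = np.array([1.0, 1.0]), -2.75
print(geometric_margin(w, b, X, y))   # positive iff the hyperplane separates the data
```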