2016 23rd International Conference on Pattern Recognition (ICPR)
DOI: 10.1109/icpr.2016.7899670

One-class slab support vector machine

Abstract: This work introduces the one-class slab SVM (OC-SSVM), a one-class classifier that aims at improving the performance of the one-class SVM. The proposed strategy reduces the false positive rate and increases the accuracy of detecting instances from novel classes. To this end, it uses two parallel hyperplanes to learn the normal region of the decision scores of the target class. The OC-SSVM extends the one-class SVM since it can scale and learn non-linear decision functions via kernel methods. The experiments on…
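
As a rough illustration of the slab idea described in the abstract, the sketch below bounds the decision scores of a standard one-class SVM between two parallel thresholds estimated from the training scores. This is only a hedged approximation built on scikit-learn's OneClassSVM, not the OC-SSVM optimization proposed in the paper; the kernel parameters, quantile cut-offs, and synthetic data are assumptions chosen for illustration.

```python
# Hedged sketch only: approximates the "slab" idea by bounding the decision
# scores of a standard one-class SVM between two parallel thresholds. It is
# NOT the OC-SSVM formulation from the paper; kernel parameters, quantile
# cut-offs, and the synthetic data are illustrative assumptions.
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
X_train = rng.normal(size=(500, 2))                      # target-class samples only
X_test = np.vstack([rng.normal(size=(50, 2)),            # more targets
                    rng.uniform(-6, 6, size=(50, 2))])   # potential novelties

ocsvm = OneClassSVM(kernel="rbf", gamma=0.5, nu=0.1).fit(X_train)

# Learn a "normal region" of decision scores with two parallel bounds.
scores_train = ocsvm.decision_function(X_train)
lower, upper = np.quantile(scores_train, [0.05, 0.95])   # assumed cut-offs

# Accept a test point as the target class only if its score lies inside the slab.
scores_test = ocsvm.decision_function(X_test)
is_target = (scores_test >= lower) & (scores_test <= upper)
print(f"accepted as target: {is_target.sum()} / {len(X_test)}")
```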

Cited by 7 publications (4 citation statements) | References 22 publications

“…We adopted the one vs rest [19] strategy, which consists of considering a class c_i ∈ C as a target one and the others as outliers [29, 27, 19, 30, 31, 32]. In this case, the outliers injected in a given data subset were randomly picked among the representatives of the outlier classes, i.e., C\c_i.…”

Table 2: Benchmark datasets [33, 34]
Dataset      Classes  Features  Instances
Australian   2        14        690
Diabetes     2        8         268
Ionosphere   2        34        351
Iris         3        4         150
Satimage     6        36        4435
Segment      7        19        2310

Section: Benchmark Datasets
confidence: 99%
“…[30, 31, 32]. In this case, the outliers injected in a given data subset were randomly picked among the representatives of the outlier classes, i.e., C\c_i.…”
confidence: 99%
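
The one-vs-rest protocol quoted above can be sketched as follows; the dataset (Iris) and the 10% outlier-injection ratio are assumptions for illustration, not values taken from the cited papers.

```python
# Hedged sketch of the one-vs-rest evaluation protocol quoted above: each class
# c_i is treated in turn as the target class, and outliers are injected by
# sampling from the remaining classes C \ c_i. Dataset and injection ratio are
# assumptions, not values from the cited papers.
import numpy as np
from sklearn.datasets import load_iris

X, y = load_iris(return_X_y=True)
rng = np.random.default_rng(0)
outlier_fraction = 0.1  # assumed ratio of injected outliers

for target_class in np.unique(y):
    X_target = X[y == target_class]          # the target class c_i
    X_rest = X[y != target_class]            # outlier pool C \ c_i
    n_outliers = int(outlier_fraction * len(X_target))
    idx = rng.choice(len(X_rest), size=n_outliers, replace=False)
    X_eval = np.vstack([X_target, X_rest[idx]])
    y_eval = np.r_[np.ones(len(X_target)), -np.ones(n_outliers)]
    print(f"class {target_class}: {len(X_target)} targets, {n_outliers} injected outliers")
```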
“…We first present a simplified variant of this objective by using two different (sets of) hyperplanes, dubbed Basic One-class Discriminative Subspaces (BODS), that can sandwich the labeled data by bounding from different sides; these hyperplanes are independently parameterized and thus can be oriented differently to better fit to the labeled data. Note that there is a similar prior work, termed Slab-SVM [18], that learns two hyperplanes for one-class classification. However, these hyperplanes are constrained to have the same slope, which we do not impose in our BODS model, as a result, our model is more general than Slab-SVM.…”
Section: Background and Related Work
confidence: 99%
“…2(a)); these hyperplanes are independently parameterized and bound the data distribution. Note that there is a similar prior work, termed Slab-SVM [89], that learns two hyperplanes for one-class classification. However, their hyperplanes are constrained to have the same slope, which we do not impose in our BODS model; as a result, our model is more flexible than Slab-SVM.…”
Section: Introduction
confidence: 99%
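
To make the distinction in the two statements above concrete, the toy sketch below contrasts a parallel slab rule (one weight vector, two offsets) with two independently parameterized hyperplanes. The weights, offsets, and test point are made-up values, and both decision rules are assumed simplifications of the cited models, not their actual formulations.

```python
# Hedged toy illustration: a parallel slab uses ONE weight vector w with two
# offsets, while a BODS-style region uses TWO independently parameterized
# hyperplanes whose slopes may differ. All values are made up; the rules are
# assumed simplifications, not the cited formulations.
import numpy as np

x = np.array([0.3, -1.2])  # a test point

# Parallel-slab rule: rho_low <= <w, x> <= rho_high (same slope on both sides).
w = np.array([1.0, 0.5])
rho_low, rho_high = -0.5, 1.5
score = float(w @ x)
in_slab = rho_low <= score <= rho_high

# Two independent hyperplanes: <w1, x> >= b1 and <w2, x> <= b2 (slopes may differ).
w1, b1 = np.array([1.0, 0.5]), -0.5
w2, b2 = np.array([0.8, -0.2]), 1.5
in_bods_region = (float(w1 @ x) >= b1) and (float(w2 @ x) <= b2)

print(f"inside slab: {in_slab}, inside BODS-style region: {in_bods_region}")
```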