2009
DOI: 10.1016/j.neucom.2008.05.003
Minimum spanning tree based one-class classifier

Cited by 122 publications (80 citation statements). References 26 publications.
“…The details of the datasets are given in Table 1. The proposed method is compared with the method described by Kim et al. [6] and with one-class classifiers such as Gaussian Data Description (GAUSSDD) [12], k-Means Data Description (KMEANSDD) [12], Minimum Spanning Tree Data Description (MSTDD) [13], k-Nearest Neighbor Data Description (KNNDD) [14], Support Vector Data Description (SVDD) [15], Linear Programming Data Description (LPDD) [16] and Minimax Probability Machine (MPMDD) [17]. For the method of [6], we use the word score obtained from finding the best match for outlier detection, and SEG_T is assigned integer values in the range [5, 15].…”
Section: Methods
confidence: 99%
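Several of the data descriptions listed above share the same template: score a test point by its distance to the target training set and threshold that score. As a concrete illustration, the sketch below follows the common formulation of a k-nearest-neighbour data description (here with k = 1), in which a test point is accepted when its distance to the nearest training point, relative to that point's own nearest-neighbour distance, stays below a threshold. It is a minimal reading of the idea, not the cited implementations; the class name and default threshold are illustrative assumptions.

```python
import numpy as np
from scipy.spatial import cKDTree

class KNNDataDescription:
    """Minimal 1-NN data description (KNNDD-style) sketch.

    A test point is accepted as 'target' when the distance to its nearest
    training point, divided by that training point's own nearest-neighbour
    distance, does not exceed a threshold.
    """

    def __init__(self, threshold=1.0):
        self.threshold = threshold  # illustrative default, not from the cited work

    def fit(self, X_target):
        self.X_ = np.asarray(X_target, dtype=float)
        self.tree_ = cKDTree(self.X_)
        # Distance of every training point to its own nearest neighbour
        # (k=2 because the closest hit is the point itself).
        d, _ = self.tree_.query(self.X_, k=2)
        self.nn_dist_ = d[:, 1]
        return self

    def decision_scores(self, X):
        d, idx = self.tree_.query(np.asarray(X, dtype=float), k=1)
        # Ratio of the test-to-training distance over the local NN distance.
        return d / np.maximum(self.nn_dist_[idx], 1e-12)

    def predict(self, X):
        # +1 = accepted as target, -1 = rejected as outlier
        return np.where(self.decision_scores(X) <= self.threshold, 1, -1)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    targets = rng.normal(0.0, 1.0, size=(200, 2))    # one-class training data
    outliers = rng.uniform(-6.0, 6.0, size=(20, 2))  # points far from the target class
    clf = KNNDataDescription(threshold=2.0).fit(targets)
    print("target acceptance rate :", (clf.predict(targets) == 1).mean())
    print("outlier acceptance rate:", (clf.predict(outliers) == 1).mean())
```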
“…Gaussian model (Gauss), Mixture of Gaussians (MoG) and Parzen Density Estimation (PDE) have been chosen as the density methods. The boundary methods selected are k-centers (kC), k-nearest neighbor (kNN), k-means (kM) [30], Minimum Spanning Trees (MST) [31] and SVDD. These methods are the most commonly used in the literature for one-class classification problems.…”
Section: Validation Methodology
confidence: 99%
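The density side of that split can be made concrete with a minimal Gaussian-model one-class classifier: fit a mean and covariance to the target class and reject points whose log-density falls below a quantile of the training scores. The sketch below is a generic illustration of this family of methods, not the cited experimental setup; the rejection fraction and regularisation constant are assumed parameters.

```python
import numpy as np
from scipy.stats import multivariate_normal

class GaussianDataDescription:
    """Sketch of a density-based one-class classifier (single Gaussian model).

    Fits one Gaussian to the target class and rejects points whose
    log-density falls below the `reject_fraction` quantile of the
    training log-densities.
    """

    def __init__(self, reject_fraction=0.05, reg=1e-6):
        self.reject_fraction = reject_fraction  # assumed target rejection rate
        self.reg = reg                          # covariance regularisation

    def fit(self, X_target):
        X = np.asarray(X_target, dtype=float)
        self.mean_ = X.mean(axis=0)
        cov = np.cov(X, rowvar=False) + self.reg * np.eye(X.shape[1])
        self.dist_ = multivariate_normal(self.mean_, cov)
        train_scores = self.dist_.logpdf(X)
        self.threshold_ = np.quantile(train_scores, self.reject_fraction)
        return self

    def predict(self, X):
        # +1 = target, -1 = outlier
        scores = self.dist_.logpdf(np.asarray(X, dtype=float))
        return np.where(scores >= self.threshold_, 1, -1)
```

Boundary methods such as MST and SVDD replace the density estimate with a geometric description of the target region, which is the distinction the next excerpt spells out.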
“…Estimating the complete density or structure of a target concept in a one-class problem can very often be too demanding or even impossible. Boundary methods instead concentrate on estimating only the closed boundary for the given data, assuming that such a boundary will sufficiently describe the target class [28].…”
Section: Learning in the Absence of Counterexamples
confidence: 99%
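The reviewed paper's minimum-spanning-tree classifier is one such boundary method. A common way to realise the idea is to build an MST over the target samples and accept a test point when its distance to the nearest MST edge, treated as a line segment, is small compared with the training edge lengths. The sketch below follows that reading under stated assumptions: the quantile-based threshold rule and class name are illustrative choices, not the authors' exact criterion.

```python
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree
from scipy.spatial.distance import cdist

def point_to_segment_distance(p, a, b):
    """Euclidean distance from point p to the segment [a, b]."""
    ab = b - a
    denom = float(np.dot(ab, ab))
    if denom == 0.0:
        return float(np.linalg.norm(p - a))
    t = np.clip(np.dot(p - a, ab) / denom, 0.0, 1.0)
    return float(np.linalg.norm(p - (a + t * ab)))

class MSTDataDescription:
    """Sketch of an MST-based boundary one-class classifier.

    Builds a minimum spanning tree over the target samples; a test point is
    accepted when its distance to the closest MST edge is at most a quantile
    of the training edge lengths (an assumed threshold rule).
    """

    def __init__(self, edge_quantile=0.95):
        self.edge_quantile = edge_quantile

    def fit(self, X_target):
        X = np.asarray(X_target, dtype=float)
        self.X_ = X
        # MST over the full pairwise Euclidean distance graph.
        mst = minimum_spanning_tree(cdist(X, X)).tocoo()
        self.edges_ = list(zip(mst.row, mst.col))
        self.threshold_ = np.quantile(mst.data, self.edge_quantile)
        return self

    def decision_scores(self, X):
        X = np.asarray(X, dtype=float)
        # Distance from each test point to its nearest MST edge.
        return np.array([
            min(point_to_segment_distance(p, self.X_[i], self.X_[j])
                for i, j in self.edges_)
            for p in X
        ])

    def predict(self, X):
        # +1 = target, -1 = outlier
        return np.where(self.decision_scores(X) <= self.threshold_, 1, -1)
```

Because only distances to the tree's edges are needed, the description adapts to elongated or curved target classes that a single Gaussian would cover poorly, which is the usual motivation for boundary methods over full density estimates.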