2010
DOI: 10.1002/cem.1288

Disjoint hard models for classification

Abstract: The paper describes a new approach for disjoint hard modelling of classes. This involves developing an independent PC model for each class, and calculating both the Q statistic (squared prediction error) of each sample against the class model and a separate statistic describing how well samples are classified within the projected PC space. The latter statistic can be applied to different types of classifiers; in this paper we illustrate this using Quadratic Discriminant Analysis (D statistic) and one class …
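As a rough, minimal sketch of the approach the abstract describes, assuming Python with scikit-learn: the function names, the two-component choice, and the dictionary-of-classes layout are illustrative assumptions, not taken from the paper.

```python
import numpy as np
from sklearn.decomposition import PCA

def fit_class_models(X_by_class, n_components=2):
    # One independent ("disjoint") PCA model per class. PCA centres
    # each class on its own mean, so the models share no common axes.
    return {label: PCA(n_components=n_components).fit(X)
            for label, X in X_by_class.items()}

def q_statistic(x, pca):
    # Q (squared prediction error): squared residual between a sample
    # and its reconstruction from the class's principal-component
    # subspace; small Q means the sample fits that class model well.
    scores = pca.transform(np.asarray(x).reshape(1, -1))
    recon = pca.inverse_transform(scores)
    return float(np.sum((np.asarray(x) - recon) ** 2))
```

Per the abstract, a class-specific classifier (e.g. QDA, giving the D statistic) would then operate on the scores returned by pca.transform; that part is omitted here.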

Cited by 7 publications (8 citation statements)
References 49 publications (105 reference statements)
“…This does not prevent further groups being modelled, but the projection would be into the PC space of the original training set. The extension to hard one-class classifiers is conceivable [21]: in such circumstances, each class is modelled separately, but an unknown sample is unambiguously assigned to a single class, the one it most resembles. Sometimes SIMCA is used in this way; for example, although three classes may be independently modelled, a sample is assigned to the class whose distance is closest to the unknown, although the originators did not intend it to be employed in this manner.…”
Section: Modelling and Discriminating Methods
confidence: 99%
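The hard one-class usage this passage describes, in which an unknown is forced into whichever class model it most resembles, could be sketched as follows, reusing the hypothetical q_statistic helper from the sketch above. The smallest-Q rule is one plausible distance criterion, not necessarily the exact rule the cited papers use.

```python
def hard_assign(x, models):
    # Hard one-class classification: every unknown is assigned to
    # exactly one class, the one whose disjoint PC model it fits
    # best, i.e. the one with the smallest Q statistic.
    q = {label: q_statistic(x, pca) for label, pca in models.items()}
    return min(q, key=q.get)
```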
“…Further details have been described elsewhere [16-21]. The dataset consists of the thermal profiles of 293 samples, involving monitoring the change in physical properties as they are heated.…”
Section: Case Study 2: Characterisation Of Plastics Using Dynamic Mechanical Analysis
confidence: 99%
“…Support vector domain description (SVDD) [11, 15-17, 39-41] is a modified version of support vector machines (SVMs) [42-47]. The difference between SVDD and SVMs is that the latter method computes a boundary between two or more classes, whereas, for SVDD, a boundary is drawn around each individual class separately, resulting in a separate (and different) model for each class; therefore, it is sometimes called a 'one class classifier'.…”
Section: Support Vector Domain Description For MSPC
confidence: 99%
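scikit-learn ships no SVDD class, but its OneClassSVM (the ν-formulation, which yields an equivalent solution to SVDD when an RBF kernel is used) lets us sketch the per-class boundary idea this quote describes; the gamma and nu values below are arbitrary placeholders, not values from the cited work.

```python
import numpy as np
from sklearn.svm import OneClassSVM

def fit_one_class_boundaries(X_by_class, gamma=0.1, nu=0.05):
    # One boundary per class, fitted on that class alone: unlike a
    # two-class SVM, no other class enters the optimisation.
    return {label: OneClassSVM(kernel="rbf", gamma=gamma, nu=nu).fit(X)
            for label, X in X_by_class.items()}

def memberships(x, boundaries):
    # +1 means x falls inside a class boundary, -1 outside. A sample
    # may lie inside several boundaries, or none, which is exactly
    # what distinguishes these one-class models from a shared SVM.
    return {label: int(m.predict(np.asarray(x).reshape(1, -1))[0])
            for label, m in boundaries.items()}
```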
“…The mathematical basis of SVDD is well described elsewhere [11, 15-17, 39-41]. In this paper, a radial basis function (RBF) was used for the kernel since it requires only one parameter to define it [40, 41].…”
Section: Support Vector Domain Description For MSPC
confidence: 99%
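For reference, the RBF kernel this quote refers to has the standard form below, where the width σ is the single tunable parameter mentioned:

$$K(\mathbf{x}_i, \mathbf{x}_j) = \exp\!\left(-\frac{\lVert \mathbf{x}_i - \mathbf{x}_j \rVert^2}{2\sigma^2}\right)$$

In the scikit-learn sketch above, this corresponds to the parameterisation gamma = 1/(2σ²).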