2019
DOI: 10.1109/lsp.2018.2889273
One-Class Convolutional Neural Network

Abstract: We present a novel Convolutional Neural Network (CNN) based approach for one-class classification. The idea is to use zero-centered Gaussian noise in the latent space as the pseudo-negative class and train the network using the cross-entropy loss to learn a good representation as well as the decision boundary for the given class. A key feature of the proposed approach is that any pre-trained CNN can be used as the base network for one-class classification. The proposed One-Class CNN (OC-CNN) is evaluated on t…
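The abstract's core idea — feeding a classifier both in-class latent features and zero-centered Gaussian noise acting as a pseudo-negative class, trained with a cross-entropy loss — can be sketched in plain Python. The dimension `D`, the noise standard deviation, and the toy linear scorer below are illustrative assumptions, not the paper's actual values or architecture:

```python
import math
import random

random.seed(0)

D = 64          # latent feature dimension (illustrative)
SIGMA = 0.1     # std of the pseudo-negative Gaussian (a tunable hyperparameter)

def pseudo_negative(d=D, sigma=SIGMA):
    """Zero-centered Gaussian noise vector acting as the pseudo-negative class."""
    return [random.gauss(0.0, sigma) for _ in range(d)]

def cross_entropy(p, y):
    """Binary cross-entropy for one (probability, label) pair."""
    eps = 1e-12
    return -(y * math.log(p + eps) + (1 - y) * math.log(1 - p + eps))

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# A hypothetical batch: latent features of the known class (label 1)
# mixed with zero-centered pseudo-negatives (label 0).
positives = [[random.gauss(1.0, 0.1) for _ in range(D)] for _ in range(8)]
negatives = [pseudo_negative() for _ in range(8)]
batch = [(x, 1) for x in positives] + [(x, 0) for x in negatives]

# Toy scorer: a single linear unit on the feature mean (stands in for the
# classifier head; the real OC-CNN learns this jointly with the CNN backbone).
w, b = 4.0, -2.0
loss = sum(cross_entropy(sigmoid(w * sum(x) / len(x) + b), y)
           for x, y in batch) / len(batch)
print(round(loss, 3))
```

Minimizing this loss pushes in-class features away from the noise cloud at the origin, which is what yields the one-class decision boundary without any real negative data.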

Cited by 149 publications
(91 citation statements)
References 35 publications
“…The former is based on CNN for one-class classification problems. Its idea is to use a zero centered Gaussian noise in the latent space as the pseudo-negative class and train the convolutional network using the cross-entropy loss to learn a good representation and the decision boundary for a given class [57]. CNN has been widely applied to computationally complex classification tasks, such as image defect detection [80] and face verification [56].…”
Section: Other Classifiers
confidence: 99%
“…Assuming that the extracted features are D-dimensional, the features are appended with the pseudo-negative data generated from a Gaussian, N(µ, σ · I), similar to [31]. We use a simple one-layer fully connected classifier network (C) with sigmoid activation at the end, as shown in Fig.…”
Section: B. Classification Network
confidence: 99%
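The statement above describes appending D-dimensional features with pseudo-negatives drawn from N(µ, σ · I) and scoring the combined batch with a one-layer fully connected classifier ending in a sigmoid. A minimal sketch, where `D`, `mu`, `sigma`, and the random weights are chosen purely for illustration (the cited work learns the weights; here they are fixed):

```python
import math
import random

random.seed(1)

D = 16  # feature dimension (illustrative; the real D depends on the backbone)

def gaussian_pseudo_negatives(n, mu=0.0, sigma=1.0, d=D):
    """Draw n pseudo-negative vectors from N(mu, sigma^2 * I)."""
    return [[random.gauss(mu, sigma) for _ in range(d)] for _ in range(n)]

def one_layer_sigmoid(x, w, b):
    """Single fully connected unit followed by a sigmoid activation."""
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical in-class features, appended with Gaussian pseudo-negatives.
features = [[random.gauss(2.0, 0.5) for _ in range(D)] for _ in range(4)]
batch = features + gaussian_pseudo_negatives(4)

w = [random.uniform(-0.1, 0.1) for _ in range(D)]  # untrained, for shape only
b = 0.0
scores = [one_layer_sigmoid(x, w, b) for x in batch]
```

Training would then apply a binary cross-entropy loss with label 1 for the real features and label 0 for the appended pseudo-negatives.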
“…This approach is extended in [36] by replacing the hyperplane with a hypersphere, and then in [32] by introducing deep neural architectures into these models. Later, [5] transposes the original one-class SVM model completely into a deep neural architecture, and [28] presents a similar neural architecture as a supervised model by generating pseudo-labels for negative samples.…”
Section: Related Work
confidence: 99%
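The hypersphere variant mentioned here (SVDD-style) replaces the separating hyperplane with a sphere enclosing the target class: a point is in-class when its distance to the learned center is within the learned radius. A toy decision rule, with the center and radius assumed rather than learned:

```python
import math

def svdd_score(x, center, radius):
    """Hypersphere decision rule: inside the sphere -> in-class (True)."""
    dist = math.sqrt(sum((xi - ci) ** 2 for xi, ci in zip(x, center)))
    return dist <= radius

center = [0.0, 0.0]  # hypothetical learned center
print(svdd_score([0.5, 0.5], center, 1.0))  # inside: dist ≈ 0.71
print(svdd_score([2.0, 2.0], center, 1.0))  # outside: dist ≈ 2.83
```

In the deep extensions cited above, the feature map producing `x` and the sphere parameters are learned jointly instead of being fixed.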