Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence 2020
DOI: 10.24963/ijcai.2020/448

P-KDGAN: Progressive Knowledge Distillation with GANs for One-class Novelty Detection

Abstract: One-class novelty detection aims to identify anomalous instances that do not conform to expected normal instances. In this paper, Generative Adversarial Networks (GANs) based on an encoder-decoder-encoder pipeline are used for detection and achieve state-of-the-art performance. However, deep neural networks are too over-parameterized to deploy on resource-limited devices. Therefore, Progressive Knowledge Distillation with GANs (P-KDGAN) is proposed to learn a compact and fast novelty detection network…
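
The encoder-decoder-encoder pipeline mentioned in the abstract lends itself to a short illustration. Below is a minimal PyTorch sketch, not the authors' released code: the 32x32 input size, channel widths, and latent dimension are illustrative assumptions. An input is encoded to a latent code, decoded to a reconstruction, and re-encoded; the mismatch between the two latent codes serves as the novelty score.

```python
# A minimal sketch (not the authors' code) of an encoder-decoder-encoder
# pipeline for one-class novelty detection. Input size, channel widths,
# and latent dimension are illustrative assumptions.
import torch
import torch.nn as nn

class EncoderDecoderEncoder(nn.Module):
    def __init__(self, in_ch=1, latent_dim=64):
        super().__init__()
        def make_encoder():
            return nn.Sequential(
                nn.Conv2d(in_ch, 32, 4, 2, 1), nn.LeakyReLU(0.2),  # 32x32 -> 16x16
                nn.Conv2d(32, 64, 4, 2, 1), nn.LeakyReLU(0.2),     # 16x16 -> 8x8
                nn.Flatten(),
                nn.Linear(64 * 8 * 8, latent_dim),
            )
        self.enc1 = make_encoder()                                  # x -> z1
        self.dec = nn.Sequential(                                   # z1 -> x_hat
            nn.Linear(latent_dim, 64 * 8 * 8),
            nn.Unflatten(1, (64, 8, 8)),
            nn.ConvTranspose2d(64, 32, 4, 2, 1), nn.ReLU(),         # 8x8 -> 16x16
            nn.ConvTranspose2d(32, in_ch, 4, 2, 1), nn.Tanh(),      # 16x16 -> 32x32
        )
        self.enc2 = make_encoder()                                  # x_hat -> z2

    def forward(self, x):
        z1 = self.enc1(x)
        x_hat = self.dec(z1)
        z2 = self.enc2(x_hat)
        return x_hat, z1, z2

def novelty_score(model, x):
    # Trained only on normal data, the model reconstructs normal inputs
    # well, so a large z1/z2 mismatch flags a likely anomaly.
    _, z1, z2 = model(x)
    return torch.mean((z1 - z2) ** 2, dim=1)
```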

Cited by 20 publications (11 citation statements) | References 35 publications

“…Zhiwei Zhang and Lei Sun [7] proposed an algorithm based on Progressive Knowledge Distillation with Generative Adversarial Networks (GANs), in which two different GAN models are combined using a distillation loss. They compared this novel approach with OC-SVM (One-Class Support Vector Machine), Kernel Density Estimation (KDE) and Variational Autoencoder (VAE).…”
Section: Literature Survey
confidence: 99%
“…Progressive Knowledge Distillation (P-KDGAN) [105] proposes a one-class method designed to distill knowledge from a teacher network and transfer it to a student network through a distillation loss. For this purpose, P-KDGAN trains two GAN networks, namely the student and teacher networks, on the given one-class training data.…”
Section: Inter-Class Splitting (ICS)
confidence: 99%
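
For readers wanting to see how a distillation loss can tie a student GAN to a teacher GAN, here is a hedged PyTorch sketch of one student-generator update. It is not the paper's implementation: the names (teacher_G, student_G, student_D), the pixel-level L1 distillation term, and the weight lambda_kd are illustrative assumptions.

```python
# A hedged sketch of one student-generator update combining an adversarial
# loss with a distillation loss toward a frozen teacher GAN, in the spirit
# of P-KDGAN. Names and loss choices are illustrative assumptions, not the
# paper's implementation.
import torch
import torch.nn.functional as F

def student_generator_step(teacher_G, student_G, student_D, x, opt_G,
                           lambda_kd=1.0):
    with torch.no_grad():
        x_teacher = teacher_G(x)       # frozen teacher's reconstruction
    x_student = student_G(x)
    # Adversarial term: the student tries to fool its own discriminator.
    logits = student_D(x_student)
    adv = F.binary_cross_entropy_with_logits(logits, torch.ones_like(logits))
    # Distillation term: pull the student's output toward the teacher's.
    kd = F.l1_loss(x_student, x_teacher)
    loss = adv + lambda_kd * kd
    opt_G.zero_grad()
    loss.backward()
    opt_G.step()
    return loss.item()
```

Weighting a distillation term against an adversarial term is the standard teacher-student recipe; the "progressive" aspect of P-KDGAN goes beyond this single-step sketch.
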
“…AnoGAN [42], ALOCC [40], ADGAN [11], OCGAN [33], GANomaly [2], P-KDGAN [55] and DGEO [17]. For the semi-supervised anomaly detection approaches, we consider SSAD [19], SS-DGM [20] and Deep SAD [39].…”
Section: Experiments on Natural Images
confidence: 99%
“…The main challenge in anomaly detection is that anomalous samples are diverse and inexhaustible. To bypass the unfeasible task of collecting a representative set of anomalous samples, many approaches [40,33,55] resort to unsupervised learning so that only normal samples are needed for model training.…”
Section: Introduction
confidence: 99%