2022
DOI: 10.48550/arXiv.2201.10703
Preprint

Anomaly Detection via Reverse Distillation from One-Class Embedding

Abstract: Knowledge distillation (KD) achieves promising results on the challenging problem of unsupervised anomaly detection (AD). The representation discrepancy of anomalies in the teacher-student (T-S) model provides essential evidence for AD. However, using similar or identical architectures to build the teacher and student models in previous studies hinders the diversity of anomalous representations. To tackle this problem, we propose a novel T-S model consisting of a teacher encoder and a student decoder and intro…
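The abstract describes a teacher encoder paired with a student decoder, with the representation discrepancy between the two serving as the anomaly signal. Below is a minimal PyTorch sketch of that reverse T-S layout; the ResNet-18 backbone, the `Bottleneck` fusion module, and the cosine-distance loss are illustrative assumptions, not the authors' exact implementation.

```python
# Minimal sketch of a reverse-distillation T-S model (hypothetical layer
# sizes; not the paper's exact architecture). A frozen pretrained teacher
# encodes the image; a student decoder tries to restore the teacher's
# multi-scale features from a compact one-class embedding.
import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision.models import resnet18

class Teacher(nn.Module):
    """Frozen ImageNet-pretrained encoder exposing three feature scales."""
    def __init__(self):
        super().__init__()
        net = resnet18(weights="IMAGENET1K_V1")
        self.stem = nn.Sequential(net.conv1, net.bn1, net.relu, net.maxpool)
        self.blocks = nn.ModuleList([net.layer1, net.layer2, net.layer3])
        for p in self.parameters():
            p.requires_grad_(False)

    def forward(self, x):
        feats, h = [], self.stem(x)
        for blk in self.blocks:
            h = blk(h)
            feats.append(h)              # channels: 64, 128, 256
        return feats

class Bottleneck(nn.Module):
    """One-class embedding: fuse the scales into one compact code."""
    def __init__(self):
        super().__init__()
        self.fuse = nn.Conv2d(64 + 128 + 256, 256, kernel_size=1)

    def forward(self, feats):
        size = feats[-1].shape[-2:]      # pool everything to the coarsest map
        pooled = [F.adaptive_avg_pool2d(f, size) for f in feats]
        return self.fuse(torch.cat(pooled, dim=1))

class Student(nn.Module):
    """Decoder that restores the teacher's features from the embedding."""
    def __init__(self):
        super().__init__()
        up = lambda c_in, c_out: nn.Sequential(
            nn.Upsample(scale_factor=2, mode="bilinear", align_corners=False),
            nn.Conv2d(c_in, c_out, 3, padding=1))
        self.dec3 = nn.Conv2d(256, 256, 3, padding=1)
        self.dec2 = up(256, 128)
        self.dec1 = up(128, 64)

    def forward(self, code):
        r3 = self.dec3(code)
        r2 = self.dec2(r3)
        r1 = self.dec1(r2)
        return [r1, r2, r3]              # mirror the teacher's scale order

def distillation_loss(t_feats, s_feats):
    """1 - cosine similarity per location, summed over the three scales."""
    return sum((1 - F.cosine_similarity(t, s, dim=1)).mean()
               for t, s in zip(t_feats, s_feats))

if __name__ == "__main__":
    teacher, bottleneck, student = Teacher(), Bottleneck(), Student()
    x = torch.randn(2, 3, 256, 256)      # stand-in for normal training images
    t_feats = teacher(x)
    s_feats = student(bottleneck(t_feats))
    print(distillation_loss(t_feats, s_feats))
```

Training would update only the bottleneck and student on anomaly-free images; because the student only ever learns to restore normal-image features, its reconstructions diverge from the teacher's on anomalies.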

Cited by 4 publications (4 citation statements)
References 31 publications (87 reference statements)
“…As shown in figure 8 and table 2, the ROC as well as AUROC metrics of CFlow [32], PaDim [33], FastFlow [34], Reverse Distillation [35], DRAM [36], STFPM [37] and PatchCore used in this paper were compared.…”
Section: Results and Analysis (mentioning)
confidence: 99%
“…For MVTec AD dataset for 2D task, we report the performance comparison of Glance [31], DRAEM [35], DFR [32], R-D [9], PaDim [8], P-SVDD [33], FYD [37], SPADE [7], PANDA [20], CutPaste [18], NSA [28], CFlow [14], FastFlow [34], PatchCore [22] in terms of the image-level and pixel-level metrics. The inference efficiencies of some of these methods are also provided.…”
Section: Results (mentioning)
confidence: 99%
“…Knowledge distillation-based anomaly detection is a newer deep learning method for anomaly detection [24,25]. It uses a teacher-student model to achieve the goal of unsupervised defect detection.…”
Section: Knowledge Distillation-based Anomaly Detection (mentioning)
confidence: 99%
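The quoted passage describes the standard scoring recipe for KD-based detectors: at test time, the per-location feature discrepancy between teacher and student becomes a pixel anomaly map. A hedged sketch of that step, reusing the hypothetical `Teacher`/`Bottleneck`/`Student` modules from the earlier snippet:

```python
# Test-time scoring sketch for a KD-based detector: accumulate the
# teacher/student cosine discrepancy across scales into one anomaly map.
# teacher, bottleneck and student are the hypothetical modules defined in
# the earlier snippet, trained on anomaly-free images only.
import torch
import torch.nn.functional as F

@torch.no_grad()
def anomaly_map(x, teacher, bottleneck, student, out_size=(256, 256)):
    t_feats = teacher(x)
    s_feats = student(bottleneck(t_feats))
    amap = torch.zeros(x.shape[0], 1, *out_size, device=x.device)
    for t, s in zip(t_feats, s_feats):
        d = 1 - F.cosine_similarity(t, s, dim=1)   # [B, H, W] per scale
        amap += F.interpolate(d.unsqueeze(1), size=out_size,
                              mode="bilinear", align_corners=False)
    return amap
```

An image-level score then follows from the maximum (or mean) of the map, which is how such detectors typically produce the AUROC numbers compared in the citing papers above.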