Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining (KDD 2019)
DOI: 10.1145/3292500.3330871

Deep Anomaly Detection with Deviation Networks

Abstract: Although deep learning has been applied to successfully address many data mining problems, relatively limited work has been done on deep learning for anomaly detection. Existing deep anomaly detection methods, which focus on learning new feature representations to enable downstream anomaly detection methods, perform indirect optimization of anomaly scores, leading to data-inefficient learning and suboptimal anomaly scoring. Also, they are typically designed as unsupervised learning due to the lack of large-sca…

Cited by 267 publications (187 citation statements)
References 69 publications (128 reference statements)
“…Each passenger record (x_{N+1}, X) is fit with an iForest model, and a score for x_{N+1} is produced. - DevNet [10]: a new deviation loss based on the z-score is proposed for anomaly detection.…”
Section: Results (mentioning)
confidence: 99%
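The iForest scoring step described in the quote above can be sketched with scikit-learn's IsolationForest. This is an illustrative reconstruction, not the cited paper's code: the data, variable names, and the use of a synthetic Gaussian reference set are all assumptions.

```python
# Hypothetical sketch of iForest scoring: fit an Isolation Forest on the
# reference set X together with the new point x_{N+1}, then read off the
# anomaly score of the new point. All names and data here are illustrative.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
X = rng.normal(0.0, 1.0, size=(200, 5))   # reference (mostly normal) records
x_new = np.full((1, 5), 6.0)              # a clearly deviating point x_{N+1}

forest = IsolationForest(random_state=0).fit(np.vstack([X, x_new]))
score_new = forest.score_samples(x_new)[0]   # lower = more anomalous
score_ref = forest.score_samples(X).mean()
print(score_new < score_ref)                 # the outlier scores lower
```

Note that `score_samples` returns the negated anomaly score, so smaller values indicate stronger anomalies.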
“…Conventional methods include Isolation Forest [9] and the One-Class Support Vector Machine. Recent deep learning-based models try to learn a density space that distinguishes normal from fraudulent data [10,11,15]. DevNet [10] defines a deviation loss based on the z-score of a prior Gaussian distribution and squeezes outliers to the tail of the distribution.…”
Section: Related Work (mentioning)
confidence: 99%
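The deviation loss described above can be sketched in a few lines of NumPy: anomaly scores are standardized against reference scores drawn from a standard Gaussian prior, normal examples are pulled toward the prior mean, and labeled anomalies are pushed into the upper tail. This is a minimal sketch, not the authors' implementation; the toy scores and labels below are made up, while the 5,000 reference draws and margin a = 5 follow the paper's setup.

```python
# Minimal sketch of DevNet's deviation loss: z-score deviations against a
# Gaussian prior, hinge margin `a` for labeled anomalies. Illustrative only.
import numpy as np

def deviation_loss(scores, labels, a=5.0, n_ref=5000, seed=0):
    rng = np.random.default_rng(seed)
    ref = rng.standard_normal(n_ref)          # reference scores from N(0, 1)
    dev = (scores - ref.mean()) / ref.std()   # z-score deviation per example
    # Normals (y=0) are pushed toward the prior mean; anomalies (y=1) are
    # pushed at least `a` standard deviations into the upper tail.
    per_example = (1 - labels) * np.abs(dev) + labels * np.maximum(0.0, a - dev)
    return per_example.mean()

scores = np.array([0.1, -0.2, 6.0])   # toy anomaly scores from a network
labels = np.array([0, 0, 1])          # last example labeled anomalous
print(deviation_loss(scores, labels))
```

Because the anomalous example already scores about six reference standard deviations above the mean, its hinge term is near zero and the loss is dominated by the small deviations of the normal examples.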
“…Eq. (39) presents a simplified τ for brevity. The idea of enforcing a prior on the anomaly scores is explored in [116]. Motivated by extensive empirical results in [75] showing that anomaly scores on a variety of real-world data sets fit a Gaussian distribution very well, the work uses a Gaussian prior to encode the anomaly scores and enables direct optimization of the scores.…”
Section: Prior-driven Models (mentioning)
confidence: 99%