2022
DOI: 10.1016/j.patcog.2022.108945

Do deep neural networks contribute to multivariate time series anomaly detection?

Cited by 49 publications (50 citation statements)
References 21 publications
“…USAD [30]: This method performs adversarial training inspired by generative algorithms, enabling it to learn how to amplify the reconstruction error of anomalous inputs.…”
Section: Methods
Citation type: mentioning
confidence: 99%
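The USAD approach quoted above lends itself to a compact illustration. Below is a minimal sketch, in PyTorch, of the adversarial two-autoencoder scheme the excerpt describes: a shared encoder with two decoders, where the second autoencoder is trained to amplify the reconstruction error of inputs already reconstructed by the first. The layer sizes, window dimension and 1/epoch weighting schedule are assumptions for illustration, not the authors' implementation.

```python
# Hedged sketch of USAD-style adversarial training; sizes and schedule are assumed.
import torch
import torch.nn as nn

def mlp(sizes):
    layers = []
    for i in range(len(sizes) - 1):
        layers += [nn.Linear(sizes[i], sizes[i + 1]), nn.ReLU()]
    return nn.Sequential(*layers[:-1])  # drop the trailing ReLU

window_dim, latent_dim = 120, 16             # assumed flattened window / latent sizes
enc  = mlp([window_dim, 64, latent_dim])     # shared encoder
dec1 = mlp([latent_dim, 64, window_dim])     # AE1(x) = dec1(enc(x))
dec2 = mlp([latent_dim, 64, window_dim])     # AE2(x) = dec2(enc(x))
opt1 = torch.optim.Adam(list(enc.parameters()) + list(dec1.parameters()), lr=1e-3)
opt2 = torch.optim.Adam(list(enc.parameters()) + list(dec2.parameters()), lr=1e-3)
mse = nn.MSELoss()

def train_step(x, epoch):
    """x: float tensor of shape (batch, window_dim)."""
    alpha = 1.0 / (epoch + 1)                # shifts weight from reconstruction to the adversarial term
    # Update AE1: reconstruct x and try to fool AE2 with its reconstruction.
    w1 = dec1(enc(x))
    loss1 = alpha * mse(w1, x) + (1 - alpha) * mse(dec2(enc(w1)), x)
    opt1.zero_grad(); loss1.backward(); opt1.step()
    # Update AE2: reconstruct x while amplifying the error on AE1's output.
    w1 = dec1(enc(x)).detach()
    loss2 = alpha * mse(dec2(enc(x)), x) - (1 - alpha) * mse(dec2(enc(w1)), x)
    opt2.zero_grad(); loss2.backward(); opt2.step()
    return loss1.item(), loss2.item()
```

At test time a weighted sum of the two reconstruction errors, e.g. α·‖x − AE1(x)‖² + β·‖x − AE2(AE1(x))‖², would serve as the anomaly score.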
“…With anomalies present in the training set, this threshold turns out to be too high and thus prevents the models from detecting all the anomalies. Previous works such as [16], [24] tested different thresholds and reported those that led to the best F1-score, which explains why they obtained better results. On the other hand, the iForest model reaches the same performance level as OC-SVM, but its performance decreases as the contamination ratio increases.…”
Section: A. Performance With the Mixed Dataset
Citation type: mentioning
confidence: 97%
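The excerpt contrasts a threshold derived from (possibly contaminated) training data with the best-F1 thresholds reported in works such as [16], [24]. A minimal sketch of that best-F1 selection, assuming anomaly scores and binary labels are already available (function and variable names are illustrative):

```python
# Sweep candidate thresholds over the anomaly scores and keep the best-F1 one.
import numpy as np
from sklearn.metrics import f1_score

def best_f1_threshold(scores, labels, n_candidates=200):
    """scores: anomaly scores on a labelled set; labels: 1 = anomaly, 0 = normal."""
    candidates = np.quantile(scores, np.linspace(0.0, 1.0, n_candidates))
    best_thr, best = candidates[0], -1.0
    for thr in candidates:
        f1 = f1_score(labels, (scores >= thr).astype(int), zero_division=0)
        if f1 > best:
            best_thr, best = thr, f1
    return best_thr, best

# By contrast, a threshold taken from contaminated training scores alone,
# e.g. np.quantile(train_scores, 0.99), can end up too high and miss anomalies.
```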
“…They categorized different approaches based on the nature of the input data, the type of anomaly (referred to as an outlier in the survey) and the type of method. Focusing on unsupervised learning approaches for multivariate time-series data, Audibert et al. [16] proposed a different taxonomy, classifying the methods into conventional methods, machine-learning methods and deep-learning methods. The conventional methods, such as the VAR (Vector Autoregressive) model, PCA (Principal Component Analysis) or SSA (Singular Spectrum Analysis), rely on statistical rules or assume that the time-series data come from a (linear) model whose parameters are estimated.…”
Section: B. Unsupervised Learning Models for Anomaly Detection
Citation type: mentioning
confidence: 99%
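As a concrete instance of the "conventional" family in that taxonomy, the sketch below scores anomalies by PCA reconstruction error, i.e. how poorly a linear subspace model explains each sample. The number of components and the scoring rule are illustrative assumptions, not a specific method from the cited works.

```python
# PCA reconstruction-error scoring: large residuals indicate candidate anomalies.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

def pca_anomaly_scores(train_X, test_X, n_components=5):
    scaler = StandardScaler().fit(train_X)
    Xtr, Xte = scaler.transform(train_X), scaler.transform(test_X)
    pca = PCA(n_components=n_components).fit(Xtr)
    # Project onto the principal subspace and back; the residual norm is the score.
    recon = pca.inverse_transform(pca.transform(Xte))
    return np.linalg.norm(Xte - recon, axis=1)
```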
“…Because of these circumstances, an unsupervised anomaly detection approach was chosen. Unsupervised anomaly detection is done under the assumption that the training set Γ ⊂ Φ, where Φ is the set of all available samples, contains mostly samples from normal operation but may contain a small number of still unknown anomalies [13], [40]. This training set is constructed using the outlierness calculated with the KPI [13]; for this reason, we call it KPI-supervised training, as the anomalous samples are excluded and auto-labeled using the outcome of the KPI classifier.…”
Section: KPI-Supervised Training
Citation type: mentioning
confidence: 99%
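A hedged sketch of the training-set construction described above: a KPI-based outlierness score is used to exclude auto-labeled anomalous samples so that Γ ⊂ Φ contains mostly normal operation. The `kpi_outlierness` function and the cutoff are hypothetical placeholders, not the authors' actual KPI classifier.

```python
# Build Γ ⊂ Φ by dropping samples the KPI-based score flags as anomalous.
import numpy as np

def build_training_set(samples, kpi_outlierness, cutoff=0.5):
    """samples: numpy array of all available samples Φ; returns Γ."""
    scores = np.asarray([kpi_outlierness(s) for s in samples])
    return samples[scores < cutoff]   # keep only samples deemed normal
```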
“…Especially in anomaly detection, benchmarks need to be treated with care, since most suffer from one or all of the following issues [12]: the problem is too trivial and can be solved in a few lines of code; the anomaly density is too high; or, if supervised learning is used, the ground truth is mislabeled and there is often no data available after the anomaly has occurred, which the authors of [12] call the run-to-failure bias. The study comparing classical pure ML methods, pure (deep) machine learning methods, and the combination of both, hybrid machine learning, suggests that there is no pure machine learning method that can outperform a hybrid system [13].…”
Section: Introduction
Citation type: mentioning
confidence: 99%
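To make the "solvable in a few lines of code" criticism concrete, a per-channel k-sigma rule fitted on the training split is roughly the simplest baseline one could write; k = 3 and the any-channel flagging rule are arbitrary illustrative choices.

```python
# Trivial baseline: flag a timestamp if any channel deviates by more than k sigma.
import numpy as np

def k_sigma_flags(train_X, test_X, k=3.0):
    mu, sigma = train_X.mean(axis=0), train_X.std(axis=0) + 1e-8
    return (np.abs(test_X - mu) / sigma > k).any(axis=1)
```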