2019
DOI: 10.48550/arxiv.1905.07892
Preprint

Learning Ensembles of Anomaly Detectors on Synthetic Data

Cited by 3 publications (2 citation statements) · References 18 publications

“…The classes were highly skewed, with only 89 recurrent strokes recorded within 1 year (7.49%). Five machine learning approaches were deployed to handle this: random undersampling of the majority class [21], synthetic oversampling of the minority class using the Synthetic Minority Oversampling Technique algorithm [21], cost-sensitive learning [21] (assigning a higher penalty to misclassification of the minority class, COST), anomaly detection algorithms (treating the minority class as an anomaly to be detected) [22], and balanced learning algorithms [23], where class balancing is embedded in the learning algorithm.…”
Section: Model Development and Evaluation
confidence: 99%
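
As an illustration of the rebalancing strategies quoted above, here is a minimal sketch using scikit-learn and imbalanced-learn; the synthetic data, the ~7.5% minority-rate stand-in, and the logistic-regression model are illustrative assumptions, not details from the cited study.

```python
# Minimal sketch (hypothetical data): three of the quoted strategies for a
# skewed binary outcome -- random undersampling, SMOTE oversampling, and
# cost-sensitive learning via class weights.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from imblearn.over_sampling import SMOTE
from imblearn.under_sampling import RandomUnderSampler

# Synthetic stand-in for a ~7.5% minority class (e.g., recurrent event = 1).
X, y = make_classification(n_samples=1200, n_features=20, weights=[0.925],
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

# 1) Random undersampling of the majority class.
X_rus, y_rus = RandomUnderSampler(random_state=0).fit_resample(X_tr, y_tr)

# 2) Synthetic oversampling of the minority class (SMOTE).
X_sm, y_sm = SMOTE(random_state=0).fit_resample(X_tr, y_tr)

# 3) Cost-sensitive learning: penalise minority-class errors more heavily.
cost_sensitive = LogisticRegression(class_weight="balanced", max_iter=1000)

for name, (Xr, yr) in {"undersampled": (X_rus, y_rus),
                       "SMOTE": (X_sm, y_sm)}.items():
    clf = LogisticRegression(max_iter=1000).fit(Xr, yr)
    print(name, clf.score(X_te, y_te))

cost_sensitive.fit(X_tr, y_tr)
print("cost-sensitive", cost_sensitive.score(X_te, y_te))
```

The anomaly-detection and balanced-learning variants mentioned in the quote would replace the classifier rather than the sampling step, so they are omitted from this sketch.
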
“…For time series data, however, artificial anomalies and related data augmentation techniques have not been studied extensively. Smolyakov et al. (2019) used artificial anomalies to select thresholds in ensembles of anomaly detection models. Most closely related to our approach, SR-CNN (Ren et al., 2019) trains a supervised CNN on top of an unsupervised anomaly detection model (SR), using labels from injected single-point outliers.…”
Section: −(1 − Y
confidence: 99%
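
To make the threshold-selection idea concrete, below is a minimal sketch (not the authors' implementation) of injecting artificial single-point outliers into a clean series and picking the detection threshold that best recovers the injected labels; the rolling z-score detector, window size, and outlier magnitudes are illustrative assumptions.

```python
# Sketch: inject artificial point outliers into a clean series, then pick the
# threshold of a simple unsupervised score (rolling z-score) that best
# recovers the injected labels. Detector choice and magnitudes are assumptions.
import numpy as np

rng = np.random.default_rng(0)
series = np.sin(np.linspace(0, 20 * np.pi, 2000)) + 0.1 * rng.normal(size=2000)

# Inject single-point outliers at random positions (synthetic ground truth).
labels = np.zeros(series.size, dtype=bool)
idx = rng.choice(series.size, size=20, replace=False)
series[idx] += rng.choice([-1, 1], size=20) * rng.uniform(1.0, 2.0, size=20)
labels[idx] = True

# Unsupervised anomaly score: absolute z-score over a trailing window.
def rolling_zscore(x, w=50):
    score = np.zeros_like(x)
    for i in range(x.size):
        window = x[max(0, i - w):i + 1]
        std = window.std() or 1.0
        score[i] = abs(x[i] - window.mean()) / std
    return score

score = rolling_zscore(series)

# Select the threshold that maximises F1 against the injected anomalies.
def f1(pred, truth):
    tp = np.sum(pred & truth)
    prec = tp / max(pred.sum(), 1)
    rec = tp / max(truth.sum(), 1)
    return 0.0 if prec + rec == 0 else 2 * prec * rec / (prec + rec)

thresholds = np.linspace(score.min(), score.max(), 200)
best = max(thresholds, key=lambda t: f1(score > t, labels))
print("selected threshold:", best)
```

The same selection loop could be run per detector in an ensemble, which is the setting the quoted statement attributes to Smolyakov et al. (2019).
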