Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining 2019
DOI: 10.1145/3292500.3330672

Robust Anomaly Detection for Multivariate Time Series through Stochastic Recurrent Neural Network

Cited by 751 publications (563 citation statements)
References 15 publications
“…Concatenate t (1) , t (2) , t (3) and t (4) , and express as t = [t (1) , t (2) , t (3) , t (4) ], then a data set including 4 known subsequences is generated by the following formula…”
Section: A Simulation Setup
confidence: 99%
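The quoted simulation setup, concatenating four known subsequences into a single series t = [t(1), t(2), t(3), t(4)], can be sketched as follows; the segment shapes and lengths here are illustrative assumptions, not taken from the citing paper:

```python
import numpy as np

# Four hypothetical known subsequences t1..t4 (shapes are assumptions).
t1 = np.sin(np.linspace(0, 2 * np.pi, 50))  # sinusoidal segment
t2 = np.ones(50)                             # constant segment
t3 = np.linspace(0.0, 1.0, 50)               # linear ramp segment
t4 = np.zeros(50)                            # zero segment

# Concatenate into one series containing 4 known subsequences.
t = np.concatenate([t1, t2, t3, t4])
```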
“…There has been a growing recognition of the value of time series data, because many applications, such as anomaly detection [1], [2], working condition perception [3], [4], and soft measurement [5], [6], [7], are being enabled by the rapid development of data-driven algorithms. Time series segmentation partitions a given time series into internally homogeneous subsequences [8].…”
Section: Introduction
confidence: 99%
“…Zhang et al. [16] detect fraud on a reconstructed residual correlation matrix. LSTM-NDT [6] and OmniAnomaly [12] adopt dynamic thresholding on reconstructed feature errors. Despite their effectiveness, these approaches tend to underperform in our task due to the high intra-class variance of normal data, where one passenger's normal records can be irregular/fraudulent for others.…”
Section: Related Work
confidence: 99%
“…Stochastic gradient variational Bayes (SGVB) [21] is a variational inference algorithm commonly applied in VAEs to tune the parameters φ and θ by maximizing the evidence lower bound (ELBO) [29], L(x_t):…”
Section: Basics of a Variant of RNN – GRU, VAE, SGVB and Planar NF
confidence: 99%
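The ELBO that SGVB maximizes can be written in closed form when both the approximate posterior q(z|x) and the decoder p(x|z) are diagonal Gaussians with a standard-normal prior. A minimal sketch under those assumptions (all function and variable names are illustrative, not from the paper):

```python
import numpy as np

def elbo_gaussian(x, mu_z, logvar_z, mu_x, logvar_x):
    """Closed-form ELBO L(x) = E_q[log p(x|z)] - KL(q(z|x) || N(0, I)),
    with the reconstruction term evaluated at the decoder's Gaussian
    parameters (a simplifying assumption, in place of Monte Carlo sampling).
    """
    # KL divergence between N(mu_z, exp(logvar_z)) and N(0, I), closed form.
    kl = 0.5 * np.sum(np.exp(logvar_z) + mu_z**2 - 1.0 - logvar_z)
    # Gaussian log-likelihood of x under the decoder distribution.
    log_px = -0.5 * np.sum(
        np.log(2 * np.pi) + logvar_x + (x - mu_x) ** 2 / np.exp(logvar_x)
    )
    return log_px - kl
```

With a perfect reconstruction (x = mu_x, unit variances) and a posterior equal to the prior, the KL term vanishes and the ELBO reduces to the Gaussian normalizing constant.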
“…Second, a VAE is applied to map observations (i.e., inputs in x-space) to stochastic variables (i.e., in z-space). Third, to explicitly model temporal dependence among the stochastic variables in the latent space, a linear Gaussian state-space model (SSM) is applied to connect the stochastic variables with the concatenation of the stochastic variables and the GRU hidden variables [29]. Lastly, to help the stochastic variables in the qnet (described hereafter) capture complex distributions of the input data, Planar NF is applied to capture non-Gaussian posterior distributions in the latent stochastic space.…”
Section: Basics of a Variant of RNN – GRU, VAE, SGVB and Planar NF
confidence: 99%