2022 Joint 12th International Conference on Soft Computing and Intelligent Systems and 23rd International Symposium on Advanced Intelligent Systems (SCIS&ISIS)
DOI: 10.1109/scisisis55246.2022.10002131
Combining Transformer with a Discriminator for Anomaly Detection in Multivariate Time Series

Cited by 2 publications (2 citation statements). References 0 publications.
“…To our knowledge, ours is the first work to simplify and introduce this attention mechanism to multivariate temporal anomaly detection. Hence, we can obtain the values of all features at any moment t by a feedforward neural network (FNN) [11] based on historical serial time window data as…”
Section: Temporal Dependency Modeling
confidence: 99%
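The statement above describes predicting all feature values at a moment t with a feedforward neural network fed a historical time window. A minimal sketch of that idea, with hypothetical shapes and an untrained, randomly initialized two-layer FNN (the cited works learn these weights; the reconstruction-error scoring is a common convention, not necessarily theirs):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: F features, window of W past steps, hidden width H.
F, W, H = 4, 10, 32

# Two-layer FNN with random (untrained) weights, purely illustrative.
W1 = rng.normal(0, 0.1, (W * F, H))
b1 = np.zeros(H)
W2 = rng.normal(0, 0.1, (H, F))
b2 = np.zeros(F)

def fnn_forecast(window):
    """Predict all F feature values at time t from the W preceding steps."""
    x = window.reshape(-1)            # flatten (W, F) -> (W*F,)
    h = np.maximum(0.0, x @ W1 + b1)  # ReLU hidden layer
    return h @ W2 + b2                # one predicted value per feature

series = rng.normal(size=(100, F))           # synthetic multivariate series
pred_t = fnn_forecast(series[50 - W:50])     # forecast step t = 50
anomaly_score = np.linalg.norm(series[50] - pred_t)  # prediction error
print(pred_t.shape, anomaly_score)
```

A large prediction error at a time step is then treated as evidence of an anomaly, which is the usual scoring scheme in prediction-based detectors.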
“…With its outstanding performance in sequence modeling and prediction tasks, numerous anomaly detection methods based on Transformer [9] have been proposed in recent years [10,11]. Wiederer incorporated an external attention mechanism on top of self-attention to model the correlation among multivariate time-series, and proposed a regularization-based method to constrain model parameters and prevent overfitting [12].…”
Section: Introduction
confidence: 99%
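The citing work mentions adding an external attention mechanism on top of self-attention. External attention replaces the sequence-derived keys and values with small learnable memory units shared across inputs; a hedged numpy sketch of the forward pass (sizes, memory matrices `Mk`/`Mv`, and the double-normalization step follow the generic external-attention recipe, not the cited paper's exact configuration):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical sizes: T time steps, model dim d, S external memory slots.
T, d, S = 20, 8, 16
Mk = rng.normal(0, 0.1, (S, d))  # learnable external key memory
Mv = rng.normal(0, 0.1, (S, d))  # learnable external value memory

def external_attention(x):
    """Attend over a shared external memory instead of the sequence itself."""
    attn = x @ Mk.T                                # (T, S) slot affinities
    attn = np.exp(attn - attn.max(axis=1, keepdims=True))
    attn = attn / attn.sum(axis=1, keepdims=True)  # softmax over memory slots
    attn = attn / (attn.sum(axis=0, keepdims=True) + 1e-9)  # double normalization
    return attn @ Mv                               # (T, d) output

x = rng.normal(size=(T, d))
out = external_attention(x)
print(out.shape)  # (20, 8)
```

Because `Mk` and `Mv` are shared across all inputs, this mechanism can capture correlations across samples at linear cost in sequence length, which is one motivation for layering it onto self-attention in multivariate settings.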