2020 IEEE International Conference on Big Data (Big Data)
DOI: 10.1109/bigdata50022.2020.9378408

GLIMA: Global and Local Time Series Imputation with Multi-directional Attention Learning

Cited by 9 publications (8 citation statements)
References 17 publications
“…We demonstrate that CSDI also provides accurate deterministic imputations, which are obtained as the median of 100 generated samples. We compare CSDI with four baselines developed for deterministic imputation including GLIMA [20], which combined recurrent imputations with an attention mechanism to capture temporal and feature dependencies and showed the state-of-the-art performance. These methods are based on autoregressive models.…”
Section: Results of Deterministic Imputation
Confidence: 99%
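The excerpt above describes obtaining a deterministic imputation from a probabilistic imputer by taking the median over 100 generated samples. A minimal sketch of that aggregation step, assuming a stand-in sampler (`sample_imputation` below is hypothetical, not CSDI's actual API):

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_imputation(x, mask, rng):
    # Hypothetical sampler: fills missing entries (mask == 0) with noisy
    # draws around the observed mean; a real model such as CSDI would
    # instead sample from its learned conditional distribution.
    fill = x[mask == 1].mean() + rng.normal(scale=0.1, size=x.shape)
    return np.where(mask == 1, x, fill)

x = np.array([1.0, 2.0, np.nan, 4.0])   # one missing value
mask = np.array([1, 1, 0, 1])           # 1 = observed, 0 = missing

# Deterministic imputation = element-wise median of 100 sampled imputations.
samples = np.stack([sample_imputation(x, mask, rng) for _ in range(100)])
deterministic = np.median(samples, axis=0)
```

Observed entries pass through unchanged; only the missing positions are summarized by the median, which is robust to outlying samples.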
“…Subsequent studies combined RNNs with other methods to improve imputation performance, such as GANs [9,17,18] and self-training [19]. Among them, the combination of RNNs with attention mechanisms is particularly successful for imputation and interpolation of time series [20,21]. While these methods focused on deterministic imputation, GP-VAE [10] has been recently developed as a probabilistic imputation method.…”
Section: Related Work
Confidence: 99%
“…Despite the prevalence of the self-attention mechanism in many domains, its application to time-series imputation is still being explored. Suo et al [32] propose an imputation framework that learns the distant correlations across time and the dependencies of multivariate time series locally and globally using a multi-dimensional attention mechanism. Du et al [33] propose the method of a self-attention-based mechanism for missing values in multivariate time series (SAITS).…”
Section: Time-Series Imputation via Neural Networks
Confidence: 99%