2022
DOI: 10.1109/lsp.2022.3224880
Multivariate Time Series Imputation With Transformers

Abstract: Processing time series with missing segments is a fundamental challenge that obstructs advanced analysis in disciplines such as engineering, medicine, and economics. One remedy is imputation: filling in the missing values from the observed ones without undermining downstream performance. We propose Multivariate Time-Series Imputation with Transformers (MTSIT), a novel method that uses the transformer architecture in an unsupervised manner for missing-value imputation. Unlike the existing t…
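The abstract describes unsupervised imputation with a transformer: observed entries are artificially masked, a self-attention model reconstructs them, and the loss is measured only at the masked positions. A minimal NumPy sketch of that training signal (not the authors' MTSIT code; the projections here are random stand-ins for learned parameters):

```python
import numpy as np

rng = np.random.default_rng(0)

def self_attention(x, wq, wk, wv):
    """Single-head scaled dot-product self-attention over time steps."""
    q, k, v = x @ wq, x @ wk, x @ wv
    scores = q @ k.T / np.sqrt(k.shape[-1])
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v

T, D = 8, 4                          # time steps, variables
x = rng.normal(size=(T, D))          # a toy multivariate series

# Artificially mask a few observed entries (deterministic for the demo).
mask = np.zeros((T, D), dtype=bool)
mask[1, 2] = mask[4, 0] = mask[6, 3] = True
x_in = np.where(mask, 0.0, x)        # zero out the "missing" values

# Randomly initialised projections stand in for learned parameters.
wq, wk, wv = (rng.normal(scale=0.1, size=(D, D)) for _ in range(3))
recon = self_attention(x_in, wq, wk, wv)

# Training would minimise the reconstruction error only where we masked:
masked_mse = np.mean((recon[mask] - x[mask]) ** 2)
```

In an actual model the attention layer would be a full transformer encoder and the projections would be optimised by gradient descent; the key point is that the objective is computed only at the artificially masked positions, which requires no labels.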

Cited by 23 publications (6 citation statements)
References 34 publications (31 reference statements)
“…On the other hand, considering that related works used other datasets, and each dataset presents different characteristics, the comparison is carried out only for reference. According to Table 9, in terms of RMSE the proposal is only below the work [20], which obtained an RMSE of 3.756 µg/m³; in terms of R², the proposal with R² of 0.6946 is below the work [22], which reported an R² of 0.895; and in terms of MAE, the proposal with MAE = 3.4944 µg/m³ outperformed the work [21] with MAE = 8.31 µg/m³.…”
Section: Discussion
confidence: 86%
“…• BRITS: BRITS [21] imputes missing values using a bidirectional RNN and considers correlation among different missing values. • MTSIT: MTSIT [6] is a transformer-based imputation model trained in an unsupervised scheme for multivariate time-series imputation. • SAITS: SAITS [10] is based on a multi-head self-attention module and imputes the missing values using weighted combination blocks.…”
Section: B. Baseline Methods
confidence: 99%
“…SAITS [10] proposed a diagonally-masked self-attention block to generate missing values effectively. MTSIT [6] proposed a way to train a transformer architecture in an unsupervised manner. Imputation methods using generative models [13], [16], [17], [22], [35] have also made significant advances. GAN [36] guides a model to learn the distribution of the original data and generate data with a similar distribution, thereby allowing the model to learn imputation.…”
Section: Related Work
confidence: 99%
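The diagonal masking attributed to SAITS above can be illustrated in a few lines: setting the attention-score diagonal to negative infinity before the softmax means each time step receives zero attention weight on itself, so its value must be reconstructed from the other steps. A hedged NumPy illustration (toy scores, not the SAITS implementation):

```python
import numpy as np

rng = np.random.default_rng(1)
T, D = 6, 3
x = rng.normal(size=(T, D))

scores = (x @ x.T) / np.sqrt(D)      # toy attention scores (T x T)
np.fill_diagonal(scores, -np.inf)    # diagonal mask: no self-attention

# Row-wise softmax; exp(-inf) = 0, so diagonal weights vanish exactly.
weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
weights /= weights.sum(axis=-1, keepdims=True)
```

After the softmax, `np.diag(weights)` is all zeros while each row still sums to one, which is precisely the property that forces cross-step reconstruction.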
“…To ensure a fair comparison, we maintained the hyperparameters recommended in the source papers if they used the same dataset as ours; otherwise, we kept them consistent with the hyperparameters of our model. weighted average of its neighbors; (4) MRNN [33]: using a multidirectional recurrent neural network to interpolate missing values and estimate them across the data stream; (5) BRITS [3]: this method uses a bidirectional LSTM with history regression and feature regression to estimate missing values; (6) Transformer: using the Transformer's encoder for missing-value estimation; (7) MTSIT [32]: using learnable position encoding and the Transformer's encoder for missing-value estimation; (8) SAITS [7]: using two diagonally-masked multi-head attention modules for joint reconstruction.…”
Section: Counterparts
confidence: 99%
“…Transformer-based methods [15, 30] mainly focus on the use and improvement of self-attention mechanisms [22, 28]. [32] proposed an unsupervised autoencoder model named MTSIT based on the Transformer, which jointly reconstructs and imputes multivariate time series using unlabeled data. Regarding spatiotemporal data [19], Cross-Dimensional Self-Attention (CDSA) [18] was proposed, an effective imputation method that not only captures temporal dependencies but also leverages the geographic relationships among sensors to fill in missing values in time-series data.…”
Section: Introduction
confidence: 99%