ICASSP 2021 - 2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
DOI: 10.1109/icassp39728.2021.9413914

Spatiotemporal Attention for Multivariate Time Series Prediction and Interpretation

Abstract: Multivariate time series modeling and prediction problems are abundant in many machine learning application domains. Accurate interpretation of the prediction outcomes from the model can significantly benefit the domain experts. In addition to isolating the important time-steps, spatial interpretation is also critical to understand the contributions of different variables on the model output. We propose a novel deep learning architecture, called spatiotemporal attention mechanism (STAM) for simultaneous learni…
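The abstract describes two attention components: spatial attention over the input variables and temporal attention over the time steps, both yielding interpretable weights. The sketch below illustrates that general idea in PyTorch; it is a minimal illustration under assumed layer shapes and dimensions, not the authors' STAM architecture.

```python
# Minimal sketch of separate spatial (per-variable) and temporal (per-step)
# attention for multivariate time series, loosely following the idea in the
# abstract. NOT the authors' STAM implementation; layers and shapes are assumed.
import torch
import torch.nn as nn


class SpatioTemporalAttention(nn.Module):
    def __init__(self, n_vars: int, hidden: int = 64):
        super().__init__()
        self.spatial_score = nn.Linear(n_vars, n_vars)   # per-step scores for each variable
        self.encoder = nn.LSTM(n_vars, hidden, batch_first=True)
        self.temporal_score = nn.Linear(hidden, 1)        # score for each time step
        self.out = nn.Linear(hidden, 1)

    def forward(self, x):                                            # x: (batch, T, n_vars)
        # Spatial attention: softmax over variables at every time step.
        alpha = torch.softmax(self.spatial_score(x), dim=-1)         # (B, T, V)
        h, _ = self.encoder(alpha * x)                               # (B, T, hidden)

        # Temporal attention: softmax over time steps of the encoded sequence.
        beta = torch.softmax(self.temporal_score(h).squeeze(-1), dim=-1)  # (B, T)
        context = (beta.unsqueeze(-1) * h).sum(dim=1)                     # (B, hidden)

        return self.out(context), alpha, beta    # prediction plus interpretable weights
```

Calling the module on a tensor of shape (batch, time steps, variables) returns the forecast together with per-variable weights alpha and per-step weights beta, which is the kind of spatial and temporal interpretation the abstract refers to.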

Cited by 39 publications (26 citation statements)
References 27 publications

Citation statements (ordered by relevance):
“…To further verify the performance of the model, this paper compares DMAFD with the following baseline models: ARMA [24], a CNN-based model [42], an RNN-based model [27], AWTM [16], and STAM [43]. ARMA considers the dependence of the time series and the interference of random fluctuations, and is a classic statistical method for time series problems. The CNN-based model obtains an effective representation of the original data through convolution and pooling operations, and is a neural network model commonly used in the field of time series prediction.…”
Section: Baseline Comparison (mentioning)
confidence: 99%
“…To further verify the performance of the model, this paper compares DMAFD with the following baseline models: ARMA [24], a CNN-based model [42], an RNN-based model [27], AWTM [16], and STAM [43]. ARMA considers the dependence of the time series and the interference of random fluctuations, and is a classic statistical method for time series problems.…”
Section: Experiments and Analysis (mentioning)
confidence: 99%
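The excerpts above describe ARMA as a classic statistical baseline that models both the serial dependence of a series and random fluctuations. A minimal sketch of such a baseline using statsmodels is shown below; the package choice, the ARMA(2, 1) order, and the toy data are assumptions for illustration, not details taken from the cited work.

```python
# Minimal ARMA baseline sketch: the AR part models dependence on past values,
# the MA part models random shocks. Order and data are illustrative only.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)
noise = rng.normal(size=200)
series = np.zeros(200)
for t in range(1, 200):
    series[t] = 0.6 * series[t - 1] + noise[t]   # toy stationary AR(1) series

# ARMA(p, q) is ARIMA(p, 0, q): p autoregressive lags, q moving-average lags.
model = ARIMA(series, order=(2, 0, 1)).fit()
forecast = model.forecast(steps=10)              # 10-step-ahead prediction
print(forecast)
```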
“…Compressing all information from the input time-steps into a fixed-length single vector was the major bottleneck of the encoder-decoder model. Temporal attention can be applied for many-to-many time series prediction [24] and many-to-one prediction [39,40]. The proposed approach (Fig.…”
Section: Model Development (mentioning)
confidence: 99%
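The bottleneck described in this excerpt is the single fixed-length vector into which a plain encoder-decoder must squeeze the whole input window; temporal attention instead lets each prediction step draw on a weighted combination of all encoder states. Below is a minimal many-to-many sketch of that idea, a generic illustration with assumed dot-product scoring and layer sizes, not the cited papers' exact models.

```python
# Sketch of a many-to-many forecaster in which each decoder step attends over
# all encoder states instead of a single fixed-length summary vector.
# Generic illustration; scoring function and layer sizes are assumptions.
import torch
import torch.nn as nn


class AttentiveSeq2Seq(nn.Module):
    def __init__(self, n_vars: int, horizon: int, hidden: int = 64):
        super().__init__()
        self.horizon = horizon
        self.encoder = nn.LSTM(n_vars, hidden, batch_first=True)
        self.decoder_cell = nn.LSTMCell(hidden, hidden)
        self.out = nn.Linear(hidden, 1)

    def forward(self, x):                                  # x: (batch, T, n_vars)
        enc, (h, c) = self.encoder(x)                      # enc: (B, T, hidden)
        h, c = h[-1], c[-1]                                # last-layer decoder init
        preds = []
        for _ in range(self.horizon):
            # Dot-product attention of the decoder state over every encoder state.
            scores = torch.softmax((enc * h.unsqueeze(1)).sum(-1), dim=-1)  # (B, T)
            context = (scores.unsqueeze(-1) * enc).sum(dim=1)               # (B, hidden)
            h, c = self.decoder_cell(context, (h, c))
            preds.append(self.out(h))
        return torch.stack(preds, dim=1)                   # (B, horizon, 1)
```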
“…LSTMs have shown state-of-the-art results in various applications including off-line handwriting recognition [20], natural language processing [21], and engineering systems [22]. LSTMs have also been used effectively for multivariate time series prediction tasks [23-25]. Considering the importance of climate extremes for agricultural predictions, random forest has been utilized to predict grid-cell anomalies, i.e., deviations of yields [26].…”
Section: Introduction (mentioning)
confidence: 99%
“…Recently, machine learning (ML) has become one of the most powerful tools in the field of multivariate multi-step time series prediction (Hochreiter and Schmidhuber, 1997; Geurts et al., 2006; Sapankevych and Sankar, 2009; Box et al., 2015; Hu and Zheng, 2020; Nian et al., 2021c). Deep learning can be regarded as one of the hottest topics in this context, and many emerging and advanced algorithms have been put forward and made progress (Hinton and Salakhutdinov, 2006; Krizhevsky et al., 2012; Goodfellow et al., 2014; He et al., 2016; Huang et al., 2017; Wan et al., 2019), such as the Recurrent Neural Network (RNN) (Elman, 1990; Lipton et al., 2015; Braakmann-Folgmann et al., 2017; Qin et al., 2017) and Long Short-Term Memory (LSTM) (Kalchbrenner et al., 2015; Shi et al., 2015; Greff et al., 2016; Zhang et al., 2017; Shi and Yeung, 2018; Wang et al., 2021; Gangopadhyay et al., 2021; Nian et al., 2021b). In the future, we expect to establish a comprehensive predictive model of mesoscale eddy trajectories toward meridional ridges on a global scale via deep learning, coupled with topographic effects.…”
Section: Introduction (mentioning)
confidence: 99%