Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining 2020
DOI: 10.1145/3394486.3403118
Connecting the Dots: Multivariate Time Series Forecasting with Graph Neural Networks

Abstract: Modeling multivariate time series has long been a subject that has attracted researchers from a diverse range of fields including economics, finance, and traffic. A basic assumption behind multivariate time series forecasting is that its variables depend on one another but, upon looking closely, it is fair to say that existing methods fail to fully exploit latent spatial dependencies between pairs of variables. In recent years, meanwhile, graph neural networks (GNNs) have shown high capability in handling rela…


Cited by 730 publications (419 citation statements) · References 16 publications
“…If the model can automatically extract the contact pattern among multiple age groups, the latent dependencies of multivariate time series can be combined to improve the prediction. Our study adopts the adaptive graph learning module [29], which is illustrated as follows: $$M_1 = \tanh(\alpha E_1 \Theta_1), \qquad M_2 = \tanh(\alpha E_2 \Theta_2),$$ $$A = \mathrm{ReLU}\left(\tanh\left(\alpha\left(M_1 M_2^{\top} - M_2 M_1^{\top}\right)\right)\right),$$ where $E_1, E_2$ represent randomly initialized node embeddings, which are learnable during training; $\Theta_1, \Theta_2$ are model parameters; $\alpha$ is a hyper-parameter controlling the saturation rate of the activation function. The adaptive graph learning module measures the similarity between the embeddings of each node and computes the adjacency matrix $A$ of the directed graph.…”
Section: Methods
confidence: 99%
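The adaptive graph learning module quoted above (from MTGNN [29]) can be sketched numerically. This is a minimal NumPy sketch, not the authors' implementation: the shapes, the random seed, and the use of fixed arrays in place of learnable parameters are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

n, d = 4, 8      # number of nodes (time series) and embedding size (assumed)
alpha = 3.0      # hyper-parameter controlling the saturation rate of tanh

# Randomly initialized node embeddings E1, E2 (learnable in the real model)
E1 = rng.standard_normal((n, d))
E2 = rng.standard_normal((n, d))
# Model parameters Theta1, Theta2 (also learnable in the real model)
Th1 = rng.standard_normal((d, d))
Th2 = rng.standard_normal((d, d))

# M1 = tanh(alpha * E1 @ Theta1), M2 = tanh(alpha * E2 @ Theta2)
M1 = np.tanh(alpha * E1 @ Th1)
M2 = np.tanh(alpha * E2 @ Th2)

# A = ReLU(tanh(alpha * (M1 M2^T - M2 M1^T)))
# The anti-symmetric score makes A the adjacency matrix of a directed graph:
# for each pair (i, j), at most one of A[i, j], A[j, i] is nonzero.
S = alpha * (M1 @ M2.T - M2 @ M1.T)
A = np.maximum(0.0, np.tanh(S))

print(A.shape)  # (4, 4)
```

Because the score matrix is anti-symmetric, the diagonal of A is zero and edges are uni-directional, which matches the excerpt's description of a directed graph.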
“…There are also several studies that use GNNs to integrate population migration data for infectious disease prediction 25, 26, 27, 28. Unlike RNNs, GNNs process multivariate time series in a graph structure, so the interdependencies between time series can be defined by interaction weights and learned in a data-driven way [29].…”
Section: Introduction
confidence: 99%
“…Inspired by MTGNN [11], we directly transform the original features $X = (x_{t-w+1}, x_{t-w+2}, \ldots, x_t) \in \mathbb{R}^{w \times d}$ into the hidden space…”
Section: Multivariate Feature Interaction Module
confidence: 99%
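The windowed transform described in the excerpt above can be sketched as a learned linear projection of the window $X \in \mathbb{R}^{w \times d}$ into a hidden space. The projection matrix `W` and bias `b` below are hypothetical stand-ins for that paper's hidden-space mapping, and the dimensions are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

w, d, h = 12, 5, 32   # window length, input features, hidden size (assumed)

# Window of the original features X = (x_{t-w+1}, ..., x_t), shape (w, d)
X = rng.standard_normal((w, d))

# Hypothetical learnable projection to the hidden space
W = rng.standard_normal((d, h))
b = np.zeros(h)

# Each time step x_i in the window is mapped independently to R^h
H = X @ W + b

print(H.shape)  # (12, 32)
```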
“…For this, the authors proposed a transformation-gated LSTM (TG-LSTM) and compared the model's performance against nine different recurrent and attention-based models. In recent times, graph neural networks (GNNs) have gained popularity in multivariate time series forecasting, where temporal and latent interdependencies between variables can be represented using graphs, with nodes as variables and edges representing the dependencies between them [31, 32]. Unarguably, generating ahead-of-time forecasts for financial time series is one of the most challenging and widely studied problems in the domain of time series forecasting.…”
Section: Introduction
confidence: 99%