2018
DOI: 10.1007/978-3-030-03991-2_62

Discovering Granger-Causal Features from Deep Learning Networks

Abstract: In this research, we propose deep networks that discover Granger causes from multivariate temporal data generated in financial markets. We introduce a Deep Neural Network (DNN) and a Recurrent Neural Network (RNN) that discover Granger-causal features for bivariate regression on bivariate time series data distributions. These features are subsequently used to discover Granger-causal graphs for multivariate regression on multivariate time series data distributions. Our supervised feature learning process in prop…
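The abstract's DNN/RNN feature-learning pipeline is not reproduced in this report. As background, here is a minimal numpy sketch of the classical bivariate linear Granger test that such deep variants extend: a restricted autoregression of y on its own lags is compared against a full model that also includes lags of x, via the standard F-statistic. All names and the toy data are illustrative, not the authors' method.

```python
import numpy as np

def granger_f_test(x, y, p=2):
    """Classical linear Granger test: does x help predict y?

    Compares a restricted AR(p) model of y against a full model that
    also includes p lags of x, via the standard F-statistic.
    """
    n = len(y)
    # Build lagged design matrices (rows correspond to targets t = p..n-1).
    Y = y[p:]
    lags_y = np.column_stack([y[p - k:n - k] for k in range(1, p + 1)])
    lags_x = np.column_stack([x[p - k:n - k] for k in range(1, p + 1)])
    ones = np.ones((n - p, 1))

    X_r = np.hstack([ones, lags_y])            # restricted: y lags only
    X_f = np.hstack([ones, lags_y, lags_x])    # full: y lags + x lags

    rss_r = np.sum((Y - X_r @ np.linalg.lstsq(X_r, Y, rcond=None)[0]) ** 2)
    rss_f = np.sum((Y - X_f @ np.linalg.lstsq(X_f, Y, rcond=None)[0]) ** 2)

    df_num, df_den = p, (n - p) - X_f.shape[1]
    return ((rss_r - rss_f) / df_num) / (rss_f / df_den)

# Toy data: x drives y with one lag, so F should be large for x -> y.
rng = np.random.default_rng(0)
x = rng.standard_normal(500)
y = np.zeros(500)
for t in range(1, 500):
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + 0.1 * rng.standard_normal()

print(granger_f_test(x, y))   # large: x Granger-causes y
print(granger_f_test(y, x))   # small: no causality in the reverse direction
```

The nonlinear NN-based estimators discussed in the citation statements below replace the two linear regressions with learned models, but keep this restricted-versus-full comparison at their core.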

Cited by 8 publications (5 citation statements)
References 9 publications
“…The AUC was employed as a performance metric. Since SS-GC is an explicitly multi-scale method [31], SS-GC estimations were repeated at 18 different scales (1–18). In this paper, all AUC values presented for SS-GC are the highest value achieved among all scales.…”
Section: (ii) Node-wise Duffing Oscillators
confidence: 99%
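The quoted evaluation protocol (repeat the estimation at 18 scales, report the best AUC across scales) can be sketched as follows. The per-scale link scores are synthetic stand-ins, and `roc_auc` is a minimal rank-based implementation (the Mann-Whitney U formulation), not the authors' code:

```python
import numpy as np

def roc_auc(labels, scores):
    """AUC via the Mann-Whitney U statistic (rank formulation, tie-free scores)."""
    order = np.argsort(scores)
    ranks = np.empty(len(scores))
    ranks[order] = np.arange(1, len(scores) + 1)
    n_pos = labels.sum()
    n_neg = len(labels) - n_pos
    return (ranks[labels == 1].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

# Hypothetical per-scale coupling scores for the same set of true/false links;
# following the quoted protocol, report the best AUC across the 18 scales.
rng = np.random.default_rng(1)
labels = rng.integers(0, 2, size=200)            # 1 = true causal link
scale_scores = [labels * s / 18.0 + rng.standard_normal(200)
                for s in range(1, 19)]           # separability grows with scale
best_auc = max(roc_auc(labels, s) for s in scale_scores)
print(best_auc)
```

Taking the maximum over scales is an optimistic summary, which is presumably why the quoted papers state the convention explicitly.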
“…For example, multi-layer perceptrons [11] or NNs with non-uniform embeddings [12] have been used to introduce nonlinear estimation capabilities which also include 'extended' GC [13], GC for categorical time series [14] and wavelet-based approaches [15]. Also, recent preliminary work has employed deep learning to estimate bivariate GC interactions [16], convolutional NNs to reconstruct temporal causal graphs [17] or recurrent NN (RNN) with a sparsity-inducing penalty term to improve parameter interpretability [18,19]. While RNNs provide flexibility and a generally vast modelling capability, RNN training can prove complex and their employment in real-world data, where data paucity is often an issue, may prove impractical and/or unstable.…”
Section: Introduction
confidence: 99%
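The sparsity-penalized RNNs cited above as [18, 19] are beyond a short snippet, but the underlying idea, that an L1 penalty drives the lag coefficients of non-causal drivers exactly to zero so the surviving weights are interpretable as Granger-causal links, can be illustrated with a plain linear lasso stand-in. The data and all names here are synthetic and hypothetical:

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=200):
    """Plain coordinate-descent lasso for 0.5*||y - Xw||^2 + lam*||w||_1."""
    n, d = X.shape
    w = np.zeros(d)
    col_sq = (X ** 2).sum(axis=0)
    for _ in range(n_iter):
        for j in range(d):
            r = y - X @ w + X[:, j] * w[j]        # residual excluding feature j
            rho = X[:, j] @ r
            w[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
    return w

# Three candidate drivers; only series 0 actually drives the target.
rng = np.random.default_rng(2)
Z = rng.standard_normal((400, 3))
target = np.roll(Z[:, 0], 1) * 0.9 + 0.1 * rng.standard_normal(400)
X = Z[:-1]                        # one lag of each candidate driver
y = target[1:]
w = lasso_cd(X, y, lam=20.0)
print(np.abs(w) > 1e-3)           # only the true driver keeps a nonzero weight
```

The cited RNN approaches apply the same penalty to input weights of a nonlinear sequence model, trading this closed-form sparsity for the representational power (and the training fragility) the quoted passage warns about.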
“…The area under the ROC curve (AUC) was employed as a performance metric. Additionally, since SS-GC is an explicitly multi-scale method [28], SS-GC estimations were repeated at 18 different scales (1–18). In this paper, all AUC values presented for SS-GC are the highest value achieved amongst all scales.…”
Section: Synthetic Validation of ES-GC and Comparison to Other Est…
confidence: 99%
“…Neural networks like multi-layer perceptrons [11] or neural networks with non-uniform embeddings [12] have been used to introduce nonlinear estimation capabilities which also include "extended" GC [13] and wavelet-based approaches [14]. Also, recent preliminary work has employed deep learning to estimate bivariate GC interactions [15], convolutional neural networks to reconstruct temporal causal graphs [16] or Recurrent NN (RNN) with a sparsity-inducing penalty term to improve parameter interpretability [17], [18]. While RNNs provide flexibility and a generally vast modelling capability, RNN training can prove complex and their employment in real-world data, where data paucity is often an issue, may prove impractical and/or unstable.…”
Section: Introduction
confidence: 99%
“…Such approaches should have an advantage in terms of representational power and trainability. Some recent examples of neural network-based measures of GC have been proposed, such as neural network GC (NN-GC) ( Montalto et al, 2015 ), RNN-GC ( Wang et al, 2018 ), Echo State Network GC (ES-GC) ( Duggento et al, 2019 ), and DNN-GC ( Chivukula et al, 2018 ). Accordingly, we propose a new approach for whole-brain analytics based on a vector auto-regressive deep neural network (VARDNN) architecture that can deal with a large number of time series.…”
Section: Introduction
confidence: 99%