2021
DOI: 10.1140/epjds/s13688-021-00277-8

Predicting partially observed processes on temporal networks by Dynamics-Aware Node Embeddings (DyANE)

Abstract: Low-dimensional vector representations of network nodes have proven successful to feed graph data to machine learning algorithms and to improve performance across diverse tasks. Most of the embedding techniques, however, have been developed with the goal of achieving dense, low-dimensional encoding of network structure and patterns. Here, we present a node embedding technique aimed at providing low-dimensional feature vectors that are informative of dynamical processes occurring over temporal networks – rather…

Cited by 5 publications (13 citation statements) · References 35 publications

Citation statements (ordered by relevance):
“…Additionally, in order to reduce the number of nodes that need to be embedded, only active nodes are considered. These are nodes (i, t) where i had at least one contact at time step t. Inactive nodes are deleted and their incoming edges are rerouted to their next active future self, as previously done by Sato et al. [21]. Finally, we converted all 24 datasets into supra-adjacency networks as described.…”
Section: Methods (mentioning)
confidence: 99%
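To make the construction described in the quote concrete, here is a minimal sketch (not the authors' code) of building such a supra-adjacency network from a list of timestamped contacts. It assumes contacts arrive as (i, j, t) triples, keeps only active node-times (i, t) as supra-nodes, and routes edges to each node's next active future copy; the function and variable names are illustrative.

```python
# Minimal sketch of the supra-adjacency construction described in the quote:
# keep only active node-times (i, t) and reroute edges to the next active
# future copy of the target node. Contact format and names are assumptions.
from collections import defaultdict
import networkx as nx

def build_supra_adjacency(contacts):
    """contacts: iterable of undirected contact events (i, j, t)."""
    # Active times per node: the time steps at which the node has >= 1 contact.
    active = defaultdict(set)
    for i, j, t in contacts:
        active[i].add(t)
        active[j].add(t)
    active = {v: sorted(ts) for v, ts in active.items()}

    def next_active(v, t):
        """First active time of v strictly after t, or None if there is none."""
        return next((s for s in active[v] if s > t), None)

    G = nx.DiGraph()
    G.add_nodes_from((v, t) for v, ts in active.items() for t in ts)
    # Self-coupling edges between consecutive active copies of the same node.
    for v, ts in active.items():
        for t_cur, t_nxt in zip(ts, ts[1:]):
            G.add_edge((v, t_cur), (v, t_nxt))
    # Contact edges, rerouted to the neighbour's next active future self.
    for i, j, t in contacts:
        for u, v in ((i, j), (j, i)):
            t_nxt = next_active(v, t)
            if t_nxt is not None:
                G.add_edge((u, t), (v, t_nxt))
    return G

# Toy example: three contacts over two time steps.
contacts = [("a", "b", 1), ("b", "c", 2), ("a", "c", 2)]
supra = build_supra_adjacency(contacts)
print(sorted(supra.edges()))
```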
“…We compare our approach with several baseline methods from the literature of time-varying graph embeddings, which learn time-stamped node representations: (1) DyANE [8], which learns temporal node embeddings with DeepWalk, mapping a time-varying graph into a supra-adjacency representation; (2) DynGEM [36], a deep autoencoder architecture which dynamically reconstructs each graph snapshot, initializing model weights with parameters learned in previous time frames; (3) DynamicTriad [47], which captures structural information and temporal patterns of nodes, modeling the triadic closure process; (4) DySAT [45], a deep neural model that computes node embeddings by a joint self-attention mechanism applied on structural neighborhood and temporal dynamics; (5) ISGNS [39], an incremental skip-gram embedding model based on DeepWalk. Details about hyper-parameters used in each method can be found in Additional file 1.…”
Section: Methods (mentioning)
confidence: 99%
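For the DyANE baseline in point (1), the embedding step amounts to running DeepWalk over the supra-adjacency graph so that every active node-time (i, t) receives its own vector. The sketch below (building on the supra graph constructed in the previous sketch) uses plain random walks plus gensim's skip-gram Word2Vec as a stand-in; walk counts, walk length, and dimensionality are illustrative choices, not the paper's settings.

```python
# Sketch of a DeepWalk-style embedding of the supra-adjacency graph: sample
# directed random walks over supra-nodes (i, t), then train skip-gram on the
# walk corpus. Hyper-parameter values here are illustrative only.
import random
from gensim.models import Word2Vec

def deepwalk_embeddings(supra, num_walks=10, walk_length=20, dim=64, seed=0):
    rng = random.Random(seed)
    walks = []
    nodes = list(supra.nodes())
    for _ in range(num_walks):
        rng.shuffle(nodes)
        for start in nodes:
            walk, cur = [start], start
            for _ in range(walk_length - 1):
                nbrs = list(supra.successors(cur))
                if not nbrs:
                    break
                cur = rng.choice(nbrs)
                walk.append(cur)
            walks.append([str(n) for n in walk])  # Word2Vec expects string tokens
    model = Word2Vec(walks, vector_size=dim, window=5, min_count=1, sg=1, seed=seed)
    return {n: model.wv[str(n)] for n in supra.nodes()}

embeddings = deepwalk_embeddings(supra)  # `supra` from the previous sketch
```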
“…We simulated 30 realizations of the SIR process on top of each empirical graph with different combinations of parameters. We used similar combinations of epidemic parameters and the same dynamical process to produce SIR states as described in [8]. Then we set up a logistic regression to classify the epidemic states S-I-R assigned to each active node during the unfolding of the spreading process.…”
Section: Methods (mentioning)
confidence: 99%
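The sketch below mirrors that evaluation under stated assumptions: a single discrete-time SIR realization (rather than 30) is run on the toy contact list from the earlier sketches, each active node-time (i, t) is labelled with its S/I/R state, and a scikit-learn logistic regression is fitted on the corresponding embedding vectors. The transmission and recovery probabilities beta and mu are illustrative, not the values used in [8].

```python
# Sketch of the evaluation loop: simulate discrete-time SIR on the contact
# sequence, label active node-times with their epidemic state, and classify
# those states from the embeddings with logistic regression.
import random
from collections import defaultdict
import numpy as np
from sklearn.linear_model import LogisticRegression

def simulate_sir(contacts, beta=0.2, mu=0.05, seed=0):
    rng = random.Random(seed)
    by_time, nodes = defaultdict(list), set()
    for i, j, t in contacts:
        by_time[t].append((i, j))
        nodes.update((i, j))
    state = {v: "S" for v in nodes}
    state[rng.choice(sorted(nodes))] = "I"   # one randomly chosen seed node
    labels = {}                              # (i, t) -> epidemic state at time t
    for t in sorted(by_time):
        new_inf = set()
        for i, j in by_time[t]:
            for u, v in ((i, j), (j, i)):
                if state[u] == "I" and state[v] == "S" and rng.random() < beta:
                    new_inf.add(v)
        for i, j in by_time[t]:              # record states of active nodes
            labels[(i, t)] = state[i]
            labels[(j, t)] = state[j]
        for v in new_inf:
            state[v] = "I"
        for v in sorted(nodes):              # recovery step
            if state[v] == "I" and v not in new_inf and rng.random() < mu:
                state[v] = "R"
    return labels

labels = simulate_sir(contacts)                    # `contacts` from the first sketch
X = np.array([embeddings[nt] for nt in labels])    # `embeddings` from the previous sketch
y = [labels[nt] for nt in labels]
clf = LogisticRegression(max_iter=1000).fit(X, y)
```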