Proceedings of the Twenty-Eighth International Joint Conference on Artificial Intelligence 2019
DOI: 10.24963/ijcai.2019/640
Node Embedding over Temporal Graphs

Abstract: In this work, we present a method for node embedding in temporal graphs. We propose an algorithm that learns the evolution of a temporal graph's nodes and edges over time and incorporates these dynamics into a temporal node embedding framework for different graph prediction tasks. We present a joint loss function that creates a temporal embedding of a node by learning to combine its historical temporal embeddings, such that it is optimized for a given task (e.g., link prediction). The algorithm is initialized using st…
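The abstract's core idea, combining a node's historical embeddings under a task-specific joint loss, can be sketched as follows. This is a minimal illustration, not the authors' released code: it assumes an LSTM combiner over per-snapshot static embeddings and a simple two-layer head for link prediction; the dimensions and head architecture are assumptions.

    import torch
    import torch.nn as nn

    class TemporalNodeEmbed(nn.Module):
        def __init__(self, static_dim=128, hidden_dim=128):
            super().__init__()
            # Consumes the sequence of a node's historical (static) embeddings.
            self.lstm = nn.LSTM(static_dim, hidden_dim, batch_first=True)
            # Task head (assumed): scores a node pair for link prediction.
            self.head = nn.Sequential(
                nn.Linear(2 * hidden_dim, hidden_dim), nn.ReLU(),
                nn.Linear(hidden_dim, 1))

        def embed(self, snapshots):            # snapshots: (batch, T, static_dim)
            _, (h, _) = self.lstm(snapshots)
            return h[-1]                        # final state = temporal embedding

        def forward(self, snaps_u, snaps_v):
            z = torch.cat([self.embed(snaps_u), self.embed(snaps_v)], dim=-1)
            return self.head(z).squeeze(-1)     # logit that edge (u, v) exists

    model = TemporalNodeEmbed()
    loss_fn = nn.BCEWithLogitsLoss()            # trained jointly with the task

Training this end-to-end makes the combination of historical embeddings itself task-driven, which is the point of the joint loss the abstract describes.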

Cited by 115 publications (88 citation statements) | References 38 publications
“…CTDNE [31] is a temporal network embedding algorithm based on random walks. We also choose tNodeEmbed [41] as a baseline model, which uses a different learning model from the above models. It learns temporal network embedding by combining static node embeddings learned from Node2Vec with Recurrent Neural Networks.…”
Section: Baseline Algorithms (mentioning)
confidence: 99%
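For context on the CTDNE baseline named above: its walks must respect time, i.e., each successive edge in a walk carries a timestamp no earlier than the previous one. A rough sketch of such a time-respecting walk, assuming the graph is stored as a dict mapping each node to a list of (neighbor, timestamp) pairs (the storage format and function name are assumptions):

    import random

    def temporal_walk(adj, start, length):
        walk, t = [start], float("-inf")
        for _ in range(length - 1):
            # Only edges occurring at or after the current time are eligible.
            nexts = [(v, tv) for v, tv in adj.get(walk[-1], []) if tv >= t]
            if not nexts:
                break                   # dead end: no time-respecting edge left
            v, t = random.choice(nexts)
            walk.append(v)
        return walk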
“…This can be employed for different underlying applications such as link prediction and node classification [13]. Various works such as temporal tensor factorization [11,34], and neural embeddings [14,27,35] have been proposed in this respect. For instance, Singer et al [27] extend the prior neural-based embedding approaches on static graphs, e.g., node2vec [15], to temporal graphs.…”
Section: Temporal Latent Space Modeling (mentioning)
confidence: 99%
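As a generic illustration of the temporal tensor factorization family mentioned above (not the specific models cited as [11, 34]), one can factor a node x node x time adjacency tensor with a CP-style decomposition fit by gradient descent; the rank, step size, and squared-error loss are assumptions:

    import numpy as np

    def cp_factorize(A, rank=16, steps=200, lr=0.01, seed=0):
        # A: adjacency tensor of shape (n, n, T); returns factors U, V, W
        # with A[i, j, t] ~ sum_r U[i, r] * V[j, r] * W[t, r].
        rng = np.random.default_rng(seed)
        n, _, T = A.shape
        U = rng.normal(scale=0.1, size=(n, rank))
        V = rng.normal(scale=0.1, size=(n, rank))
        W = rng.normal(scale=0.1, size=(T, rank))
        for _ in range(steps):
            E = np.einsum('ir,jr,tr->ijt', U, V, W) - A   # reconstruction error
            U -= lr * np.einsum('ijt,jr,tr->ir', E, V, W)
            V -= lr * np.einsum('ijt,ir,tr->jr', E, U, W)
            W -= lr * np.einsum('ijt,ir,jr->tr', E, U, V)
        return U, V, W       # rows of U can serve as temporal node embeddings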
“…Various works such as temporal tensor factorization [11,34], and neural embeddings [14,27,35] have been proposed in this respect. For instance, Singer et al [27] extend the prior neural-based embedding approaches on static graphs, e.g., node2vec [15], to temporal graphs. They propose a semi-supervised algorithm, namely tNodeEmbed, that learns to combine a node's historical temporal embeddings into a final embedding such that it can optimize for a given underlying task, e.g., link prediction.…”
Section: Temporal Latent Space Modeling (mentioning)
confidence: 99%
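One practical detail when initializing from per-snapshot static embeddings, as tNodeEmbed does: each node2vec run produces a space that is only defined up to rotation, so consecutive snapshots are typically aligned before their embeddings are combined. A minimal sketch using orthogonal Procrustes; whether this exactly matches the alignment step used in tNodeEmbed is an assumption here:

    import numpy as np

    def align(Q_prev, Q_curr):
        # Rotation R minimizing ||Q_curr @ R - Q_prev||_F, computed over the
        # nodes shared by both snapshots (rows assumed already matched).
        M = Q_curr.T @ Q_prev
        U, _, Vt = np.linalg.svd(M)
        return Q_curr @ (U @ Vt)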
“…At the base of many methods lies a modification of the standard representation of the temporal network, whether in the form of a list of events, a tensor [9], or a supra-adjacency matrix [30]. All of these methods [31,32,33] commonly aim to solve a node embedding problem by locally sampling the temporal-structural neighbourhood of nodes to create contexts, which they feed to a Skip-Gram learning architecture borrowed from the text representation literature [34]. As a solution, they build a sequence of correlated/updated embeddings of network snapshots, which consider the short-term history of the network backward in time.…”
Section: Introduction (mentioning)
confidence: 99%
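The pipeline this passage describes, walks treated as "sentences" and fed to Skip-Gram, is commonly implemented with an off-the-shelf word2vec. A toy sketch using gensim (the walks here are placeholders; in practice they would come from a sampler such as the temporal_walk sketched earlier):

    from gensim.models import Word2Vec

    walks = [["a", "b", "c"], ["b", "c", "d"]]   # node ids rendered as strings
    model = Word2Vec(walks, vector_size=64, window=5, sg=1, min_count=0)
    vec_a = model.wv["a"]                        # learned embedding of node "a"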