2016
DOI: 10.1109/tkde.2016.2591009

Scalable Temporal Latent Space Inference for Link Prediction in Dynamic Social Networks

Abstract: We propose a temporal latent space model for link prediction in dynamic social networks, where the goal is to predict links over time based on a sequence of previous graph snapshots. The model assumes that each user lies in an unobserved latent space, and that interactions are more likely to occur between users who are close in the latent space representation. In addition, the model allows each user to gradually move its position in the latent space as the network structure evolves over time. We present a global…
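The idea sketched in the abstract (users as points in a latent space, link likelihood growing with latent similarity, positions drifting smoothly across snapshots) can be illustrated with a small gradient-descent objective: fit one position matrix Z_t per snapshot, minimizing a reconstruction loss plus a temporal smoothness penalty. This is a minimal sketch under assumed choices (Frobenius loss, inner-product similarity, quadratic smoothness term), not the paper's actual inference algorithm; the function names and hyperparameters are hypothetical.

```python
import numpy as np

def fit_temporal_latent_space(snapshots, d=2, lam=0.5, lr=0.01, iters=500, seed=0):
    """Fit one latent position matrix Z_t per adjacency snapshot A_t by
    gradient descent on
        sum_t ||A_t - Z_t Z_t^T||_F^2 + lam * sum_t ||Z_t - Z_{t-1}||_F^2.
    Illustrative only -- not the paper's scalable inference procedure."""
    rng = np.random.default_rng(seed)
    n = snapshots[0].shape[0]
    Zs = [rng.standard_normal((n, d)) * 0.1 for _ in snapshots]
    for _ in range(iters):
        for t, A in enumerate(snapshots):
            Z = Zs[t]
            # Gradient of the reconstruction term ||A - Z Z^T||_F^2.
            grad = 4 * (Z @ Z.T - A) @ Z
            # Smoothness terms tie each Z_t to its temporal neighbors.
            if t > 0:
                grad += 2 * lam * (Z - Zs[t - 1])
            if t < len(snapshots) - 1:
                grad += 2 * lam * (Z - Zs[t + 1])
            Zs[t] = Z - lr * grad
    return Zs

def link_scores(Z):
    """Score every node pair by latent-space similarity (inner product)."""
    return Z @ Z.T
```

To predict links for the next (unseen) snapshot, one would score pairs with the most recent positions, e.g. `link_scores(Zs[-1])`, and rank the non-edges by score: a pair that was repeatedly connected in past snapshots ends up close in the latent space and scores higher than a pair that never interacted.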

Cited by 237 publications (159 citation statements)
References 42 publications
“…To validate the effectiveness of our E-LSTM-D model, we compare it with node2vec, as a widely used baseline network embedding method, as well as four state-of-the-art DNLP methods that could handle time dependencies, including Temporal Network Embedding (TNE) [41], conditional temporal RBM (ctRBM) [29], Gradient boosting decision tree based Temporal RBM (GTRBM) [42] and Deep Dynamic Network Embedding (DDNE) [31]. In particular, the five baselines are introduced as follows.…”
Section: B. Baseline Methods
confidence: 99%
“…Others studied network evolution [24], knowledge graph dynamics [47], and information cascades on Facebook and Twitter [6,23]. Temporal graph behavior has been studied in several directions: some works [22,27,29,58] focused on temporal prediction problems where the input is a graph. They studied numerous deep-learning approaches for representing an entire graph and minimizing a loss function for a specific prediction task.…”
Section: Related Work
confidence: 99%
“…We vary α from {0.001, 0.01, 0.1}. • TNE [36] is a dynamic network embedding model based on matrix factorization. We set λ with a grid search from {0.01, 0.1, 1}.…”
Section: Baselines
confidence: 99%