2021
DOI: 10.1007/s40747-021-00332-x

Temporal network embedding using graph attention network

Abstract: Graph convolutional network (GCN) has made remarkable progress in learning good representations from graph-structured data. The layer-wise propagation rule of conventional GCN is designed in such a way that the feature aggregation at each node depends on the features of the one-hop neighbouring nodes. Adding an attention layer over the GCN allows the network to assign different importance to the various one-hop neighbours. These methods can capture the properties of static networks, but are not well suited t…
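For context, the conventional GCN layer-wise propagation rule referenced in the abstract is usually written following Kipf and Welling, and the neighbour weighting added by an attention layer follows the GAT coefficients of Veličković et al.; this is the generic textbook notation, not an excerpt from the paper:

H^{(l+1)} = \sigma\left( \tilde{D}^{-1/2} \tilde{A} \tilde{D}^{-1/2} H^{(l)} W^{(l)} \right),
\qquad
\alpha_{ij} = \frac{\exp\left(\mathrm{LeakyReLU}\left(\mathbf{a}^{\top}[\,W h_i \,\|\, W h_j\,]\right)\right)}{\sum_{k \in \mathcal{N}(i)} \exp\left(\mathrm{LeakyReLU}\left(\mathbf{a}^{\top}[\,W h_i \,\|\, W h_k\,]\right)\right)}

where \tilde{A} = A + I is the adjacency matrix with self-loops and \tilde{D} its degree matrix. Each node's new feature vector aggregates only its one-hop neighbours, and the coefficient \alpha_{ij} lets the network weight neighbour j differently when updating node i.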

Cited by 10 publications (7 citation statements)
References 50 publications
“…Finally, Yang et al. (2022a) proposed a hyperbolic temporal graph network on the Poincaré ball model of hyperbolic space; Mohan and Pramod (2021) proposed temporal network embedding using a graph attention network; Jiao et al. (2022) proposed an embedding based on a variational autoencoder able to capture the evolution of temporal networks; and Liu et al. (2022) proposed ConMNCI, which inductively mines local and communal influences. The authors of ConMNCI suggested an aggregator function that combines local and global influences to produce node embeddings at any time and introduced the concept of continuous learning to strengthen inductive learning. Further examples are the continuous-time dynamic network embeddings by Nguyen et al. (2018), the causal anonymous walk representations for temporal network embedding by Makarov et al. (2022), and TempNodeEmb by Abbas et al. (2023).…”
Section: Graph Embedding Algorithms (mentioning)
confidence: 99%
“…Some spatial methods focus on improving model capacity by introducing an attention mechanism to the graph domain, such as the graph attention network (GAT), which adopts a self-attention mechanism to learn the weighting function [4]. Developments of GAT include the dual-primal graph convolutional network (DPGCN) [22], which generalized GAT by using convolutions on both nodes and edges, giving better performance; the temporal graph attention network (TempGAN) [23], which learns node representations from continuous-time temporal graphs; and the hyperbolic graph attention network [24], which learns robust node representations of graphs in hyperbolic spaces. The graph sample and aggregate method (GraphSage) [18], a node-based spatial method, learns node rather than graph embeddings, so it is graph-scale free and can be applied to large or evolving graphs.…”
Section: Related Work (mentioning)
confidence: 99%
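As a concrete illustration of the self-attention weighting described in the statement above, a single-head GAT layer can be sketched in NumPy as a minimal example; the function name, shapes, and toy graph are illustrative assumptions, not code from any of the cited papers:

import numpy as np

def gat_layer(H, A, W, a, leaky_slope=0.2):
    # Minimal single-head GAT layer (illustrative sketch, not the cited code).
    # H: (N, F) node features; A: (N, N) adjacency with self-loops
    # W: (F, Fp) shared linear map; a: (2*Fp,) attention vector
    Z = H @ W
    Fp = Z.shape[1]
    # e_ij = LeakyReLU(a^T [z_i || z_j]) splits into a source and a target term
    src = Z @ a[:Fp]
    dst = Z @ a[Fp:]
    e = src[:, None] + dst[None, :]
    e = np.where(e > 0, e, leaky_slope * e)      # LeakyReLU
    e = np.where(A > 0, e, -np.inf)              # restrict to one-hop neighbours
    alpha = np.exp(e - e.max(axis=1, keepdims=True))
    alpha /= alpha.sum(axis=1, keepdims=True)    # softmax per neighbourhood
    return alpha @ Z                             # attention-weighted aggregation

# Toy usage: a 3-node path graph, mapping 4 input features to 2.
rng = np.random.default_rng(0)
A = np.array([[1., 1., 0.], [1., 1., 1.], [0., 1., 1.]])
H = rng.standard_normal((3, 4))
print(gat_layer(H, A, rng.standard_normal((4, 2)), rng.standard_normal(4)))

Masking non-edges before the softmax is what confines the aggregation to each node's one-hop neighbourhood, matching the GCN locality that the attention coefficients then reweight.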
“…Wang et al. [31] proposed a novel heterogeneous graph neural network based on hierarchical attention, including node-level and semantic-level attention. Mohan and Pramod [32] proposed a temporal graph attention network (TempGAN), which aims to learn representations from continuous-time temporal networks by preserving the temporal proximity between nodes.…”
Section: Graph Attention Network Recommendation (mentioning)
confidence: 99%