2021 IEEE International Conference on Big Knowledge (ICBK) 2021
DOI: 10.1109/ickg52313.2021.00059
Learning Dynamic Preference Structure Embedding From Temporal Networks

Cited by 2 publications (2 citation statements, both published in 2022)
References 19 publications
“…Formally, a graph consisting of n nodes can be represented as G = (A, X), where A ∈ R^(n×n) is the adjacency matrix and X ∈ R^(n×d) is the node feature matrix. To efficiently aggregate node features with adjacency information, graph neural networks (GNNs) [12]–[17] have been developed to learn more powerful representations by incorporating topological structure information. Given a graph consisting of n nodes, GNNs generally follow a message-passing architecture:…”
Section: A Graph Neural Network
confidence: 99%
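The message-passing architecture mentioned in the citation above can be sketched in a few lines. The following is a minimal illustrative example, not the cited paper's actual model: it implements one generic layer in which every node mean-aggregates its neighbours' features via the adjacency matrix, then applies a learned linear map and a ReLU. All names (`message_passing_layer`, `W`) and the toy graph are assumptions for illustration.

```python
import numpy as np

def message_passing_layer(A, X, W):
    """One generic message-passing step: each node aggregates its
    neighbours' features through the adjacency matrix A, then applies
    a linear transform W and a ReLU non-linearity.
    A: (n, n) adjacency matrix, X: (n, d) features, W: (d, d_out) weights.
    """
    A_hat = A + np.eye(A.shape[0])          # add self-loops so a node keeps its own features
    deg = A_hat.sum(axis=1, keepdims=True)  # node degrees (including self-loop)
    messages = (A_hat / deg) @ X            # mean-aggregate neighbour features
    return np.maximum(messages @ W, 0.0)    # linear map + ReLU

# Toy 3-node path graph with 2-dimensional node features
A = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
X = np.eye(3, 2)        # one-hot-ish features for illustration
W = np.eye(2)           # identity "weights" so the aggregation is visible
H = message_passing_layer(A, X, W)
print(H.shape)          # (3, 2): one updated embedding per node
```

Stacking several such layers lets each node's embedding absorb information from progressively larger neighbourhoods, which is the "recursive neighborhood aggregation" the citation below refers to.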
“…The propagation function M can be implemented in various ways [12]–[17]. This recursive neighborhood-aggregation scheme enables the node features to absorb structural information.…”
Section: A Graph Neural Network
confidence: 99%
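To make concrete what "implemented in various ways" means, here are two common choices for the propagation function M, sketched under the assumption that M acts on an adjacency matrix `A` and feature matrix `X`. The function names are illustrative; the GCN-style normalization is one well-known instance, and the cited works [12]–[17] cover many further variants.

```python
import numpy as np

def propagate_sum(A, X):
    """Sum aggregation: each node receives the sum of its neighbours' features."""
    return A @ X

def propagate_gcn(A, X):
    """GCN-style symmetrically normalized aggregation:
    D^{-1/2} (A + I) D^{-1/2} X, which also mixes in each node's own features
    and keeps feature magnitudes stable across degrees."""
    A_hat = A + np.eye(A.shape[0])                 # self-loops
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))  # D^{-1/2} as a vector
    return (d_inv_sqrt[:, None] * A_hat * d_inv_sqrt[None, :]) @ X

# Same toy 3-node path graph as a quick sanity check
A = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
X = np.eye(3)
S = propagate_sum(A, X)   # with X = I, this simply recovers A
G = propagate_gcn(A, X)
```

The two variants trade off differently: sum aggregation preserves degree information (high-degree nodes get larger messages), while the normalized form averages it away, which often stabilizes training in deep stacks.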