2020
DOI: 10.48550/arxiv.2004.13970
Preprint
Directed Graph Convolutional Network

Cited by 29 publications
(43 citation statements)
References 17 publications
“…GraphSAGE [1] aggregates and updates features from neighbors within two hops of the center node. DGCN [8] considers the first- and second-order proximity to aggregate attributes on directed graphs. However, these message aggregators collect information from all neighbors equally, ignoring the relative importance of different neighbor nodes.…”
Section: Graph Neural Network
confidence: 99%
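The DGCN behavior described in this statement — building first- and second-order proximities from a directed adjacency matrix — can be sketched in a few lines. This is a simplified illustration, not DGCN's exact formulation (the paper normalizes its proximity matrices differently), and the function names are hypothetical:

```python
import numpy as np

def first_order_proximity(A):
    """Symmetrized first-order proximity: an edge in either
    direction makes two nodes first-order neighbors."""
    return np.minimum(A + A.T, 1.0)

def second_order_proximity(A):
    """Second-order proximities in the spirit of DGCN: two nodes are
    close when they share in-neighbors (columns of A) or out-neighbors
    (rows of A), measured via row-normalized transition probabilities."""
    P = A / np.maximum(A.sum(axis=1, keepdims=True), 1.0)
    prox_in = P.T @ P    # weight from shared sources pointing at both nodes
    prox_out = P @ P.T   # weight from shared targets both nodes point at
    return prox_in, prox_out

# Toy directed graph: 0 -> 1, 0 -> 2, 1 -> 2
A = np.array([[0., 1., 1.],
              [0., 0., 1.],
              [0., 0., 0.]])
F = first_order_proximity(A)
S_in, S_out = second_order_proximity(A)
```

Both second-order matrices are symmetric by construction, which is what allows DGCN-style models to apply spectral-style convolutions to a directed graph.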
“…Graph neural networks (GNNs) offer effective graph-based techniques for solving a wide range of real-world problems in diverse fields, such as social science [1], physical systems [2,3], protein-protein interaction networks [4], brain neuroscience [5], and knowledge graphs [6]. The power of current GNNs [1,7,8,9,10] is largely due to their message-passing mechanisms. However, the underlying behavior by which messages are spontaneously aggregated over the graph structure remains obscure.…”
Section: Introduction
confidence: 99%
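The message-passing mechanism this statement credits for the power of GNNs can be written very compactly. A minimal sketch of one mean-aggregation round (illustrative only, not any specific paper's layer):

```python
import numpy as np

def message_pass(A, X):
    """One round of message passing: each node averages the features
    of its neighbors, with a self-loop so it keeps its own feature."""
    A_hat = A + np.eye(A.shape[0])           # add self-loops
    deg = A_hat.sum(axis=1, keepdims=True)   # neighborhood sizes
    return (A_hat @ X) / deg                 # mean over each neighborhood

# Two mutually connected nodes exchanging scalar features
A = np.array([[0., 1.],
              [1., 0.]])
X = np.array([[1.],
              [3.]])
H = message_pass(A, X)
```

Here both nodes see the same neighborhood {0, 1}, so after one round both hold the mean feature 2.0; stacking such rounds (with learned transformations between them) is the shared core of the GNNs cited above.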
“…• General GNNs: GCN [17], ChebyNet [8], GAT [33], GraphSAGE (Mean aggregation) [14], APPNP [18], and jumping knowledge networks (GCN+JK, GCN+r) [35], where GCN+r only integrates knowledge from the input features. • Digraph GNNs: DGCN [32], DiGCN and DiGCN-IB [31].…”
Section: Baselines
confidence: 99%
“…In addition, the edges in a CPS graph often have physical directions for propagating information. This causes "feature mismatching" [10,11] when such graphs are modelled by standard GCNs, and several methods [12,13,14] have studied directed-graph modeling as remedies.…”
Section: Introduction
confidence: 99%
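The "feature mismatching" this statement refers to can be seen in a two-node example: symmetrizing a directed adjacency matrix, as standard GCN propagation effectively does, lets features flow against the edge direction. A hypothetical toy illustration (not the cited papers' exact construction):

```python
import numpy as np

# One directed edge: 0 -> 1
A = np.array([[0., 1.],
              [0., 0.]])
X = np.array([[1.],
              [2.]])

# Direction-aware propagation: node 1 receives node 0's feature;
# node 0 receives nothing, since no edge points into it.
directed = A.T @ X

# Symmetrized (GCN-style) propagation: node 0 also picks up node 1's
# feature, even though no edge points at node 0 -- the mismatch.
undirected = (A + A.T) @ X
```

Direction-aware models such as DGCN and DiGCN are designed to avoid exactly this leakage of information against the physical edge direction.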