2019
DOI: 10.1145/3363574
Attention Models in Graphs

Abstract: Graph-structured data arise naturally in many different application domains. By representing data as graphs, we can capture entities (i.e., nodes) as well as their relationships (i.e., edges) with each other. Many useful insights can be derived from graph-structured data, as demonstrated by an ever-growing body of work focused on graph mining. However, in the real world, graphs can be both large, with many complex patterns, and noisy, which can pose a problem for effective graph mining. An effective way to deal …

Cited by 211 publications (101 citation statements)
References 58 publications
“…for learning from relational data, reviewing part of GNNs under a unified framework. Lee et al. [12] conduct a partial survey of GNNs which apply different attention mechanisms. In summary, existing surveys only include some of the GNNs and examine a limited number of works, thereby missing the most recent development of GNNs.…”
Section: Introduction
confidence: 99%
“…Graph Attention Networks. With recent advancements in graph neural networks [43,46,47] and attention mechanisms [18,39], GAT [40] introduces the attention mechanism into network feature aggregation by implicitly prioritizing node neighbors. Several works attempt to extend GAT to dynamic settings: combining recurrent neural networks with GAT [34], combining attention over structural neighborhoods and temporal dynamics [29], and using node-aware attention for user interaction predictions in real-world dynamic graphs [26].…”
Section: Related Work
confidence: 99%
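The aggregation step described in the excerpt above, in which GAT implicitly prioritizes node neighbors, can be made concrete with a short sketch. The snippet below is a minimal, single-head illustration in NumPy following the GAT formulation; the function name gat_layer and the dense-adjacency representation are simplifying assumptions of ours, not the reference implementation.

```python
import numpy as np

def gat_layer(X, A, W, a, leaky_slope=0.2):
    """Single-head GAT-style attention aggregation (illustrative sketch).

    X: (N, F) node features; A: (N, N) binary adjacency;
    W: (F, F') shared linear transform; a: (2*F',) attention vector.
    """
    H = X @ W                                  # transform node features
    N, Fp = H.shape
    A_hat = A + np.eye(N)                      # include each node's own features
    # Raw attention logits e_ij = LeakyReLU(a^T [h_i || h_j])
    src = H @ a[:Fp]                           # contribution of target node i
    dst = H @ a[Fp:]                           # contribution of neighbor j
    e = src[:, None] + dst[None, :]
    e = np.where(e > 0, e, leaky_slope * e)    # LeakyReLU
    e = np.where(A_hat > 0, e, -np.inf)        # mask non-neighbors
    # Softmax over each node's neighborhood yields the attention weights
    alpha = np.exp(e - e.max(axis=1, keepdims=True))
    alpha /= alpha.sum(axis=1, keepdims=True)
    return alpha @ H                           # attention-weighted aggregation
```

In the full model, multiple such heads are computed independently and their outputs concatenated (or averaged in the final layer), but a single head is enough to show where the learned prioritization of neighbors enters the aggregation.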
“…Conventional architectures like recurrent and convolutional neural networks are based on the assumption that there is repetitive structure in the input. In contrast, GNNs cannot rely on that assumption because graph data is more irregular in nature; see [32] and [33] for an overview of the field. Several concepts from the image recognition and NLP fields have been adapted to GNNs, such as graph convolution [34], graph attention [35], and graph embeddings [36].…”
Section: Mapping to Natural Language Processing
confidence: 99%
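For contrast with the attention-based aggregation sketched earlier, graph convolution weights each neighbor by fixed, degree-based coefficients rather than learned attention scores. Below is a minimal sketch of one GCN-style propagation step; the function name gcn_layer and the dense-adjacency representation are again simplifying assumptions, not the cited implementation.

```python
import numpy as np

def gcn_layer(X, A, W):
    """One GCN-style propagation step: H' = D^{-1/2} (A + I) D^{-1/2} X W.

    Unlike attention, each neighbor's weight is fixed by the graph's
    degree structure; nothing about the weighting itself is learned.
    """
    N = X.shape[0]
    A_hat = A + np.eye(N)                    # add self-loops
    d = A_hat.sum(axis=1)                    # node degrees
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))   # symmetric normalization
    return D_inv_sqrt @ A_hat @ D_inv_sqrt @ X @ W
```

The difference between the two sketches is precisely the point of the surveyed attention models: GAT replaces the fixed normalization coefficients with per-edge weights learned from the node features themselves.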