2020
DOI: 10.48550/arxiv.2010.05234
Preprint

A Practical Tutorial on Graph Neural Networks

Cited by: 1 publication (1 citation statement)
References: 0 publications
“…Types of encoder-decoder architectures, combined with an attention mechanism [604], are furthermore naturally suited for modelling time series [605] and sequential data, offering state of the art performance in the space of natural-language processing. Finally, the development of graph neural networks further accelerated progress in developing general AI architectures for handling unstructured and non-Euclidean data [606].…”
Section: Deep Learning
Citation type: mentioning (confidence: 99%)

Citing publication: Social physics
Jusup, Holme, Kanazawa et al. 2021, Preprint
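
The quoted statement refers to graph neural networks only at a high level. As an illustrative aside (not taken from the tutorial or from the citing paper), the core message-passing idea behind handling non-Euclidean graph data can be sketched as a single graph-convolution layer in the style of Kipf and Welling, H' = ReLU(D^{-1/2} (A + I) D^{-1/2} H W). The sketch below uses plain NumPy, and the graph, feature, and weight values are made up purely for demonstration.

import numpy as np

def gcn_layer(A, H, W):
    # One message-passing step: each node aggregates its neighbours'
    # features (plus its own via a self-loop), then applies a learned
    # linear map followed by a ReLU non-linearity.
    A_hat = A + np.eye(A.shape[0])            # adjacency with self-loops
    deg = A_hat.sum(axis=1)                   # node degrees of A_hat
    D_inv_sqrt = np.diag(1.0 / np.sqrt(deg))  # D^{-1/2}
    A_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt  # symmetric normalisation
    return np.maximum(A_norm @ H @ W, 0.0)    # aggregate, transform, ReLU

# Toy example (hypothetical values): a 4-node path graph,
# 3-dimensional node features, 2 output channels per node.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
H = np.random.rand(4, 3)
W = np.random.rand(3, 2)
print(gcn_layer(A, H, W).shape)  # (4, 2): one 2-d embedding per node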