2018
DOI: 10.48550/arxiv.1804.00823
Preprint

Graph2Seq: Graph to Sequence Learning with Attention-based Neural Networks

Kun Xu,
Lingfei Wu,
Zhiguo Wang
et al.

Abstract: The celebrated Sequence to Sequence learning (Seq2Seq) technique and its numerous variants achieve excellent performance on many tasks. However, many machine learning tasks have inputs naturally represented as graphs; existing Seq2Seq models face a significant challenge in achieving accurate conversion from graph form to the appropriate sequence. To address this challenge, we introduce a novel general end-to-end graph-to-sequence neural encoder-decoder model that maps an input graph to a sequence of vectors an…
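As a rough illustration of the encoder the abstract describes, the sketch below (in PyTorch) builds node embeddings by repeatedly aggregating neighbor features. It is a minimal sketch under assumed conventions: class and variable names such as GraphEncoder, adj, and num_hops are hypothetical and not taken from the authors' implementation.

    import torch
    import torch.nn as nn

    class GraphEncoder(nn.Module):
        """Minimal, hypothetical node encoder: each hop mixes a node's state
        with the mean of its neighbors' states, yielding per-node embeddings
        that a sequence decoder can later attend over."""
        def __init__(self, in_dim, hid_dim, num_hops=2):
            super().__init__()
            self.input_proj = nn.Linear(in_dim, hid_dim)
            self.hop_layers = nn.ModuleList(
                [nn.Linear(2 * hid_dim, hid_dim) for _ in range(num_hops)])

        def forward(self, x, adj):
            # x:   (num_nodes, in_dim) node features
            # adj: (num_nodes, num_nodes) 0/1 adjacency matrix
            h = torch.relu(self.input_proj(x))
            deg = adj.sum(dim=1, keepdim=True).clamp(min=1)
            for layer in self.hop_layers:
                neigh = (adj @ h) / deg  # mean over each node's neighbors
                h = torch.relu(layer(torch.cat([h, neigh], dim=-1)))
            return h  # (num_nodes, hid_dim) node embeddings

A decoder then generates the output sequence while attending over these node embeddings; a sketch of that step follows the citation statements below.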

Cited by 50 publications (76 citation statements)
References 29 publications
“…GNN-based Methods. Recent years have seen a surge of interest in Graph Neural Networks [2, 3, 6, 9, 30], and as a result various GNN methods have been utilized to improve recommendation systems. Several GNN-based models have been proposed to learn item representations for session-based recommendation [1, 15, 25-27, 29].…”
Section: Related Work (mentioning)
confidence: 99%
“…Graph Neural Networks. Recently, GNNs (Hamilton, Ying, and Leskovec 2017; Li et al. 2015; Kipf and Welling 2016; Xu et al. 2018) have become a hot research topic owing to their strengths in learning structured data. Various applications in different domains such as chemistry and biology (Duvenaud et al. 2015), computer vision (Norcliffe-Brown, Vafeias, and Parisot 2018), and natural language processing (Xu et al. 2018; Chen, Wu, and Zaki 2019b) have demonstrated the effectiveness of GNNs.…”
Section: Related Work (mentioning)
confidence: 99%
“…Recently, GNNs (Hamilton, Ying, and Leskovec 2017; Li et al. 2015; Kipf and Welling 2016; Xu et al. 2018) have become a hot research topic owing to their strengths in learning structured data. Various applications in different domains such as chemistry and biology (Duvenaud et al. 2015), computer vision (Norcliffe-Brown, Vafeias, and Parisot 2018), and natural language processing (Xu et al. 2018; Chen, Wu, and Zaki 2019b) have demonstrated the effectiveness of GNNs. In the program domain, compared with early works that represent programs with abstract syntax trees (Alon et al. 2018, 2019; Liu et al. 2020b), more works have attempted to use graphs (Allamanis, Brockschmidt, and Khademi 2017) to learn program semantics for various applications, e.g., source code summarization (Liu et al. 2020a; Fernandes, Allamanis, and Brockschmidt 2018), vulnerability detection (Zhou et al. 2019), and type inference (Allamanis et al. 2020).…”
Section: Related Work (mentioning)
confidence: 99%
“…Song et al. [50] explored the effectiveness of AMR as a semantic representation for neural machine translation based on a graph recurrent network. Previous studies [12, 21, 60] applied a bidirectional graph encoder to encode an input graph and then used an attention-based LSTM [27] decoder. To capture longer-range dependencies, Song et al. [51] employed an LSTM to process the state transitions of the graph encoder outputs.…”
Section: Graph Neural Network (mentioning)
confidence: 99%
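To make the pattern in the quoted passage concrete (a graph encoder followed by an attention-based LSTM decoder), here is a minimal, hypothetical single-step decoder sketch in PyTorch; names such as DecoderStep and node_h are assumptions for exposition, not drawn from any cited implementation.

    import torch
    import torch.nn as nn

    class DecoderStep(nn.Module):
        """One step of a hypothetical attention-based LSTM decoder that
        attends over encoder node embeddings before updating its state."""
        def __init__(self, emb_dim, hid_dim):
            super().__init__()
            self.cell = nn.LSTMCell(emb_dim + hid_dim, hid_dim)
            self.attn = nn.Linear(hid_dim, hid_dim, bias=False)

        def forward(self, y_emb, state, node_h):
            # y_emb:  (batch, emb_dim) embedding of the previous output token
            # state:  tuple (h, c), each (batch, hid_dim)
            # node_h: (batch, num_nodes, hid_dim) encoder node embeddings
            h, c = state
            scores = torch.bmm(node_h, self.attn(h).unsqueeze(-1)).squeeze(-1)
            alpha = torch.softmax(scores, dim=-1)       # attention over nodes
            context = torch.bmm(alpha.unsqueeze(1), node_h).squeeze(1)
            h, c = self.cell(torch.cat([y_emb, context], dim=-1), (h, c))
            return (h, c), alpha  # new state and attention weights

Running this cell once per output position, feeding back the generated token, gives the attention-based decoding loop the cited works describe; the bidirectional variants mentioned above differ mainly in how node_h is produced.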