“…Recently, many research efforts and key contributions have been made to address the limitations of Seq2Seq when dealing with more complex data by leveraging external information through specialized neural models attached to the underlying target applications, including Tree2Seq (Eriguchi et al., 2016), Set2Seq (Vinyals et al., 2015a), Recursive Neural Networks (Socher et al., 2010), and Tree-Structured LSTM (Tai et al., 2015). Owing to more recent advances in graph representations and graph convolutional networks, a number of studies have investigated utilizing various GNNs to improve performance over Seq2Seq models in the domains of machine translation and graph generation (Bastings et al., 2017; Beck et al., 2018; Simonovsky & Komodakis, 2018; Li et al., 2018).…”