2016
DOI: 10.48550/arxiv.1603.06075
Preprint

Tree-to-Sequence Attentional Neural Machine Translation

Cited by 28 publications (14 citation statements) | References 13 publications
“…Recently, many research efforts have been made to address the limitations of Seq2Seq on more complex data by leveraging external information through specialized neural models attached to the underlying target application, including Tree2Seq (Eriguchi et al., 2016), Set2Seq (Vinyals et al., 2015a), Recursive Neural Networks (Socher et al., 2010), and Tree-Structured LSTM (Tai et al., 2015). Following more recent advances in graph representations and graph convolutional networks, a number of studies have investigated various GNNs to improve on Seq2Seq models in machine translation and graph generation (Bastings et al., 2017; Beck et al., 2018; Simonovsky & Komodakis, 2018; Li et al., 2018).…”
Section: Related Work
confidence: 99%
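The excerpt above cites the Tree-Structured LSTM (Tai et al., 2015) as one structure-aware encoder. For concreteness, here is a minimal numpy sketch of the Child-Sum Tree-LSTM update from that paper; the class name, weight initialization, and dimensions are illustrative assumptions, not taken from any cited implementation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class ChildSumTreeLSTM:
    """Minimal Child-Sum Tree-LSTM cell (Tai et al., 2015).

    Hypothetical sketch: weight names and the random init are
    illustrative, not from any cited implementation.
    """

    def __init__(self, x_dim, h_dim, rng=np.random.default_rng(0)):
        # One (W, U, b) triple per gate: input, forget, output, update.
        self.p = {g: (rng.normal(0, 0.1, (h_dim, x_dim)),
                      rng.normal(0, 0.1, (h_dim, h_dim)),
                      np.zeros(h_dim)) for g in "ifou"}

    def node(self, x, children):
        """x: input vector; children: list of (h, c) pairs from subtrees."""
        h_dim = self.p["i"][2].shape[0]
        h_sum = sum((h for h, _ in children), np.zeros(h_dim))
        Wi, Ui, bi = self.p["i"]; Wo, Uo, bo = self.p["o"]
        Wu, Uu, bu = self.p["u"]; Wf, Uf, bf = self.p["f"]
        i = sigmoid(Wi @ x + Ui @ h_sum + bi)        # input gate
        o = sigmoid(Wo @ x + Uo @ h_sum + bo)        # output gate
        u = np.tanh(Wu @ x + Uu @ h_sum + bu)        # candidate update
        # One forget gate per child, conditioned on that child's own h.
        c = i * u + sum(sigmoid(Wf @ x + Uf @ hk + bf) * ck
                        for hk, ck in children)
        h = o * np.tanh(c)
        return h, c

# Usage: encode a tiny tree bottom-up, leaves first.
cell = ChildSumTreeLSTM(x_dim=4, h_dim=8)
leaf1 = cell.node(np.ones(4), [])
leaf2 = cell.node(np.zeros(4), [])
root_h, root_c = cell.node(np.full(4, 0.5), [leaf1, leaf2])
```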
“…However, the Seq2Seq model often fails to perform as well as hoped on these problems, in part because converting complex structured data into a sequence inevitably loses information, especially when the input is naturally represented as a graph. Recently, a line of research has been devoted to incorporating additional information: extracting syntactic information such as the phrase structure of a source sentence (Tree2seq) (Eriguchi et al., 2016), utilizing attention mechanisms over input sets (Set2seq) (Vinyals et al., 2015a), and encoding sentences recursively as trees (Socher et al., 2010; Tai et al., 2015). Although these methods achieve promising results on certain classes of problems, most of the presented techniques depend heavily on the underlying application and may not generalize to a broad class of problems.…”
Section: Introduction
confidence: 99%
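The Tree2seq approach this excerpt refers to lets the decoder attend over phrase-level states built from the parse tree in addition to word-level states. Below is a minimal sketch of that joint attention step, assuming precomputed encoder states and simple dot-product scoring; Eriguchi et al. (2016) use a trainable scoring function, and the function name and array shapes here are illustrative assumptions.

```python
import numpy as np

def tree_attention_context(word_states, phrase_states, dec_state):
    """Attend over word-level and phrase-level encoder states jointly.

    word_states:   (n_words, d)   sequential encoder states
    phrase_states: (n_phrases, d) states for parse-tree phrase nodes
    dec_state:     (d,)           current decoder hidden state

    Dot-product scoring is an assumption made for brevity; the key
    idea is that the softmax runs over word and phrase nodes together.
    """
    states = np.vstack([word_states, phrase_states])  # (n_words + n_phrases, d)
    scores = states @ dec_state                       # one score per node
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                          # softmax over all nodes
    return weights @ states                           # context vector, (d,)

# Usage with toy shapes: 5 words, 3 phrase nodes, d = 8.
rng = np.random.default_rng(0)
ctx = tree_attention_context(rng.normal(size=(5, 8)),
                             rng.normal(size=(3, 8)),
                             rng.normal(size=8))
```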
“…Neural encoder-decoder models [17], [22] have been widely extended to map general object inputs to their corresponding sequences [23], [24]. Recent advances in graph deep learning and graph convolutional networks have enabled a variety of models to handle challenges in graph generation [25]–[27] and graph-to-sequence learning [28].…”
Section: Neural Encoder-Decoder Models
confidence: 99%
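As a reference point for the extensions this excerpt surveys, here is a minimal encoder-decoder skeleton: any encoder that maps an input object to a fixed state vector can be paired with a recurrent decoder that emits a sequence. Everything in this sketch (the function names, mean pooling, greedy decoding, the toy linear maps) is an illustrative assumption, not the cited models' actual architecture.

```python
import numpy as np

def encode(obj_vectors):
    """Toy encoder: pool a set/sequence of node vectors into one state.
    Mean pooling stands in for an RNN/GNN encoder (assumption)."""
    return np.mean(obj_vectors, axis=0)

def decode_greedy(state, W_out, W_step, max_len=10, eos=0):
    """Toy greedy decoder: emit the argmax token, update the state."""
    tokens = []
    for _ in range(max_len):
        logits = W_out @ state           # project state to vocabulary scores
        tok = int(np.argmax(logits))
        tokens.append(tok)
        if tok == eos:
            break
        state = np.tanh(W_step @ state)  # simple recurrent state update
    return tokens

# Usage with toy shapes: 5 input nodes, state dim 8, vocabulary of 12.
rng = np.random.default_rng(0)
d, vocab = 8, 12
seq = decode_greedy(encode(rng.normal(size=(5, d))),
                    rng.normal(size=(vocab, d)),
                    rng.normal(size=(d, d)))
```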
“…In [37], a tree-structured recursive neural network is applied to visual-textual sentiment analysis. Tree structures have also been used for machine translation in [8] and for object localization in [14]. Our model differs from these in the following aspects: we introduce a tree-structured memory network that contains heterogeneous nodes.…”
Section: Tree Network
confidence: 99%