2018
DOI: 10.48550/arxiv.1810.03975
Preprint
Towards Two-Dimensional Sequence to Sequence Model in Neural Machine Translation

Cited by 3 publications (12 citation statements) | References 0 publications
“…As given in Fig. 5, a 2D LSTM network can be realized by connecting LSTM cells recursively in a 2D mesh [13]. Each LSTM cell uses the hidden and cell states of its two neighboring cells, at the left and bottom positions in the mesh.…”
Section: B. Framework Architecture and Methods
confidence: 99%
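The recurrence described in that statement can be sketched minimally: each grid cell combines its input with the hidden states of the left and bottom neighbors. This is an illustrative numpy sketch, simplified relative to full 2DLSTM/MDLSTM formulations (the two neighbor cell states are averaged here instead of being combined through a learned lambda gate; all names and shapes are assumptions).

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def md_lstm_2d(x, W, b, hidden):
    """Simplified 2D LSTM scan over an I x J grid (illustrative sketch).

    The cell at (i, j) receives the input x[i, j] plus the hidden states
    of its left neighbor (i, j-1) and its bottom neighbor (i-1, j); the
    two neighbor cell states are averaged, a simplification of the
    gated combination used in full MDLSTM variants.
    """
    I, J, d_in = x.shape
    d = hidden
    h = np.zeros((I + 1, J + 1, d))  # zero states pad the grid borders
    c = np.zeros((I + 1, J + 1, d))
    for i in range(1, I + 1):
        for j in range(1, J + 1):
            # concatenate the input with the left and bottom hidden states
            z = np.concatenate([x[i - 1, j - 1], h[i, j - 1], h[i - 1, j]])
            gates = W @ z + b                       # shape (4*d,)
            f = sigmoid(gates[0 * d:1 * d])         # forget gate
            ig = sigmoid(gates[1 * d:2 * d])        # input gate
            o = sigmoid(gates[2 * d:3 * d])         # output gate
            g = np.tanh(gates[3 * d:4 * d])         # candidate state
            c_prev = 0.5 * (c[i, j - 1] + c[i - 1, j])  # merge neighbor cell states
            c[i, j] = f * c_prev + ig * g
            h[i, j] = o * np.tanh(c[i, j])
    return h[1:, 1:]  # hidden state at every grid position
```

Because each cell depends only on positions to its left and below, the scan can proceed row by row (or along anti-diagonals, for parallelism).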
“…Fig. 1: The internal architecture of the standard LSTM and the 2DLSTM; the additional connections are marked in blue [13]. …automatic extraction of features from raw 2D images via convolutional neural networks (CNNs) [14].…”
Section: Related Work
confidence: 99%
“…Recently, the 2DLSTM layer has also been used for sequence-to-sequence modeling in machine translation [13], where it implicitly updates the source representation conditioned on the generated target words. In a similar direction, a 2D CNN-based network has been proposed in which the positions of the source and the target words define the 2D grid for translation modeling [21].…”
Section: Related Work
confidence: 99%
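The 2D grid mentioned in that statement pairs every target position with every source position. A hedged numpy sketch of how such a grid could be assembled (the function name, embedding tables, and concatenation layout are illustrative assumptions, not the cited papers' exact formulation):

```python
import numpy as np

def build_translation_grid(src_ids, tgt_ids, emb_src, emb_tgt):
    """Arrange a sentence pair on a 2D grid: cell (i, j) concatenates the
    embedding of target word i with the embedding of source word j, giving
    the input tensor that a 2D recurrent (or convolutional) layer scans."""
    I, J = len(tgt_ids), len(src_ids)
    d = emb_tgt.shape[1] + emb_src.shape[1]
    grid = np.zeros((I, J, d))
    for i, t in enumerate(tgt_ids):
        for j, s in enumerate(src_ids):
            grid[i, j] = np.concatenate([emb_tgt[t], emb_src[s]])
    return grid
```

A 2D layer scanning this grid sees, at each cell, one (target word, source word) pair plus the states accumulated over all earlier source and target positions.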