Proceedings of the 10th International Conference on Natural Language Generation 2017
DOI: 10.18653/v1/w17-3531

A Comparison of Neural Models for Word Ordering

Abstract: We compare several language models for the word-ordering task and propose a new bag-to-sequence neural model based on attention-based sequence-to-sequence models. We evaluate the model on a large German WMT data set, where it significantly outperforms existing models. We also describe a novel search strategy for LM-based word ordering and report results on the English Penn Treebank. Our best model setup outperforms prior work in both speed and quality.
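As context for the task itself: word ordering (linearisation) takes a bag of words and searches for the highest-scoring permutation under a language model. The sketch below is a minimal illustration of that setup with a toy bigram scorer and beam search; it is not the paper's bag-to-sequence model or its proposed search strategy, and all names in it are invented for the example.

```python
from collections import Counter

# Minimal sketch of LM-based word ordering via beam search over a bag of
# words. The bigram scorer and the search are illustrative stand-ins, not
# the paper's bag-to-sequence model or its proposed search strategy.

def train_bigram_counts(corpus):
    """Count bigrams (with a sentence-start marker) in a toy corpus."""
    counts = Counter()
    for sent in corpus:
        tokens = ["<s>"] + sent.split()
        counts.update(zip(tokens, tokens[1:]))
    return counts

def order_bag(bag, counts, beam_size=4):
    """Beam search over permutations of `bag`, scored by bigram counts."""
    beams = [(0, ["<s>"], Counter(bag))]  # (score, prefix, remaining words)
    for _ in range(len(bag)):
        candidates = []
        for score, prefix, remaining in beams:
            for word in remaining:
                rest = remaining.copy()
                rest[word] -= 1
                if rest[word] == 0:
                    del rest[word]
                candidates.append(
                    (score + counts.get((prefix[-1], word), 0),
                     prefix + [word], rest))
        candidates.sort(key=lambda c: c[0], reverse=True)
        beams = candidates[:beam_size]  # keep the best partial orders
    return max(beams, key=lambda c: c[0])[1][1:]  # drop the <s> marker

counts = train_bigram_counts(["the cat sat on the mat",
                              "the dog sat on the rug"])
print(order_bag(["mat", "the", "sat", "cat", "on", "the"], counts))
# -> ['the', 'cat', 'sat', 'on', 'the', 'mat']
```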

Cited by 12 publications (30 citation statements); references 12 publications.
“…In the case of the Decoder, we used two layers and the attention mechanism proposed by Bahdanau et al. (2014) in order to consider all words in the context (since the input words are unordered). This proposal was similar to the recurrent neural network language model proposed in Hasler et al. (2017).…”
Section: Word Ordering
confidence: 77%
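For readers unfamiliar with the mechanism named in the snippet above, the following is a minimal sketch of additive (Bahdanau-style) attention: the decoder state is scored against every encoder state, and a softmax over those scores yields the weights of the returned context vector. Because the weights cover all input positions, the decoder can consider every word in the bag regardless of order. Shapes and variable names here are illustrative assumptions, not taken from either cited paper.

```python
import numpy as np

# Sketch of the additive ("Bahdanau") attention referenced above.
# Shapes and parameter names are illustrative, not from either paper.

def bahdanau_attention(decoder_state, encoder_states, W_s, W_h, v):
    """score(s, h_j) = v^T tanh(W_s s + W_h h_j); returns (context, weights)."""
    # decoder_state: (d_dec,); encoder_states: (T, d_enc)
    scores = np.tanh(decoder_state @ W_s.T + encoder_states @ W_h.T) @ v  # (T,)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()            # softmax over input positions
    context = weights @ encoder_states  # (d_enc,) weighted sum of states
    return context, weights

rng = np.random.default_rng(0)
d_dec, d_enc, d_att, T = 8, 8, 16, 5
s = rng.normal(size=d_dec)              # current decoder state
H = rng.normal(size=(T, d_enc))         # one state per input word
ctx, w = bahdanau_attention(s, H, rng.normal(size=(d_att, d_dec)),
                            rng.normal(size=(d_att, d_enc)),
                            rng.normal(size=d_att))
print(w.round(3), ctx.shape)
```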
“…Our proposal was motivated by the work of Hasler et al. (2017) and Zhang and Clark (2015). Thus, we tackled the problem by applying a syntax-based word-ordering strategy using a sequence-to-sequence (seq2seq) model.…”
Section: System Description
confidence: 99%
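The snippet above frames word ordering as a seq2seq problem. A minimal sketch of how such training pairs can be constructed is shown below: the source side is a shuffled copy of the target sentence. This mirrors the general bag-to-sequence setup; the cited systems additionally exploit syntactic information, which is omitted here.

```python
import random

# Sketch: framing word ordering as a seq2seq problem by pairing a shuffled
# "bag" with the original sentence. Details of the cited systems differ.

def make_word_ordering_pair(sentence, seed=None):
    tokens = sentence.split()
    bag = tokens[:]
    random.Random(seed).shuffle(bag)  # source side: unordered words
    return " ".join(bag), sentence    # (source, target) training pair

src, tgt = make_word_ordering_pair("the cat sat on the mat", seed=1)
print(src, "->", tgt)
```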
“…One way to achieve this goal would be to reconstruct readable documents from the bag-of-words output that our mechanism currently provides. A range of promising techniques for reconstructing readable texts from bags of words have already produced good experimental results [19,52,54]. In future work we aim to explore how such techniques could be applied as a final post-processing step for our mechanism.…”
Section: Discussion
confidence: 99%
“…Evaluation Metrics: Linearisation tasks are generally reported using the BLEU score (Papineni et al., 2002) (Hasler et al., 2017; Belz et al., 2011). Additionally, we report Kendall's tau (τ) and perfect-match scores for the models.…”
Section: Methods
confidence: 99%
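To make the metrics in this last snippet concrete, here is a minimal sketch of Kendall's tau and the perfect-match score for word ordering. It assumes the prediction is a permutation of the reference and that tokens are unique (a real evaluation must also handle duplicates); BLEU itself is typically computed with a standard toolkit rather than reimplemented.

```python
# Sketch of the Kendall's tau and perfect-match metrics mentioned above.
# Assumes predictions are permutations of the reference with unique tokens.

def kendalls_tau(reference, predicted):
    """tau = 1 - 2 * (discordant pairs) / (total pairs) over token positions."""
    pos = {tok: i for i, tok in enumerate(reference)}
    ranks = [pos[tok] for tok in predicted]
    n = len(ranks)
    discordant = sum(1 for i in range(n) for j in range(i + 1, n)
                     if ranks[i] > ranks[j])
    return 1 - 2 * discordant / (n * (n - 1) / 2)

def perfect_match(references, predictions):
    """Fraction of sentences reproduced exactly in the reference order."""
    exact = sum(r == p for r, p in zip(references, predictions))
    return exact / len(references)

ref = "the cat sat on mat".split()   # unique tokens for simplicity
pred = "the sat cat on mat".split()
print(kendalls_tau(ref, pred))       # 0.8: one swapped pair out of ten
print(perfect_match([ref], [pred]))  # 0.0
```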