Proceedings of the 14th Conference of the European Chapter of the Association for Computational Linguistics 2014
DOI: 10.3115/v1/e14-1026
Source-side Preordering for Translation using Logistic Regression and Depth-first Branch-and-Bound Search

Abstract: We present a simple preordering approach for machine translation based on a feature-rich logistic regression model to predict whether two children of the same node in the source-side parse tree should be swapped or not. Given the pair-wise children regression scores we conduct an efficient depth-first branch-and-bound search through the space of possible children permutations, avoiding using a cascade of classifiers or limiting the list of possible ordering outcomes. We report experiments in translating English…
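The search the abstract describes can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: it assumes the logistic regression's pairwise scores are given as a matrix where `scores[i][j]` is the probability that child `i` should precede child `j` (with `scores[i][j] + scores[j][i] == 1`), and it maximizes the sum of pairwise log-probabilities over all n! child permutations, pruning with an optimistic per-pair bound.

```python
import math
from itertools import combinations

def preorder_children(scores):
    """Find the permutation of child indices maximizing the sum of
    pairwise log-probabilities, via depth-first branch-and-bound.

    scores[i][j]: model probability that child i precedes child j.
    """
    n = len(scores)
    log_s = [[math.log(scores[i][j]) if i != j else 0.0
              for j in range(n)] for i in range(n)]
    # Optimistic bound for each unordered pair: the better of the
    # two possible relative orders.
    pair_best = {(i, j): max(log_s[i][j], log_s[j][i])
                 for i, j in combinations(range(n), 2)}
    best = {"score": -math.inf, "perm": None}

    def dfs(prefix, remaining, score, optimistic):
        # optimistic = score of fixed pairs + best case for undecided pairs
        if optimistic <= best["score"]:
            return  # prune: this branch cannot beat the incumbent
        if not remaining:
            best["score"], best["perm"] = score, list(prefix)
            return
        for k in list(remaining):
            # Placing k next fixes its order relative to every
            # still-unplaced child m.
            gain = sum(log_s[k][m] for m in remaining if m != k)
            lost_bound = sum(pair_best[(min(k, m), max(k, m))]
                             for m in remaining if m != k)
            prefix.append(k)
            remaining.remove(k)
            dfs(prefix, remaining, score + gain,
                optimistic + gain - lost_bound)
            remaining.add(k)
            prefix.pop()

    dfs([], set(range(n)), 0.0, sum(pair_best.values()))
    return best["perm"]
```

Because the bound tightens as the prefix grows, branches whose optimistic score falls below the best complete permutation found so far are cut off early, which is what makes exhaustive search over children permutations tractable in practice.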


Cited by 15 publications (22 citation statements); references 15 publications.
“…Preordering reorders source sentences before translation, while post-ordering reorders, after translation, sentences that were translated without considering word order. In particular, preordering effectively improves translation quality because it addresses long-distance reordering and computational complexity issues (Jehl et al., 2014; Nakagawa, 2015).…”
Section: Introduction
confidence: 99%
“…For the task of translation from Chinese to Japanese, Sudoh et al. [8] used a learning-to-rank model based on a pairwise classification method to predict the target Japanese word order. In the same spirit, Jehl et al. [9] proposed a feature-based reordering model for English-to-Japanese and English-to-Korean translation. Their model predicts whether a pair of sibling nodes on the source-side of the parse tree needs to be swapped.…”
Section: Related Work
confidence: 99%
“…As we are interested in the bilingual case and, specifically, in preordering, we content ourselves with using the same syntactic representation, i.e. dependency trees, that many preordering models use (e.g., Jehl et al. (2014), Lerner and Petrov (2013)).…”
Section: Bilingual Head Direction Entropy
confidence: 99%
“…More recently, Jehl et al. (2014) learn to order sibling nodes in the source-side dependency parse tree. The space of possible permutations is explored via depth-first branch-and-bound search (Balas and Toth, 1983).…”
Section: Source Syntax-based Preordering
confidence: 99%