Proceedings of the 46th Annual Meeting of the Association for Computational Linguistics on Human Language Technologies: Short Papers, 2008
DOI: 10.3115/1557690.1557731

A linguistically annotated reordering model for BTG-based statistical machine translation

Abstract: In this paper, we propose a linguistically annotated reordering model for BTG-based statistical machine translation. The model incorporates linguistic knowledge to predict orders for both syntactic and non-syntactic phrases. The linguistic knowledge is automatically learned from source-side parse trees through an annotation algorithm. We empirically demonstrate that the proposed model leads to a significant improvement of 1.55% in the BLEU score over the baseline reordering model on the NIST MT-05 Chinese-to-English translation task.
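To make the abstract's idea concrete, here is a minimal sketch of BTG-style order prediction as binary classification: for two neighboring source phrases, predict whether they stay in order (STRAIGHT) or swap (INVERTED) on the target side, using a MaxEnt-style classifier (logistic regression) over boundary-word and syntactic-label features. The feature names and toy data are illustrative assumptions, not the paper's exact feature set or annotation algorithm.

```python
# Sketch of a BTG-style order classifier: for two neighboring source
# phrases, predict STRAIGHT (keep order) vs INVERTED (swap on the target
# side). Feature names here are illustrative assumptions.
from sklearn.feature_extraction import DictVectorizer
from sklearn.linear_model import LogisticRegression  # MaxEnt-style classifier

def order_features(left_phrase, right_phrase, left_label, right_label):
    """Boundary words plus syntactic labels taken from the source parse."""
    return {
        "left.first": left_phrase[0],
        "left.last": left_phrase[-1],
        "right.first": right_phrase[0],
        "right.last": right_phrase[-1],
        "left.label": left_label,    # e.g. "NP", or a tag for non-syntactic phrases
        "right.label": right_label,
        "label.pair": left_label + "+" + right_label,
    }

# Toy training instances (invented): features extracted from word-aligned,
# source-parsed data, paired with the observed order.
instances = [
    (order_features(["de", "gaige"], ["zhengce"], "DNP", "NP"), "STRAIGHT"),
    (order_features(["zai", "zhongguo"], ["fazhan"], "PP", "VP"), "INVERTED"),
]
vec = DictVectorizer()
X = vec.fit_transform([f for f, _ in instances])
y = [o for _, o in instances]
clf = LogisticRegression(max_iter=1000).fit(X, y)
print(clf.predict(vec.transform([order_features(["zai", "beijing"], ["gongzuo"], "PP", "VP")])))
```

In this framing, the classifier's posterior can be used directly as the reordering score for a BTG merge during decoding.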

Cited by 3 publications (6 citation statements)
References 4 publications
“…This article significantly extends our previous work (Xiong, Zhang and Li 2011) in four major aspects. Firstly, we thoroughly compare the single backward language model with (1) the single forward language model (FLM) and (2) the combination of the forward and backward language models, integrating both into the decoder at the same time. Secondly, we investigate different trigger pair selection criteria in order to maximize the performance of the MI trigger model. Thirdly, we explore distance-dependent and syntactically informed triggers in order to investigate whether such triggers can obtain further improvements. Finally, we validate the robustness of the proposed models on several language pairs, including Chinese-to-English, Spanish-to-English and Vietnamese-to-English.…”
Section: Introduction (supporting)
confidence: 84%
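For context on the trigger pair selection mentioned above, the sketch below shows one plausible reading of MI-based trigger scoring: rank (source word, target word) pairs by pointwise mutual information computed from sentence-level co-occurrence counts. The MI variant and the toy data are assumptions, not the cited article's exact criterion.

```python
# Illustrative sketch of mutual-information trigger pair scoring using
# pointwise MI over sentence-level co-occurrence; counts and MI variant
# are assumptions.
import math
from collections import Counter

def pmi_scores(sentence_pairs):
    """Score (source_word, target_word) trigger pairs by pointwise MI."""
    src_counts, tgt_counts, joint_counts = Counter(), Counter(), Counter()
    n = 0
    for src_sent, tgt_sent in sentence_pairs:
        n += 1
        for x in set(src_sent):
            src_counts[x] += 1
        for y in set(tgt_sent):
            tgt_counts[y] += 1
        for x in set(src_sent):
            for y in set(tgt_sent):
                joint_counts[(x, y)] += 1
    scores = {}
    for (x, y), c in joint_counts.items():
        p_xy = c / n
        p_x, p_y = src_counts[x] / n, tgt_counts[y] / n
        scores[(x, y)] = math.log(p_xy / (p_x * p_y))
    return scores

pairs = [("wo ai ni".split(), "i love you".split()),
         ("wo hen hao".split(), "i am fine".split())]
print(sorted(pmi_scores(pairs).items(), key=lambda kv: -kv[1])[:3])
```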
“…In this paper, phrase reordering is recast as a classification issue, as in previous work (Xiong et al., 2006, 2008; Zhang et al., 2007a). In training, a machine learning algorithm learns a classifier from annotated phrase reordering instances that are automatically extracted from a word-aligned, source-side-parsed training corpus.…”
Section: Kernel-based Classifier Solution to Phrase Reordering (mentioning)
confidence: 99%
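As an illustration of the extraction step this statement describes, the following sketch derives STRAIGHT/INVERTED reordering instances from a word alignment, under the simplifying assumption that each source span maps to a contiguous target span; the helper is invented for illustration and is not the cited extraction algorithm.

```python
# Sketch of extracting reordering training instances from a word-aligned
# sentence pair; assumes contiguous target spans for simplicity.
def reordering_instances(alignment, src_len):
    """alignment: list of (src_idx, tgt_idx) links.
    Yields (source split point, order) instances for adjacent spans."""
    def tgt_span(lo, hi):
        tgts = [t for s, t in alignment if lo <= s < hi]
        return (min(tgts), max(tgts)) if tgts else None
    for split in range(1, src_len):
        left, right = tgt_span(0, split), tgt_span(split, src_len)
        if left and right and left[1] < right[0]:
            yield split, "STRAIGHT"
        elif left and right and right[1] < left[0]:
            yield split, "INVERTED"

# Toy example: source words 0 and 1 end up after source word 2 on the target side.
links = [(0, 1), (1, 2), (2, 0)]
print(list(reordering_instances(links, 3)))  # [(2, 'INVERTED')]
```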
“…For the significance test, we use the implementation of Zhang et al. (2004). Baseline systems: we set up three baseline systems: B1) Moses (Koehn et al., 2007), which uses a lexicalized unigram reordering model to predict three orientations: monotone, swap and discontinuous; B2) the MaxEnt-based reordering model with lexical boundary word features only (Xiong et al., 2006); B3) the linguistically annotated reordering model for BTG-based SMT (LABTG) (Xiong et al., 2008). For Moses, we used the default settings.…”
Section: Tree Kernel, Composite Kernel and Integrating into Our Reordering Model (mentioning)
confidence: 99%
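For readers unfamiliar with baseline B1, the sketch below shows a simplified version of the three-way orientation (monotone, swap, discontinuous) used by Moses-style lexicalized reordering, determined from adjacent source spans; this is an illustrative reading, not Moses code.

```python
# Simplified three-way orientation for lexicalized reordering: compare the
# current phrase's source span with the previously translated one.
def orientation(prev_src_span, cur_src_span):
    if cur_src_span[0] == prev_src_span[1] + 1:
        return "monotone"       # current phrase immediately follows the previous one
    if cur_src_span[1] == prev_src_span[0] - 1:
        return "swap"           # current phrase immediately precedes it
    return "discontinuous"      # anything else

print(orientation((0, 2), (3, 5)))  # monotone
print(orientation((3, 5), (0, 2)))  # swap
print(orientation((0, 2), (6, 8)))  # discontinuous
```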