2010
DOI: 10.1007/s10590-010-9080-7
Monte Carlo techniques for phrase-based translation

Abstract: Recent advances in statistical machine translation have used approximate beam search for NP-complete inference within probabilistic translation models. We present an alternative approach of sampling from the posterior distribution defined by a translation model. We define a novel Gibbs sampler for sampling translations given a source sentence and show that it effectively explores this posterior distribution. In doing so we overcome the limitations of heuristic beam search and obtain theoretically sou…

Cited by 4 publications (3 citation statements)
References 19 publications (16 reference statements)
“…Langlais et al (2007) described a greedy search decoder, first introduced in (Germann et al, 2001), able to improve translations produced by a dynamic programming decoder using the same scoring function and translation table. However, the more recent work by Arun et al (2010) using a Gibbs sampler for approximating maximum translation decoding showed the adequacy of the approximations made by state-of-the-art decoders for finding the best translation in their search space. Other works were more directly targeted at automatic post-editing of SMT output, and approached the problem as one of second-pass translation between automatic predictions and correct translations (Simard et al, 2007; Dugast et al, 2007).…”
Section: Related Work
confidence: 99%
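The greedy search strategy this statement refers to can be sketched generically: start from the seed translation produced by a first-pass decoder and repeatedly jump to the best-scoring neighboring hypothesis until no single local edit improves the model score. A minimal sketch, assuming a toy edit set and scoring function (the names, the adjacent-swap edit, and the reference-overlap score are illustrative stand-ins, not Germann et al.'s or Langlais et al.'s actual implementation):

```python
def hill_climb(seed, neighbors, score):
    """Greedy decoding: repeatedly move to the highest-scoring
    neighboring hypothesis until none improves on the current one."""
    current, current_score = seed, score(seed)
    while True:
        cands = list(neighbors(current))
        if not cands:
            return current
        best = max(cands, key=score)
        if score(best) <= current_score:
            return current  # local optimum reached
        current, current_score = best, score(best)

# Illustrative use: hypotheses are word tuples; the only edit is
# swapping two adjacent words (a stand-in for the richer edit set
# -- retranslate, merge, split, reorder -- used in greedy decoders).
def neighbors(hyp):
    for i in range(len(hyp) - 1):
        yield hyp[:i] + (hyp[i + 1], hyp[i]) + hyp[i + 2:]

# Toy score: count positions agreeing with a reference order
# (illustration only; a real decoder scores with its model features).
REF = ("the", "cat", "sleeps")
def score(hyp):
    return sum(a == b for a, b in zip(hyp, REF))

result = hill_climb(("cat", "the", "sleeps"), neighbors, score)
# result == ("the", "cat", "sleeps")
```

Because it reuses the same scoring function and translation table as the first-pass decoder, such a search can only refine, never contradict, the underlying model.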
“…[16] use a Monte Carlo technique for decoding phrase based translation lattices. Bi-directional learning uses a Perceptron to iteratively label each position of a word sequence, updating parameters when a single incorrect label is applied [17].…”
Section: Discriminative Hill Climbing Training
confidence: 99%
“…Alternatively, several works have extended the model of Koehn et al (2003) by, instead of performing workarounds to its limitations, replacing the beam search algorithm altogether. For instance, Arun et al (2010) use a Gibbs sampler to draw samples from the posterior distribution. The sampler consists of three operators that, applied probabilistically, explore the distribution.…”
Section: Document-level Statistical Machine Translation
confidence: 99%
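The core idea behind the Gibbs sampler described in these statements can be sketched with a single operator: pick one source phrase, enumerate its candidate target phrases, and resample the choice from its conditional distribution given the rest of the hypothesis (probability proportional to the exponentiated model score). A minimal sketch, assuming a toy phrase table and a stand-in log-linear score (all names, scores, and the two-phrase example are illustrative assumptions, not the paper's actual model or its full three-operator set):

```python
import math
import random

# Toy phrase table: each source phrase maps to candidate target phrases
# (illustrative data, not drawn from the paper).
PHRASE_TABLE = {
    "le chat": ["the cat", "the kitty"],
    "dort": ["sleeps", "is sleeping"],
}

def score(target_phrases):
    # Stand-in for the log-linear model score (LM + TM features):
    # a length penalty plus a bonus for one preferred phrase.
    text = " ".join(target_phrases)
    return -0.1 * len(text) + (3.0 if "the cat" in text else 0.0)

def gibbs_retrans_sweep(source_phrases, state, temperature=1.0):
    """One sweep of a 'retranslate' operator: resample each target
    phrase from its conditional distribution given the rest of the
    hypothesis, with probability proportional to exp(score / T)."""
    for i, src in enumerate(source_phrases):
        candidates = PHRASE_TABLE[src]
        scores = []
        for cand in candidates:
            trial = state[:i] + [cand] + state[i + 1:]
            scores.append(score(trial) / temperature)
        m = max(scores)  # subtract max for numerical stability
        weights = [math.exp(s - m) for s in scores]
        state[i] = random.choices(candidates, weights=weights)[0]
    return state

source = ["le chat", "dort"]
state = [PHRASE_TABLE[s][0] for s in source]  # initial hypothesis
random.seed(0)
counts = {}
for _ in range(200):
    state = gibbs_retrans_sweep(source, list(state))
    key = " ".join(state)
    counts[key] = counts.get(key, 0) + 1
# The most frequent sample approximates the maximum-probability
# translation decision discussed in the citing papers.
best = max(counts, key=counts.get)
```

Repeated sweeps yield (correlated) samples from the posterior over translations, so decoding decisions such as the maximum-probability translation can be approximated from sample frequencies rather than by beam search.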