2010
DOI: 10.1162/coli_a_00006
Hierarchical Phrase-Based Translation with Weighted Finite-State Transducers and Shallow-n Grammars

Abstract: In this article we describe HiFST, a lattice-based decoder for hierarchical phrase-based translation and alignment. The decoder is implemented with standard Weighted Finite-State Transducer (WFST) operations as an alternative to the well-known cube pruning procedure. We find that the use of WFSTs rather than k-best lists requires less pruning in translation search, resulting in fewer search errors, better parameter optimization, and improved translation performance. The direct generation of translation lattice…
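The abstract contrasts WFST-encoded lattices with k-best lists. A toy sketch (assumed for illustration, not the HiFST implementation, which uses standard WFST operations such as union, determinization, and minimization) of why a lattice is the more compact representation: an acyclic automaton stores alternatives per segment once, while a k-best list must spell out every full hypothesis.

```python
# Toy illustration (hypothetical data): a word lattice encodes
# exponentially many translation hypotheses in linear space,
# whereas a k-best list enumerates each hypothesis in full.

from itertools import product

# Each source segment offers alternative phrase translations.
segments = [("the", "a"), ("big", "large"), ("house", "building")]

# Lattice view: one arc per alternative, len(segments) + 1 states.
num_arcs = sum(len(opts) for opts in segments)

# Number of full paths through the lattice (hypotheses it encodes).
num_paths = 1
for opts in segments:
    num_paths *= len(opts)

print(num_arcs)   # 6 arcs ...
print(num_paths)  # ... encode 8 distinct hypotheses

# A k-best list must materialize every hypothesis as a string:
kbest = [" ".join(words) for words in product(*segments)]
assert len(kbest) == num_paths
```

With n segments of m options each, the lattice needs n*m arcs while the full hypothesis list has m**n entries; this gap is why lattice-based search can afford less pruning.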

Cited by 29 publications (39 citation statements); references 22 publications.
“…The choice of the hypergraph representation is merely one of several alternatives. For example, we could have adopted a representation based on weighted finite state transducers (de Gispert, Iglesias, Blackwood, Banga, & Byrne, 2010) since our model describes a regular language both in terms of the PCFG and the surface level models we intersect it with. It is also possible to represent our grammar as a pushdown automaton (Iglesias, Allauzen, Byrne, de Gispert, & Riley, 2011) and intersect it with finite automata representing a language model and dependency-related information, respectively.…”
Section: Learning
confidence: 99%
“…In this section we describe HiFST [47,81], a hierarchical phrase-based decoder that uses a WFST. The HiFST decoder produces word lattices and applies the full language model using WFST operations.…”
Section: HiFST
confidence: 99%
“…In this section we describe the HiFST decoder [47,79,81]. As stated in Section 2.3.1 the CYK algorithm produces a large number of derivations that share sub-derivations.…”
Section: Hierarchical Phrase-Based Decoding with WFSTs
confidence: 99%
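The citation above notes that the CYK algorithm produces a large number of derivations that share sub-derivations. A minimal sketch of that sharing (assumed for illustration, not the paper's decoder): memoizing per-span results makes the chart polynomial in size even though the number of binary-branching derivations grows exponentially.

```python
# Toy sketch: a CYK-style chart shares sub-derivations across spans.
# Counting derivations is cheap because each span (i, j) is computed
# once and reused, while the count itself grows exponentially.

from functools import lru_cache

@lru_cache(maxsize=None)
def num_derivations(i, j):
    """Binary-branching derivations covering words i..j (j exclusive)."""
    if j - i == 1:
        return 1
    # Split the span at every midpoint; shared sub-spans are memoized.
    return sum(num_derivations(i, k) * num_derivations(k, j)
               for k in range(i + 1, j))

print(num_derivations(0, 10))  # 4862 derivations (Catalan number C_9)
```

Only O(n^2) spans are ever computed for an n-word input, which is the compactness that both hypergraph and WFST representations exploit.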
“…Closer to our context, (de Gispert et al, 2010) propose to use Finite-State Transducers in the context of Hierarchical Phrase Based Translation. Their method is to iteratively construct and minimize the full "top-level lattice" representing the whole set of translations bottom-up.…”
Section: Related Work
confidence: 99%