Proceedings of the First Conference on Machine Translation: Volume 1, Research Papers 2016
DOI: 10.18653/v1/w16-2208

Using Factored Word Representation in Neural Network Language Models

Abstract: Neural network language and translation models have recently shown great potential in improving the performance of phrase-based machine translation. At the same time, word representations using different word factors have been used in many state-of-the-art machine translation systems, in order to support better translation quality. In this work, we combined these two ideas by investigating the combination of both techn…
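The core idea in the abstract can be illustrated with a small sketch. The following is not the paper's exact model, but a minimal, hypothetical factored-input language model in PyTorch: each input token is represented by several factors (here assumed to be surface form, lemma, and POS tag) whose embeddings are concatenated before the recurrent layer. All names, dimensions, and vocabulary sizes below are illustrative assumptions.

```python
import torch
import torch.nn as nn

class FactoredInputLM(nn.Module):
    """Hypothetical sketch of a factored-input RNN language model:
    one embedding table per word factor, embeddings concatenated
    and fed to an LSTM that predicts the next surface word."""

    def __init__(self, vocab_sizes, embed_dims, hidden_dim, out_vocab):
        super().__init__()
        # One embedding table per factor (sizes/dims are illustrative).
        self.embeddings = nn.ModuleList(
            nn.Embedding(v, d) for v, d in zip(vocab_sizes, embed_dims)
        )
        self.rnn = nn.LSTM(sum(embed_dims), hidden_dim, batch_first=True)
        self.proj = nn.Linear(hidden_dim, out_vocab)

    def forward(self, factors):
        # factors: list of LongTensors [batch, seq], one per factor stream
        # (e.g. surface-form IDs, lemma IDs, POS-tag IDs).
        x = torch.cat(
            [emb(f) for emb, f in zip(self.embeddings, factors)], dim=-1
        )
        h, _ = self.rnn(x)
        return self.proj(h)  # next-word logits over the surface vocabulary
```

An instance might be built as `FactoredInputLM(vocab_sizes=[50000, 30000, 50], embed_dims=[128, 64, 16], hidden_dim=256, out_vocab=50000)`. The design point the abstract gestures at is that rare surface forms still receive informative input through their lemma and POS embeddings.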

Cited by 13 publications (10 citation statements)
References 16 publications
“…Applying more tightly coupled linguistic factors on the target for NMT has been previously investigated. Niehues et al. (2016) proposed a factored RNN-based language model for re-scoring an n-best list produced by a phrase-based MT system. In recent work, Martínez et al. (2016) implemented a factored NMT decoder which generated both lemmas and morphological tags.…”
Section: Related Work
confidence: 99%
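To make the re-scoring setup quoted above concrete, here is a hedged sketch of log-linear n-best re-ranking. The function name, weight, and scorer interface are hypothetical stand-ins, not Niehues et al.'s implementation; the LM scorer is assumed to return a log-probability for a hypothesis.

```python
# Hypothetical illustration: the phrase-based system supplies candidate
# translations with model scores, a neural LM score is added log-linearly,
# and the n-best list is re-ranked by the combined score.
def rescore_nbest(nbest, lm_score, lm_weight=0.5):
    """nbest: list of (hypothesis, baseline_score) pairs;
    lm_score: callable mapping a hypothesis string to a log-probability;
    lm_weight: assumed interpolation weight (would be tuned in practice)."""
    rescored = [
        (hyp, base + lm_weight * lm_score(hyp)) for hyp, base in nbest
    ]
    # Best-first by combined score; the top entry is the new 1-best output.
    return sorted(rescored, key=lambda pair: pair[1], reverse=True)
```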
“…2016a), and more in line with our proposed approach, pre-translation (Niehues et al. 2016). Recently, transformer-based models have outperformed the attention-based GRU models on various benchmark datasets and have become the state-of-the-art technique in NMT.…”
Section: Results and Analysis
confidence: 99%
“…Factored word representations have also been considered in neural language models (Niehues et al., 2016; Alexandrescu and Kirchhoff, 2006; Wu et al., 2012), and more recently in a neural machine translation architecture as input features (Sennrich and Haddow, 2016) and in the output by separating the lemma and morphological factors (García-Martínez et al., 2016). One contribution of the current paper is the investigation of new variants of the latter architecture.…”
Section: Related Work
confidence: 99%
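As a rough illustration of the lemma/morphology output split mentioned in the last statement (not the cited decoder itself), a factored output layer can predict the two target factors from a single decoder state with separate projections. All names and sizes here are assumptions.

```python
import torch.nn as nn

class FactoredOutputHead(nn.Module):
    """Hypothetical sketch of a factored output layer: from one decoder
    state, predict a lemma and a morphological tag with separate softmax
    layers; the two predictions are later recombined into a surface form."""

    def __init__(self, hidden_dim, n_lemmas, n_morph_tags):
        super().__init__()
        self.lemma_proj = nn.Linear(hidden_dim, n_lemmas)
        self.morph_proj = nn.Linear(hidden_dim, n_morph_tags)

    def forward(self, decoder_state):
        # decoder_state: [batch, hidden_dim]; returns two sets of logits,
        # one over lemmas and one over morphological tags.
        return self.lemma_proj(decoder_state), self.morph_proj(decoder_state)
```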