Proceedings of the Second Workshop on Discourse in Machine Translation 2015
DOI: 10.18653/v1/w15-2508

Part-of-Speech Driven Cross-Lingual Pronoun Prediction with Feed-Forward Neural Networks

Abstract: For some language pairs, pronoun translation is a discourse-driven task which requires information beyond the pronoun's local context. This motivates the task of predicting the correct pronoun given a source sentence and a target translation in which the translated pronouns have been replaced with placeholders. For cross-lingual pronoun prediction, we suggest a neural network-based model using preceding nouns and determiners as features for suggesting antecedent candidates. Our model scores on par with similar mo…
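The setup the abstract describes can be illustrated with a minimal sketch: a one-hidden-layer feed-forward classifier that predicts a target-language pronoun from bag-of-feature context such as preceding nouns and determiners with their POS tags. The label set, feature scheme, and network sizes below are illustrative assumptions, not the paper's actual configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

PRONOUN_CLASSES = ["il", "elle", "ils", "elles"]  # hypothetical label set
VOCAB_SIZE = 50   # toy feature-space size
HIDDEN = 16

VOCAB = {}

def featurize(tokens):
    """Map context tokens (e.g. 'NN:house') to a bag-of-features vector."""
    x = np.zeros(VOCAB_SIZE)
    for t in tokens:
        idx = VOCAB.setdefault(t, len(VOCAB))  # deterministic index per token
        x[idx] += 1.0
    return x

class FeedForwardPronounClassifier:
    def __init__(self):
        self.W1 = rng.normal(0.0, 0.1, (VOCAB_SIZE, HIDDEN))
        self.W2 = rng.normal(0.0, 0.1, (HIDDEN, len(PRONOUN_CLASSES)))

    def forward(self, x):
        h = np.tanh(x @ self.W1)   # hidden layer
        z = h @ self.W2            # class scores
        p = np.exp(z - z.max())
        return h, p / p.sum()      # softmax probabilities

    def train(self, data, lr=0.2, epochs=300):
        # plain SGD on the cross-entropy loss
        for _ in range(epochs):
            for tokens, label in data:
                x = featurize(tokens)
                h, p = self.forward(x)
                grad_z = p.copy()
                grad_z[PRONOUN_CLASSES.index(label)] -= 1.0
                grad_h = self.W2 @ grad_z
                self.W2 -= lr * np.outer(h, grad_z)
                self.W1 -= lr * np.outer(x, grad_h * (1.0 - h**2))

    def predict(self, tokens):
        _, p = self.forward(featurize(tokens))
        return PRONOUN_CLASSES[int(np.argmax(p))]

# Toy data: preceding noun/determiner features -> French pronoun.
train_data = [
    (["DT:the", "NN:house"], "elle"),    # "maison" is feminine
    (["DT:the", "NN:dog"], "il"),
    (["DT:the", "NNS:houses"], "elles"),
    (["DT:the", "NNS:dogs"], "ils"),
]

clf = FeedForwardPronounClassifier()
clf.train(train_data)
print(clf.predict(["DT:the", "NN:house"]))
```

The key point this mirrors from the abstract is that no explicit anaphora resolution is performed: the classifier sees only surface features of preceding nouns and determiners and learns to pick the pronoun class from them.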

Cited by 7 publications (10 citation statements)
References 4 publications

“…UU-TIEDEMANN (Tiedemann, 2015) used a linear support vector machine with local features and simple surface features derived from preceding noun phrases. WHATELLES (Callin et al., 2015) used a neural network classifier based on work by Hardmeier et al. (2013b), but replacing all (explicit or latent) anaphora resolution with information extracted from preceding noun phrases. The IDIAP system (Luong et al., 2015) used a Naïve Bayes classifier and extracted features from both preceding and following noun phrases to account for the possibility of cataphoric references.…”
Section: Submitted Systems
confidence: 99%
“…Some of the best systems developed for these tasks in fact avoided the direct use of anaphora resolution (with the exception of Luong et al. (2015)). For example, Callin et al. (2015) designed a classifier based on a feed-forward neural network, which used the preceding nouns and determiners along with their part-of-speech tags as features. The winning systems of the 2016 task used deep neural networks: Luotolahti et al. (2016) and Dabre et al. (2016) summarized the preceding and following contexts of the pronoun to be predicted and passed them to a recurrent neural network.…”
Section: Coreference-aware Machine Translation
confidence: 99%
“…Several systems developed for this task avoid direct use of anaphora resolution, but still reach competitive performance. Callin et al. (2015) designed a classifier based on a feed-forward neural network, which used the preceding nouns and determiners along with their parts of speech as features. Stymne (2016) combined the local context surrounding the source and target pronouns (lemmas and POS tags) with source-side dependency heads.…”
Section: Related Work
confidence: 99%