Proceedings of the 14th SIGMORPHON Workshop on Computational Research in Phonetics, Phonology, and Morphology 2016
DOI: 10.18653/v1/w16-2005

Morphological Reinflection via Discriminative String Transduction

Abstract: We describe our approach and experiments in the context of the SIGMORPHON 2016 Shared Task on Morphological Reinflection. The results show that the methods of Nicolai et al. (2015) perform well on typologically diverse languages. We also discuss language-specific heuristics and errors.
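The "discriminative string transduction" framing can be pictured as learning character-level edit operations that map a source form to a target form. The snippet below is only a toy illustration of the alignment step, not the transducer of Nicolai et al. (2015); the word pair is a hypothetical German-style example.

```python
from difflib import SequenceMatcher

def edit_script(source: str, target: str):
    """Toy stand-in for the alignment step of a string transducer:
    list the character-level operations that map source to target.
    A learned model would score such operations discriminatively."""
    return [(op, source[i1:i2], target[j1:j2])
            for op, i1, i2, j1, j2 in SequenceMatcher(a=source, b=target).get_opcodes()]

# Hypothetical infinitive -> past participle pair
print(edit_script("spielen", "gespielt"))
# [('insert', '', 'ge'), ('equal', 'spiel', 'spiel'), ('replace', 'en', 't')]
```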

Cited by 15 publications (19 citation statements) · References 6 publications (7 reference statements)
“…For the SIGMORPHON 2016 and the CoNLL-SIGMORPHON 2017 shared tasks (Cotterell et al., 2016, 2017), multiple MRI systems were developed, e.g., (Nicolai et al., 2016; Taji et al., 2016; Kann and Schütze, 2016; Östling, 2016; Makarov et al., 2017). Encoder-decoder neural networks (Cho et al., 2014a) performed best, such that we extend them in this work.…”
Section: Related Work
confidence: 93%
“…This task was addressed through several approaches, including align-and-transduce (Alegria and Etxeberria, 2016; Nicolai et al., 2016; Liu and Mao, 2016); recurrent neural networks (Kann and Schütze, 2016; Östling, 2016); and linguistically-inspired heuristic approaches (Taji et al., 2016; Sorokin, 2016). Overall, recurrent neural network approaches performed better, with Kann and Schütze (2016) being the best-performing system in the shared task, obtaining around 98%.…”
Section: Related Work
confidence: 99%
“…This task was addressed through several approaches, including align-and-transduce (Alegria and Etxeberria, 2016; Nicolai et al., 2016; Liu and Mao, 2016); recurrent neural networks (Kann and Schütze, 2016; Aharoni et al., 2016; Östling, 2016); and linguistically-inspired heuristic approaches (Taji et al., 2016; Sorokin, 2016). Overall, recurrent neural network approaches performed better, with Kann and Schütze (2016) being the best-performing system in the shared task, obtaining around 98%.…”
Section: Related Work
confidence: 99%
“…In the small training data scenario, it is not practical to treat tag sequences as atomic units, as we did in Nicolai et al. (2016), because many tag sequences may be represented by only a single training instance, or not at all. We follow in separating each tag sequence into its component subtags, in order to share information across inflection slots.…”
Section: Tag Splitting
confidence: 99%
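The subtag idea in the snippet above is easy to make concrete. Assuming feature bundles written as comma-separated key=value strings (the exact tag format used by the system may differ), splitting them lets rare full bundles share statistics through their much more frequent components, as in this minimal sketch:

```python
from collections import Counter

def split_tag(tag: str) -> list[str]:
    """Split an atomic tag string like 'pos=V,mood=IND,tense=PST,num=SG'
    into its component subtags."""
    return tag.split(",")

def subtag_counts(tags: list[str]) -> Counter:
    """Count subtags across a corpus; a full tag seen only once still
    contributes evidence to each of its subtags."""
    counts = Counter()
    for tag in tags:
        counts.update(split_tag(tag))
    return counts

# Two full tag sequences that never co-occur still share 'pos=V' and 'num=SG'.
tags = ["pos=V,mood=IND,tense=PST,num=SG", "pos=V,mood=SBJV,tense=PRS,num=SG"]
print(subtag_counts(tags))
# Counter({'pos=V': 2, 'num=SG': 2, 'mood=IND': 1, ...})
```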
“…For example, Liu and Mao (2016) and King (2016) use Conditional Random Fields (CRFs), while Alegria and Etxeberria (2016) and Nicolai et al. (2016) employ different phoneme-to-grapheme translation systems. Other approaches include learning a morphological analyzer from the training data and applying it to reinflect test examples (Taji et al., 2016), and extracting morphological paradigms from the training data which are then applied to test words (Sorokin, 2016).…”
Section: Related Work
confidence: 99%
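As a rough sketch of the paradigm-style approaches mentioned in the last excerpt (not a reimplementation of Taji et al. (2016) or Sorokin (2016); the training pairs and the tag label are hypothetical), one can learn per-tag suffix-rewrite rules from lemma/form pairs and apply the longest matching rule to an unseen word:

```python
from collections import defaultdict

def learn_suffix_rules(pairs):
    """pairs: iterable of (lemma, form, tag). Store suffix-rewrite rules per tag."""
    rules = defaultdict(set)
    for lemma, form, tag in pairs:
        # Longest common prefix; the remainder is the suffix rewrite.
        k = 0
        while k < min(len(lemma), len(form)) and lemma[k] == form[k]:
            k += 1
        rules[tag].add((lemma[k:], form[k:]))
    return rules

def apply_rules(lemma, tag, rules):
    """Apply the longest suffix rule learned for this tag that matches the lemma."""
    best = max((r for r in rules.get(tag, ()) if lemma.endswith(r[0])),
               key=lambda r: len(r[0]), default=None)
    if best is None:
        return lemma  # back off to copying the lemma
    old, new = best
    return lemma[: len(lemma) - len(old)] + new

# Hypothetical Spanish-style training pairs with a made-up tag label.
train = [("hablar", "hablando", "GER"), ("comer", "comiendo", "GER")]
rules = learn_suffix_rules(train)
print(apply_rules("cantar", "GER", rules))  # -> 'cantando'
```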