2010
DOI: 10.1007/978-3-642-15754-7_77

Clustering Morphological Paradigms Using Syntactic Categories

Cited by 9 publications (9 citation statements)
References 10 publications
“…Part of speech information can also be useful, and algorithms have been developed for induction of distributional word classes through sentential contexts (Schütze 1993; Redington et al. 1998). There have been efforts to combine morphology induction and distributional induction of word classes (Parkes et al. 1998; Wicentowski 2002; Clark 2003; Higgins 2003; Freitag 2004, 2005; Hu et al. 2005b; Biemann 2006; Dasgupta and Ng 2007; Can and Manandhar 2009). …”
Section: Unsupervised Induction Of Morphology and Part Of Speech Classes
confidence: 99%
“…The system outperforms other unsupervised morphological segmentation systems that competed in Morpho Challenge 2009 (Kurimo et al. 2009) for the languages Finnish, Turkish, and German. Can and Manandhar (2010) exploit syntactic categories to capture morphological paradigms. In a deterministic scheme, morphological paradigms are learned by pairing syntactic categories and identifying common suffixes between them. …”
Section: Related Work
confidence: 99%
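The idea of identifying common suffixes shared by groups of words can be sketched in a few lines. The following is a hypothetical simplification for illustration only, not the authors' actual algorithm: words that share a sufficiently long stem contribute their differing endings to a candidate suffix set, and stems with matching suffix sets would then form a paradigm (the names `suffix_pairs`, `min_stem`, and the toy word list are all invented here).

```python
from collections import defaultdict

def common_prefix(a, b):
    """Length of the longest common prefix of two words."""
    n = 0
    for x, y in zip(a, b):
        if x != y:
            break
        n += 1
    return n

def suffix_pairs(words, min_stem=3):
    """Collect candidate suffix sets from word pairs sharing a long stem.

    A rough stand-in for paradigm discovery: any two words agreeing on at
    least `min_stem` leading letters are assumed to share a stem, and the
    differing endings are recorded as suffixes of that stem.
    """
    suffixes = defaultdict(set)
    words = sorted(set(words))
    for i, w1 in enumerate(words):
        for w2 in words[i + 1:]:
            k = common_prefix(w1, w2)
            if k >= min_stem:
                suffixes[w1[:k]].update({w1[k:], w2[k:]})
    return suffixes

# Toy input (hypothetical English verb forms, not the paper's data)
paradigms = suffix_pairs(["walk", "walked", "walking",
                          "talks", "talked", "talking"])
```

On this toy input, "walk" and "talk" emerge as stems with overlapping suffix sets ({"", "ed", "ing"} and {"s", "ed", "ing"}), which a clustering step could merge into one paradigm. The real model additionally conditions this pairing on the words' syntactic categories.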
“…Lignos [53] develops an inference procedure that can learn the base form of a word when it is absent in the corpus. Can and Manandhar [14] propose a deterministic model that makes use of syntactic categories. Syntactic information and morphology are strongly connected to each other.…”
Section: Words
confidence: 99%
“…
- Letter Successor Variety models: Harris [39], Hafer and Weiss [36], Dejean [26], Bordag [9,10]
- Minimum Description Length based models: Brent et al. [12], Goldsmith's Linguistica [30,31], Morfessor Baseline MDL [22], Argamon et al. [1], Kazakov & Manandhar [42,43]
- Other deterministic approaches: Bernhard [5], Neuvel and Fulow [60], Keshava and Pitler [44], Monson et al. [57], Lignos et al. [54], Can and Manandhar [14]
- Maximum likelihood models: Morfessor Baseline ML [22], Morfessor Categories ML [23], Probabilistic ParaMor [58]
- Maximum A-Posteriori models: Morfessor Categories MAP [24]
- Bayesian parametric models: Creutz [19], Poon et al. [62]
- Bayesian non-parametric models: Goldwater et al. [32], Can and Manandhar [15], Sirts and Alumäe [67], Dreyer and Eisner [28], Snyder and Barzilay [69]
…”
Section: Introduction
confidence: 99%
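The first family in the list above, Letter Successor Variety (Harris), rests on a simple counting idea that is easy to demonstrate. The sketch below is a crude illustration under assumed defaults (the threshold, the end-of-word marker `#`, and the toy lexicon are all choices made here, not part of any cited system): count how many distinct letters can follow each prefix in a lexicon, and propose a morpheme boundary where that variety spikes.

```python
from collections import defaultdict

def successor_variety(lexicon):
    """For each prefix in the lexicon, count the distinct letters that
    can follow it (Harris-style successor variety); '#' marks word end."""
    followers = defaultdict(set)
    for word in lexicon:
        w = word + "#"
        for i in range(1, len(w)):
            followers[w[:i]].add(w[i])
    return {p: len(s) for p, s in followers.items()}

def segment(word, lexicon, threshold=2):
    """Cut the word after every prefix whose successor variety reaches
    the threshold. A toy segmenter, not a full LSV system."""
    sv = successor_variety(lexicon)
    cuts = [i for i in range(1, len(word)) if sv.get(word[:i], 0) >= threshold]
    pieces, last = [], 0
    for c in cuts + [len(word)]:
        pieces.append(word[last:c])
        last = c
    return pieces

# Toy lexicon (hypothetical): variety spikes after the prefix "report"
lex = ["report", "reports", "reported", "reporting"]
```

Here the prefix "report" can be followed by four different symbols ("s", "e", "i", and word end), while every other prefix admits only one continuation, so a threshold of 3 places the boundary exactly at the stem-suffix split.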