2001
DOI: 10.1162/002438901554586

Empirical Tests of the Gradual Learning Algorithm

Abstract: The Gradual Learning Algorithm (Boersma 1997) is a constraint-ranking algorithm for learning optimality-theoretic grammars. The purpose of this article is to assess the capabilities of the Gradual Learning Algorithm, particularly in comparison with the Constraint Demotion algorithm of Tesar and Smolensky, which initiated the learnability research program for Optimality Theory. We argue that the Gradual Learning Algorithm has a number of special advantages: it can learn free variation, deal effectively with noisy learning data, and account for gradient well-formedness judgments.
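To make the abstract's claims concrete, here is a minimal sketch of one error-driven update step in the spirit of the Gradual Learning Algorithm under Stochastic OT. The data structures, the noise standard deviation, and the plasticity value are illustrative assumptions, not the authors' Praat implementation; the point is only to show how noisy evaluation plus small ranking-value adjustments can accommodate variation and noisy data.

```python
import random

NOISE_SD = 2.0    # evaluation noise standard deviation (illustrative setting)
PLASTICITY = 0.1  # step size for ranking-value adjustments (illustrative)

def evaluate(ranking_values, candidates, violations):
    """Pick the optimal candidate under a noisy constraint ranking.

    ranking_values: dict constraint -> ranking value
    candidates: list of candidate outputs
    violations: dict candidate -> dict constraint -> violation count
    """
    # Stochastic evaluation: perturb each ranking value with Gaussian noise,
    # then order the constraints by their noisy values.
    noisy = {c: v + random.gauss(0, NOISE_SD) for c, v in ranking_values.items()}
    order = sorted(noisy, key=noisy.get, reverse=True)
    # Standard OT evaluation: filter candidates constraint by constraint.
    best = list(candidates)
    for constraint in order:
        fewest = min(violations[cand][constraint] for cand in best)
        best = [cand for cand in best if violations[cand][constraint] == fewest]
        if len(best) == 1:
            break
    return best[0]

def gla_update(ranking_values, observed, candidates, violations):
    """One error-driven step: adjust ranking values only on a mismatch."""
    learner_output = evaluate(ranking_values, candidates, violations)
    if learner_output == observed:
        return  # no error, no update
    for constraint in ranking_values:
        v_learner = violations[learner_output][constraint]
        v_observed = violations[observed][constraint]
        if v_learner > v_observed:
            # The constraint favors the observed form: promote it.
            ranking_values[constraint] += PLASTICITY
        elif v_observed > v_learner:
            # The constraint favors the learner's erroneous form: demote it.
            ranking_values[constraint] -= PLASTICITY
```

Because the noise is resampled at every evaluation, constraints with nearby ranking values trade places from trial to trial, which is how this style of learner models free variation and gradient acceptability.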

Cited by 637 publications (644 citation statements). References 13 publications (9 reference statements).
“…In contrast, vowels are shortened in the same segmental context when no morphological boundary intervenes, e.g., in brood (Aitken, 1981; Scobbie et al., 1999; Scobbie & Stuart-Smith, 2008). Similarly, for some accents of American English, /l/-darkening is reported to apply in canonical coda positions, but also pre-vocalically before a morphological boundary, yielding a contrast between words like hail-y and Hailey (Boersma & Hayes, 2001; Lee-Kim et al., 2013). An even more striking example, since it involves high-frequency words and highly productive suffixation, involves day-s and daze in Belfast English, where the latter is pronounced with a centring diphthong, while the former has a more monophthongal quality (Harris, 1994).…”
Section: Introduction (mentioning)
Confidence: 92%
“…The first learner to be considered here is the basic GLA approach (Boersma 1997, Boersma and Hayes 2001). As already laid out in the introduction, this learner will not pass through an IF stage like the ones discussed in section 2, at least not unassisted.…”
Section: Frequency of Violation and the Problem of IF Stages (mentioning)
Confidence: 99%
“…Grammar learning therefore involves a learner that, given the overt forms of a language and a set of well-formedness constraints, searches through a space of possible grammars (i.e., possible lexicon/ranking combinations) and selects a grammar that is consistent with the overt data and meets certain additional criteria or learning-theoretic biases. The learning processes below define the core operations of the assumed model, adapted largely from models of phonotactic learning (Boersma and Hayes, 2001; Hayes, 2004; Prince and Tesar, 2004; Tesar and Smolensky, 2000).…”
Section: Basic Assumptions of the Learning Model (mentioning)
Confidence: 99%
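The "search through a space of possible grammars" framing in the statement above can be illustrated with a deliberately simplified sketch: it enumerates total constraint rankings only (no lexicon search, no ranking biases) and checks which rankings reproduce a toy set of observed input-output pairs. The data structures are hypothetical and this is not the cited authors' model.

```python
from itertools import permutations

def ot_winner(ranking, candidates, violations):
    """Optimal candidate under a strict total ranking of constraints."""
    best = list(candidates)
    for constraint in ranking:
        fewest = min(violations[cand][constraint] for cand in best)
        best = [cand for cand in best if violations[cand][constraint] == fewest]
        if len(best) == 1:
            break
    return best[0]

def consistent_rankings(constraints, data, violations):
    """Yield every total ranking that derives all observed (input, output) pairs.

    data: list of (input_form, observed_output, candidate_list) triples
    violations: dict input_form -> candidate -> constraint -> violation count
    """
    for ranking in permutations(constraints):
        if all(ot_winner(ranking, cands, violations[inp]) == observed
               for inp, observed, cands in data):
            yield ranking
```

Exhaustive enumeration is only feasible for toy constraint sets; practical learners such as Constraint Demotion or the GLA reach consistent grammars incrementally rather than by brute-force search.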