2008
DOI: 10.1162/ling.2008.39.3.379

A Maximum Entropy Model of Phonotactics and Phonotactic Learning

Abstract: The study of phonotactics is a central topic in phonology. We propose a theory of phonotactic grammars and a learning algorithm that constructs such grammars from positive evidence. Our grammars consist of constraints that are assigned numerical weights according to the principle of maximum entropy. The grammars assess possible words on the basis of the weighted sum of their constraint violations. The learning algorithm yields grammars that can capture both categorical and gradient phonotactic patterns. The al…
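As a rough illustration of the scoring idea the abstract describes (a sketch, not the authors' implementation), the Python snippet below computes a harmony score as the weighted sum of constraint violations and converts it into a maxent probability. The constraint names, weights, and candidate words are invented for the example, and for simplicity the normalization runs over this small candidate set rather than over the full space of possible words used in the paper.

```python
import math

# Hypothetical constraint weights (illustrative values, not taken from the paper).
weights = {"*COMPLEX_ONSET": 0.5, "*BN_ONSET": 4.0, "*#NG": 5.0}

# Hypothetical violation counts for a few candidate words.
violations = {
    "blick": {"*COMPLEX_ONSET": 1, "*BN_ONSET": 0, "*#NG": 0},
    "bnick": {"*COMPLEX_ONSET": 1, "*BN_ONSET": 1, "*#NG": 0},
    "ngick": {"*COMPLEX_ONSET": 0, "*BN_ONSET": 0, "*#NG": 1},
}

def harmony(word):
    """Weighted sum of constraint violations; lower means better-formed."""
    return sum(weights[c] * v for c, v in violations[word].items())

# Maxent value of a word is exp(-harmony); probabilities are normalized here
# over the toy candidate set (the paper normalizes over all possible words).
scores = {w: math.exp(-harmony(w)) for w in violations}
Z = sum(scores.values())
for w, s in scores.items():
    print(f"{w}: harmony={harmony(w):.2f}, P={s / Z:.3f}")
```

Because well-formedness falls off continuously with harmony, the same machinery can express both categorical bans (very large weights) and gradient preferences (moderate weights).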

Help me understand this report

Search citation statements

Order By: Relevance

Paper Sections

Select...
1
1
1
1

Citation Types

14
689
2
7

Year Published

2010
2010
2023
2023

Publication Types

Select...
5
5

Relationship

0
10

Authors

Journals

Cited by 431 publications (712 citation statements) · References 62 publications
“…Indeed, Wilson (2006) implemented a computational version of the Pmap as a way to explain why learners extended novel alternations involving palatalization more often to contexts where it would be less perceptibly noticeable (before high vowels) compared to contexts where it would be more noticeable (before mid vowels). Using maximum entropy grammar models (see also Goldwater & Johnson, 2003; Hayes & Wilson, 2008), Wilson implemented the similarity bias as a prior. A similar approach could be taken to account for the results of the current study (as well as Skoruppa et al.'s (2011) results).…”
Section: Similarity Bias (mentioning)
Confidence: 99%
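The "prior" mentioned in this excerpt can be pictured as a Gaussian penalty that pulls each constraint weight toward a preferred value. The sketch below is a generic, hypothetical illustration of such a penalized maxent objective (not Wilson's 2006 code); the means `mu` and variances `sigma2` are the place where a substantive bias such as perceptual similarity could be encoded.

```python
import numpy as np
from scipy.optimize import minimize

def neg_log_posterior(w, V, counts, mu, sigma2):
    """Penalized maxent objective over a finite candidate set.

    w      : constraint weights, shape (C,)
    V      : V[i, j] = violations of constraint j by candidate i, shape (N, C)
    counts : observed frequency of each candidate, shape (N,)
    mu     : prior means for the weights (where a substantive bias could live)
    sigma2 : prior variances (smaller = stronger pull toward mu)
    """
    harmonies = V @ w                          # weighted violation sums
    log_Z = np.logaddexp.reduce(-harmonies)    # normalizer over the candidate set
    log_lik = np.sum(counts * (-harmonies - log_Z))
    log_prior = -np.sum((w - mu) ** 2 / (2.0 * sigma2))
    return -(log_lik + log_prior)

# Toy data: 3 candidates, 2 constraints; weights kept non-negative during fitting.
V = np.array([[1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
counts = np.array([10.0, 1.0, 0.0])
fit = minimize(neg_log_posterior, x0=np.zeros(2),
               args=(V, counts, np.zeros(2), np.ones(2)),
               bounds=[(0.0, None)] * 2)
print(fit.x)
```

Under this kind of setup, a constraint the prior already favors can retain a high weight even with sparse evidence, which is one concrete way a similarity bias could be built in before learning.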
“…These results are relevant to the current debate on the nature of variation in at least three ways. First, a recent grammatical model, Maximum Entropy grammar, describes phonological patterns without faithfulness constraints (Hayes & Wilson 2008). Like Harmonic Grammar (Legendre et al. 1990, Smolensky & Legendre 2006), Maximum Entropy grammars weight output constraints and compute well-formedness as a function of the weighted sum of constraint violations.…”
Section: Discussion (mentioning)
Confidence: 99%
“…What is crucial instead is the violability of constraints. It would be possible to model the Japanese patterns in other models of grammar which deploy violable constraints, such as Harmonic Grammar (Legendre et al., 1990; Pater, 2009, to appear) or MaxEnt Grammar (Goldwater & Johnson, 2003; Hayes & Wilson, 2008). This reply uses Optimality Theory because it is probably best known; Labrune (2014) too presents an Optimality-Theoretic analysis to derive the quality of /r/ from its structural emptiness as well (pp.…”
Section: Resistance To Gemination (mentioning)
Confidence: 99%