This paper presents a Maximum Entropy learner of grammars and lexicons (MaxLex), and demonstrates that MaxLex has an emergent preference for minimally abstract underlying representations. In order to keep the weight of faithfulness constraints low, the learner attempts to fill gaps in the lexical distribution of segments, making the underlying segment inventory more feature-economic. Even when the learner only has access to individual forms, properties of the entire system are implicitly available through the relative weighting of constraints. These properties lead to a preference for some abstract underlying representations over others, mitigating the computational difficulty of searching a large set of abstract forms. MaxLex is shown to be successful in learning certain abstract underlying forms through simulations based on the [i]~[Ø] alternation in Klamath verbs. The Klamath pattern cannot be represented or learned using concrete underlying representations, but MaxLex successfully learns both the phonotactic patterns and minimally abstract underlying representations.
Whether phonological transformations are in general subregular is an open question. Most attested transformations have been shown to be subsequential, but it is not known whether the weakly deterministic mappings form a proper subset of the regular functions. This paper demonstrates that there are regular functions that are not weakly deterministic; because all attested processes studied so far are weakly deterministic, this result supports the subregular hypothesis.