2011
DOI: 10.1007/978-3-642-22221-4_8
Well-Nestedness Properly Subsumes Strict Derivational Minimalism

Abstract: Minimalist grammars (MGs) constitute a mildly context-sensitive formalism when equipped with a particular locality condition (LC), the shortest move condition. In this format, MGs define the same class of derivable string languages as multiple context-free grammars (MCFGs). Adding another LC to MGs, the specifier island condition (SPIC), results in a proper subclass of derivable languages. It is rather straightforward to see that this class is embedded within the class of languages derivable by some well-nested…
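To make the well-nestedness condition mentioned in the abstract concrete, here is a standard illustrative pair of dimension-2 MCFG productions (a sketch for exposition, not taken from the paper): variables belonging to distinct right-hand-side nonterminals may nest inside one another, but may not cross.

```latex
% Well-nested: B's variables x_1, x_2 wrap around C's variables
% (occurrence pattern B C C B, a nesting configuration)
A(x_1\, y_1\, y_2\, x_2) \leftarrow B(x_1, x_2)\; C(y_1, y_2)

% Not well-nested: occurrences interleave as B C B C,
% a crossing configuration, which well-nestedness forbids
A(x_1\, y_1,\ x_2\, y_2) \leftarrow B(x_1, x_2)\; C(y_1, y_2)
```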

Cited by 7 publications (5 citation statements) · References 26 publications
“…First, note that the strings themselves are enough to infer the "movement" structure, but only when there is a fairly explicit clue in at least some of the strings that exhibit this structure. Secondly, well-nestedness (Kuhlmann and Möhl, 2007; Kanazawa et al, 2011) seems to make things a lot simpler; this tends to support the arguments of Kanazawa et al (2011) for well-nestedness as a condition on mildly context-sensitive language formalisms, alongside the parsing efficiency arguments (Gómez-Rodríguez et al, 2010) and the corpus-based arguments of, among others, Kuhlmann and Nivre (2006). Note that, for example, MIX (Kanazawa and Salvati, 2012) is not in the class of languages defined; the strong learning classes are much more restricted in linguistically significant ways than the weak learning algorithms considered in Hao (2019).…”
Section: Discussion
confidence: 99%
“…If we restrict the functions to these sets, we obtain a normal form for the class of well-nested MCFGs of dimension 2: for every language generated by a well-nested MCFG of dimension 2, there is a grammar in this class that generates the same language and uses only productions with functions in these sets. This can be demonstrated using standard techniques (e.g. Kanazawa et al (2011)), adapted from the standard approaches for converting CFGs into Chomsky normal form. Indeed, although we allow productions of rank greater than 2 in this paper, restricting productions to those of rank 2 does not change the set of languages generated, as these well-nested grammars can be binarised.…”
Section: Grammars
confidence: 99%
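The crossing-versus-nesting distinction underlying this normal form can be checked mechanically. The sketch below is a hypothetical helper (not code from any cited paper) that tests whether the variable-occurrence pattern of a single non-deleting MCFG production is well-nested, assuming the pattern lists the right-hand-side nonterminal of each variable occurrence in left-to-right order across the production's components.

```python
from itertools import combinations

def is_well_nested(occurrence_pattern):
    """Check whether a production's argument-variable pattern is well-nested.

    occurrence_pattern: sequence of right-hand-side nonterminal labels,
    one entry per variable occurrence, read left to right through the
    production's components.

    Two nonterminals B and C cross iff their occurrences interleave as
    B ... C ... B ... C; well-nestedness forbids every such crossing.
    """
    symbols = set(occurrence_pattern)
    for b, c in combinations(symbols, 2):
        # Keep only occurrences of b and c, then collapse adjacent runs.
        filtered = [s for s in occurrence_pattern if s in (b, c)]
        collapsed = [s for i, s in enumerate(filtered)
                     if i == 0 or filtered[i - 1] != s]
        # A crossing shows up as an alternation of length >= 4 (B C B C ...).
        if len(collapsed) >= 4:
            return False
    return True
```

For instance, the pattern of A(x1 y1 y2 x2) ← B(x1, x2) C(y1, y2) is ["B", "C", "C", "B"] (nested, accepted), while A(x1 y1, x2 y2) ← B(x1, x2) C(y1, y2) yields ["B", "C", "B", "C"] (crossing, rejected).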
“…MG[+SpIC] and variants have been studied in a series of papers (Gärtner & Michaelis; Kanazawa, Michaelis, Salvati, & Yoshinaka; Kobele & Michaelis; Michaelis; Michaelis; Stabler). Recently, Kobele and Michaelis () show that blocking extraction from specifiers formed by merge reduces expressive power, while prohibiting only extractions from inside specifiers formed by move does not.…”
Section: Comparisons and Extensions
confidence: 99%
“…Salvati suggests that the class of languages characterized by the well-nested subset of MCFGs, MCFG_WN, might formally correspond to the set of MCS languages, and shows that TAG is weakly in MCFG_WN, while Kanazawa and Salvati (2012) show that MIX is not in MCFL_WN. However, the MCFGs to which Stabler's MGs correspond are known to be non-well-nested (Boston et al, 2010), although Kanazawa et al (2011) show that the addition of a further Specifier Island Constraint to MGs restricts them to a subset of MCFG_WN.…”
Section: Child Language Acquisition
confidence: 99%