The use of parameters in the description of natural language syntax must balance the need to discriminate among (sometimes subtly different) languages, which can be seen as a cross-linguistic version of Chomsky's descriptive adequacy (Chomsky, 1964), against the complexity of the acquisition task that a large number of parameters would imply, which is a problem for explanatory adequacy. Here we first present a novel approach in which machine learning is used to detect hidden dependencies in a table of parameters. The result is a dependency graph in which some of the parameters can be fully predicted from others. These findings can then be subjected to linguistic analysis, which may either refute them by providing typological counterexamples from languages not included in the original dataset, dismiss them on theoretical grounds, or uphold them as tentative empirical laws worthy of further study. Machine learning is also used to explore the full sets of parameters that are sufficient to distinguish one historically established language family from others. These results provide a new type of empirical evidence about the historical adequacy of parameter theories.
In this paper we revisit and revise the typology of multiple questions and multiple wh-fronting (MWF) in the light of data from Romeyka, a Greek variety spoken in Pontus, Turkey, and from another Pontic Greek variety spoken in northern Greece. Both varieties provide evidence for wh-fronting as focus movement, their most striking feature being the availability of single-pair interpretations despite strict Superiority. It turns out that the parametric system deriving the space of variation in multiple wh-fronting must be extended to accommodate the facts presented here, which seem to instantiate a further type of MWF (with a corresponding type of non-MWF languages) not predicted by the existing typology. At the same time, viewed from a crosslinguistic perspective, the Romeyka facts may help us uncover independent restrictions on the possibilities that this parametric system makes available. We propose that the availability of peripheral positions and their activation in the left or low periphery may be a point of parametric variation. Furthermore, still complying with Bošković's (2007) theory of Attract-1/all, certain Focus heads can be Attract-1, thus deriving the compatibility of Superiority with single-pair readings. Finally, we present some speculations about a potential correlation between word order/head directionality in the clausal domain and the kind of information structure-related head (e.g. Topic vs. Focus) that can take on an Attract-1 feature.