Minimalist grammars (MGs) and multiple context-free grammars (MCFGs) are weakly equivalent in the sense that they define the same languages, a large mildly context-sensitive class that properly includes the context-free languages. But in addition, for each MG, there is an MCFG which is strongly equivalent in the sense that it defines the same language with isomorphic derivations. However, the structure-building rules of MGs, but not those of MCFGs, are defined in a way that generalizes across categories. Consequently, MGs can be exponentially more succinct than their MCFG equivalents, and this difference shows in parsing models too. An incremental, top-down beam parser for MGs is defined here, sound and complete for all MGs, and hence also capable of parsing all MCFG languages. But since the parser represents its grammar transparently, the relative succinctness of MGs is again evident. Although the determinants of MG structure are narrowly and discretely defined, probabilistic influences from a much broader domain can affect even the earliest analytic steps, allowing frequency and context effects to come early and from almost anywhere, as expected in incremental models.

Keywords: Grammar; Parsing; Minimalist grammar; Succinctness; Multiple context-free grammar

A psychological model is not adequate if a response, any response actually due to the mechanism being modeled, is simply not in the range of the model. But consider two models that agree on the range of behaviors to be modeled; in fact, suppose their input/output behaviors are provably identical. Then can there be a reason to prefer one over the other? Yes. It is a familiar fact that very different algorithms, with very different data structures, can compute exactly the same function. And in such cases, it can matter which one is implemented.
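The point that very different algorithms can compute exactly the same function, while differing sharply in the resources they use, can be sketched with a familiar toy example (this illustration is not from the paper; the function and names are hypothetical stand-ins):

```python
def fib_recursive(n):
    # Exponential-time algorithm: recomputes subproblems via the call stack.
    return n if n < 2 else fib_recursive(n - 1) + fib_recursive(n - 2)

def fib_iterative(n):
    # Linear-time algorithm: carries only a pair of integers as state.
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

# Input/output behavior is provably identical on all inputs checked here,
# yet the two implementations differ greatly in data structures and cost.
assert all(fib_recursive(n) == fib_iterative(n) for n in range(20))
```

Just as one of these implementations may be strongly preferred despite extensional equivalence, two grammar formalisms defining the same languages (and even isomorphic derivations) may still differ in ways that matter, such as succinctness.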
Since recent mathematical work on grammars has established a wide range of equivalence results, comparisons of models that are in some relevant sense

Correspondence should be sent to Edward P. Stabler, UCLA Linguistics,