We propose a mathematical framework that unifies the distributional theory of meaning, formulated in terms of vector space models, with a compositional theory of grammatical types, namely Lambek's pregroup semantics. A key observation is that both the monoidal category of (finite-dimensional) vector spaces and linear maps with the tensor product, and any pregroup, are examples of compact closed categories. Since, by definition, a pregroup is a compact closed category with at most one morphism between any two objects, its compositional content is reflected within the compositional structure of any non-degenerate compact closed category. A slightly refined category of vector spaces enables us to compute the meaning of a well-typed compound sentence from the meanings of its constituents, by 'lifting' the type-reduction mechanism of pregroup semantics to the whole category. These sentence meanings live in a single space, independent of the grammatical structure of the sentence; hence the inner product can be used to compare the meanings of arbitrary sentences. A variation of this procedure, in which the scalars of the vector spaces are constrained to the semiring of Booleans, recovers the well-known Montague semantics.
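As a concrete illustration of the framework sketched above, the following is a minimal toy example (not taken from the paper itself): noun meanings are vectors in a noun space N, a transitive verb of pregroup type n^r · s · n^l is an element of N ⊗ S ⊗ N (a rank-3 tensor), and the pregroup type reduction n (n^r s n^l) n → s becomes a tensor contraction. The particular vectors, the tensor entries, and the dimensions chosen are hypothetical, purely for demonstration.

```python
import numpy as np

# Hypothetical 2-dimensional noun space N and 2-dimensional sentence space S.
alice = np.array([1.0, 0.0])   # toy noun vectors (assumed, not from the paper)
bob   = np.array([0.0, 1.0])

# A transitive verb as a rank-3 tensor in N (x) S (x) N.
likes = np.zeros((2, 2, 2))
likes[0, 0, 1] = 1.0           # (alice, ·, bob) contributes to sentence basis s0
likes[1, 1, 0] = 1.0           # (bob, ·, alice) contributes to sentence basis s1

def sentence_meaning(subj, verb, obj):
    # The pregroup reduction n (n^r s n^l) n -> s, lifted to vector spaces,
    # contracts the subject with the verb's first index and the object
    # with its last index, leaving a vector in the sentence space S.
    return np.einsum('i,isj,j->s', subj, verb, obj)

s1 = sentence_meaning(alice, likes, bob)   # meaning of "Alice likes Bob"
s2 = sentence_meaning(bob, likes, alice)   # meaning of "Bob likes Alice"

# Both meanings live in the same space S, so the inner product compares
# arbitrary sentences regardless of their grammatical structure.
similarity = float(np.dot(s1, s2))
```

Restricting the scalars from the reals to the Boolean semiring (entries in {0, 1}, with disjunction as addition and conjunction as multiplication) turns such sentence vectors into truth values, which is the variation that recovers Montague-style semantics.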