Matrices satisfying the Restricted Isometry Property (RIP) play an important role in compressed sensing and statistical learning. RIP matrices with optimal parameters are mainly obtained via probabilistic arguments, as explicit constructions seem hard. In this paper, we bridge this gap between random and deterministic designs by introducing a new model for restricted isometry designs that incorporates a fixed matrix into the construction. Our construction starts with a fixed (deterministic) matrix X satisfying a simple stable rank condition, and we show that the matrix XR, where R is a random matrix drawn from various popular probabilistic models (including subgaussian matrices, sparse matrices, low-randomness matrices, and matrices satisfying a convex concentration property), satisfies the RIP with high probability. These theorems have applications in signal recovery, deep learning, random matrix theory, and dimensionality reduction, among other areas. Additionally, motivated by an application to understanding the effectiveness of the word vector embeddings popular in natural language processing and machine learning, we investigate the RIP of the matrix XR^(ℓ), where R^(ℓ) is formed by taking all possible (disregarding order) ℓ-way entrywise products of the columns of a random matrix R.
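For reference, the standard definitions behind the terms above can be stated briefly; the notation here (s, δ_s, sr) is illustrative and not fixed by the abstract itself. A matrix A satisfies the RIP of order s with constant δ_s ∈ (0,1) if, for every s-sparse vector x,
\[
  (1 - \delta_s)\,\|x\|_2^2 \;\le\; \|Ax\|_2^2 \;\le\; (1 + \delta_s)\,\|x\|_2^2 ,
\]
and the stable rank of the fixed matrix X, the quantity the "simple stable rank condition" concerns, is
\[
  \mathrm{sr}(X) \;=\; \frac{\|X\|_F^2}{\|X\|^2},
\]
where \|X\| denotes the spectral norm. One natural reading of the ℓ-fold construction, consistent with the description above, is that the columns of R^(ℓ) are the entrywise (Hadamard) products of ℓ columns of R taken over all multisets of column indices:
\[
  R^{(\ell)} \;=\; \bigl( r_{i_1} \circ r_{i_2} \circ \cdots \circ r_{i_\ell} \bigr)_{1 \le i_1 \le i_2 \le \cdots \le i_\ell},
\]
where r_i denotes the i-th column of R and ∘ the entrywise product.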