In this paper, we introduce an uplifted reduced order modeling (UROM) approach that integrates standard projection-based methods with long short-term memory (LSTM) embedding. Our approach has three modeling layers or components. In the first layer, we utilize an intrusive projection approach to model the dynamics represented by the largest modes. The second layer consists of an LSTM model that accounts for the residuals beyond this truncation; this closure layer incorporates the residual effect of the discarded modes into the dynamics of the largest scales. However, the feasibility of generating a low-rank approximation tails off for systems with higher Kolmogorov n-width due to the underlying nonlinear processes. The third, uplifting layer, called super-resolution, addresses this limited representation by expanding the span to a larger number of modes, utilizing the versatility of LSTM. Our model therefore integrates a physics-based projection model with a memory-embedded LSTM closure and an LSTM-based super-resolution model. In several applications, we exploit the Grassmann manifold to construct UROM for unseen conditions. We perform numerical experiments using the Burgers and Navier-Stokes equations with quadratic nonlinearity. Our results show the robustness of the proposed approach in building reduced order models for parameterized systems and confirm an improved trade-off between accuracy and efficiency.

Keywords Hybrid analysis and modeling · Supervised machine learning · Long short-term memory · Model reduction · Galerkin projection · Grassmann manifold

Reduced order models (ROMs) have shown great success for prototypical problems in different fields. In particular, Galerkin projection (GP), coupled with the capability of proper orthogonal decomposition (POD) to extract the most energetic modes, has been used to build ROMs for linear and nonlinear systems [22][23][24][25][26][27][28][29].
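As a minimal illustration of the POD step (not the paper's implementation), the most energetic modes can be extracted from a snapshot matrix via the singular value decomposition. The synthetic snapshot matrix `A` and the retained rank `r` below are assumptions for illustration only:

```python
import numpy as np

# Synthetic snapshot matrix: n_dof spatial degrees of freedom, n_snap snapshots.
# (A real application would stack solution snapshots of the PDE as columns.)
rng = np.random.default_rng(0)
A = rng.standard_normal((200, 50))

# Thin SVD: left singular vectors are the POD modes, ordered by energy content.
U, s, _ = np.linalg.svd(A, full_matrices=False)

r = 8                 # number of retained (most energetic) modes -- an assumption
Phi = U[:, :r]        # POD basis; columns are orthonormal modes

# Project snapshots onto the basis and reconstruct the rank-r approximation.
a = Phi.T @ A         # modal coefficients (r x n_snap)
A_rom = Phi @ a       # low-rank approximation of the snapshot data

# Fraction of snapshot "energy" captured by the first r modes.
energy = np.sum(s[:r] ** 2) / np.sum(s ** 2)
```

A Galerkin-projection ROM would then substitute the expansion spanned by `Phi` into the governing equations to obtain a small system for the coefficients `a`; the truncated modes' residual effect is what the paper's LSTM closure layer models.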
These ROMs preserve sufficient
arXiv:1912.06756v1 [physics.flu-dyn]