Estimating the unconstrained mean and covariance matrix is a popular topic in statistics. However, estimation of the parameters of N_p(µ, Σ) under joint constraints such as Σµ = µ has not received much attention. It can be viewed as a multivariate counterpart of the classical estimation problem in the N(θ, θ²) distribution. In addition to the usual inference challenges under such non-linear constraints among the parameters (curved exponential family), one has to deal with the basic requirements of symmetry and positive definiteness when estimating a covariance matrix. We derive the non-linear likelihood equations for the constrained maximum likelihood estimator of (µ, Σ) and solve them using iterative methods. In general, the MLEs of covariance matrices computed using iterative methods do not satisfy the constraints. We propose a novel algorithm to modify such (infeasible) estimators, or any other (reasonable) estimator. The key step is to re-align the mean vector along the eigenvectors of the covariance matrix using the idea of regression. When using the Lagrangian function for constrained MLE (Aitchison and Silvey, 1958), the Lagrange multiplier becomes entangled with the parameters of interest and presents another computational challenge. We handle this by either iterative or explicit calculation of the Lagrange multiplier. The existence and location of the constrained MLE are explored within a data-dependent convex set using recent results from random matrix theory. A simulation study illustrates our methodology and shows that the modified estimators perform better than the initial estimators from the iterative methods.
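The re-alignment idea can be sketched in code. The constraint Σµ = µ says that µ must lie in the eigenspace of Σ associated with eigenvalue 1, so one simple feasibility repair is to eigendecompose the unconstrained covariance estimate, force the eigenvalue of the direction most aligned with the estimated mean to 1, and regress (project) the mean onto that eigenvector. This is a minimal illustration under those assumptions, not the paper's exact algorithm; the function name `make_feasible` is hypothetical.

```python
import numpy as np

def make_feasible(mu_hat, sigma_hat):
    """Modify (mu_hat, sigma_hat) so the constraint sigma @ mu == mu holds.

    Illustrative sketch only (not the paper's algorithm): pick the
    eigenvector of sigma_hat most aligned with mu_hat, set its eigenvalue
    to 1, and replace mu_hat by its least-squares projection onto that
    eigenvector, preserving symmetry and positive definiteness.
    """
    # Eigendecomposition of the symmetric covariance estimate
    vals, vecs = np.linalg.eigh(sigma_hat)
    # Index of the eigenvector most aligned with the estimated mean
    k = int(np.argmax(np.abs(vecs.T @ mu_hat)))
    v = vecs[:, k]                      # unit eigenvector
    # Force the corresponding eigenvalue to 1
    vals = vals.copy()
    vals[k] = 1.0
    sigma_new = vecs @ np.diag(vals) @ vecs.T
    # Regress mu_hat on v: projection coefficient is v'mu (v has unit norm)
    mu_new = (v @ mu_hat) * v
    return mu_new, sigma_new
```

Because the repaired mean is an exact unit-eigenvalue eigenvector of the repaired covariance, the constraint holds to machine precision, and positive definiteness is preserved whenever the initial estimate is positive definite.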