…where the response $y$ is an $n \times 1$ vector, $X \in \mathbb{R}^{n \times p}$ is a known matrix whose rows $x_i$, $i = 1, \ldots, n$, contain the predictors, with $p > n$, the regression coefficient $\beta$ is a $p \times 1$ vector, and $\epsilon$ is the random error vector, assumed to be normally distributed with mean $0$ and variance $\sigma^2 \in \mathbb{R}_{+}$. In low-dimensional settings, where $p < n$, $\beta$ is commonly estimated by ordinary least squares (OLS). OLS minimizes the squared $L_2$ norm $\lVert y - X\beta \rVert_2^2$ with respect to $\beta$, but it fails to give a unique estimate in high-dimensional settings where $p > n$.[21] Another threat to the performance of OLS is multicollinearity, which arises from correlation or linear dependence among the predictors.[22-27] Biased estimators such as the ridge regression estimator,[28] the Liu estimator,[29] the modified ridge-type estimator,[30] the Kibria-Lukman (KL) estimator,[31] the robust principal component (PC)-ridge estimator,[24] the JKL estimator,[22] and others have been developed to address the multicollinearity problem in linear regression models.…
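A minimal sketch (not from the paper, with illustrative values of $n$, $p$, and the ridge parameter `lam`) of why OLS breaks down when $p > n$ and how ridge-type shrinkage restores a unique solution: the normal-equations matrix $X^\top X$ has rank at most $n < p$ and is therefore singular, whereas $X^\top X + \lambda I$ is invertible for any $\lambda > 0$.

```python
# Sketch: OLS non-uniqueness when p > n vs. ridge regression.
# All dimensions and the penalty lam are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n, p = 20, 50                      # high-dimensional setting: p > n
X = rng.standard_normal((n, p))    # known n x p predictor matrix
beta_true = np.zeros(p)
beta_true[:5] = 1.0                # a few nonzero true coefficients
y = X @ beta_true + 0.1 * rng.standard_normal(n)  # y = X beta + eps

# OLS solves (X'X) beta = X'y, but rank(X'X) <= n < p,
# so X'X is singular and the minimizer of ||y - X beta||_2^2 is not unique.
XtX = X.T @ X
print("rank(X'X) =", np.linalg.matrix_rank(XtX), "< p =", p)

# Ridge adds lam * I, making the system full rank for any lam > 0:
# beta_ridge = (X'X + lam * I)^{-1} X'y is unique.
lam = 1.0
beta_ridge = np.linalg.solve(XtX + lam * np.eye(p), X.T @ y)
print("first 5 ridge coefficients:", np.round(beta_ridge[:5], 3))
```

The same rank argument explains why multicollinearity degrades OLS even when $p < n$: near-linear dependence among predictors makes $X^\top X$ ill-conditioned, which the biased estimators cited above are designed to counteract.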