2022
DOI: 10.1002/cpe.6803
Combining principal component and robust ridge estimators in linear regression model with multicollinearity and outlier

Abstract: The method of least squares suffers a setback when multicollinearity and outliers are present in the linear regression model. In this article, we developed a new estimator to jointly handle multicollinearity and outliers by pooling the following estimators together: the M-estimator, the principal component estimator, and the ridge estimator. The new estimator is called the robust r-k estimator. We established theoretically that the new estimator is better than some of the existing ones. The simulation studie…
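The abstract only names the ingredients, so the sketch below is an illustrative reconstruction of the general recipe rather than the authors' algorithm: ridge regression on the leading principal-component scores, made robust with Huber-type IRLS weights. The function names, the Huber constant c = 1.345, and the iteration settings are all assumptions.

```python
import numpy as np

def huber_weights(u, c=1.345):
    # Huber weights: 1 inside the threshold, c/|u| outside.
    a = np.abs(u)
    return np.where(a <= c, 1.0, c / np.maximum(a, 1e-12))

def robust_rk(X, y, n_components, k, n_iter=20):
    # Illustrative robust r-k sketch (not the paper's code):
    # ridge on principal-component scores with Huber IRLS weights.
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    T = Vt[:n_components].T              # p x r eigenvector matrix
    Z = Xc @ T                           # component scores (n x r)
    w = np.ones(len(y))
    for _ in range(n_iter):
        ZtW = Z.T * w                    # rows of Z^T scaled by the weights
        alpha = np.linalg.solve(ZtW @ Z + k * np.eye(n_components), ZtW @ yc)
        r = yc - Z @ alpha
        s = np.median(np.abs(r - np.median(r))) / 0.6745   # MAD scale
        w = huber_weights(r / max(s, 1e-12))
    return T @ alpha                     # coefficients in the original space
```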

Cited by 9 publications (11 citation statements)
References 39 publications (106 reference statements)
“…OLS minimizes the $L_2$ criterion $\|\boldsymbol{y}-\boldsymbol{X\beta}\|_2^2$ with respect to $\boldsymbol{\beta}$ but fails to give a unique estimate in high-dimensional settings when $p > n$.21 Another threat to the performance of OLS is multicollinearity, which surfaces as a result of correlation or linear dependency among the predictors.22–27 Biased estimators such as the ridge regression estimator,28 the Liu estimator,29 the modified ridge-type estimator,30 the Kibria–Lukman (KL) estimator,31 the robust principal component (PC)–ridge estimator,24 the JKL estimator,22 and others were developed to account for the multicollinearity problem in linear regression models.…”
Section: Theoretical Background
confidence: 99%
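A tiny numeric check (my own construction, not from the cited papers) shows what this excerpt describes: a near-duplicate predictor makes $\boldsymbol{X}^T\boldsymbol{X}$ ill-conditioned, so the OLS solve becomes unstable, while a small ridge penalty $k$ restores a stable solution.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
x1 = rng.normal(size=n)
x2 = x1 + 1e-6 * rng.normal(size=n)      # x2 almost duplicates x1
x3 = rng.normal(size=n)
X = np.column_stack([x1, x2, x3])
y = X @ np.array([1.0, 1.0, 0.5]) + 0.1 * rng.normal(size=n)

XtX = X.T @ X
print(f"cond(X^T X) = {np.linalg.cond(XtX):.2e}")   # astronomically large

beta_ols = np.linalg.solve(XtX, X.T @ y)                      # unstable
beta_ridge = np.linalg.solve(XtX + 0.1 * np.eye(3), X.T @ y)  # stabilized
print("OLS:  ", beta_ols)
print("Ridge:", beta_ridge)
```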
“…The ridge estimator28 minimizes $\|\boldsymbol{y}-\boldsymbol{X\beta}\|_2^2$ subject to an $L_2$-norm penalty on the coefficients and is defined as follows:
$$\hat{\boldsymbol{\beta}}_k = \arg\min_{\boldsymbol{\beta}\in\mathbb{R}^p}\left\{\|\boldsymbol{y}-\boldsymbol{X\beta}\|_2^2 + k\|\boldsymbol{\beta}\|_2^2\right\} = \left(\boldsymbol{X}^T\boldsymbol{X} + k\boldsymbol{I}_p\right)^{-1}\boldsymbol{X}^T\boldsymbol{y},$$
where $k > 0$ is the tuning parameter and $\boldsymbol{I}_p$ is the identity matrix with … and is better than the OLS estimator.…”
Section: Theoretical Background
confidence: 99%
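The closed form above translates directly into a few lines of linear algebra; a minimal sketch (the function and variable names are my own):

```python
import numpy as np

def ridge(X, y, k):
    # Closed-form ridge estimator: (X^T X + k I_p)^{-1} X^T y, with k > 0.
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)

# As k -> 0 this recovers OLS (when X^T X is invertible);
# larger k shrinks the coefficients more strongly.
```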
“…Outliers are data points that differ significantly from other observations and can have a substantial impact on model estimates.8,9 They threaten the efficiency of the OLS estimator,8–11 and it is well known that robust estimators are preferred when dealing with outliers.12–19 However, both multicollinearity and outliers can exist simultaneously in a model.…”
Section: Introduction
confidence: 99%
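As a concrete instance of the robust estimators this excerpt refers to, here is a minimal Huber M-estimator fitted by iteratively reweighted least squares; the tuning constant c = 1.345 (roughly 95% efficiency under Gaussian errors) and the iteration cap are conventional choices, not taken from the cited works.

```python
import numpy as np

def huber_m_estimate(X, y, c=1.345, n_iter=50):
    # Huber M-estimator via IRLS: downweights large residuals
    # instead of squaring them, limiting the pull of outliers.
    beta = np.linalg.lstsq(X, y, rcond=None)[0]           # OLS start
    for _ in range(n_iter):
        r = y - X @ beta
        s = np.median(np.abs(r - np.median(r))) / 0.6745  # MAD scale
        u = r / max(s, 1e-12)
        w = np.where(np.abs(u) <= c, 1.0, c / np.abs(u))  # Huber weights
        Xw = X * w[:, None]                               # W X
        beta = np.linalg.solve(Xw.T @ X, Xw.T @ y)        # weighted LS step
    return beta
```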
“…All of the aforementioned techniques behave differently when an outlier is present. An outlier produces a discernible change in model estimates, including predicted values, estimated variance, and other quantities [20][21][22]. Additionally, the presence of an outlying observation violates the normality assumption [20,21].…”
Section: Introduction
confidence: 99%
“…The estimator's breakdown point is extremely low; a single outlying value, for example, had a significant impact on TLS's performance [20][21][22]. When the LRM is contaminated by extreme values or influential observations, a robust approach is a viable alternative to TLS.…”
Section: Introduction
confidence: 99%
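The low breakdown point of least squares is easy to demonstrate. In the toy fit below (synthetic data, my own construction), corrupting a single response value out of thirty drags the OLS intercept and slope far from the truth.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 30
x = np.linspace(0.0, 1.0, n)
X = np.column_stack([np.ones(n), x])
y = 2.0 * x + 0.05 * rng.normal(size=n)

y_bad = y.copy()
y_bad[-1] += 50.0                          # one gross outlier

ols = lambda X, y: np.linalg.lstsq(X, y, rcond=None)[0]
print("clean:    ", ols(X, y))             # roughly [0, 2]
print("1 outlier:", ols(X, y_bad))         # pulled far from the truth
```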