2020
DOI: 10.1007/s40995-020-01019-7

Performance of Some New Ridge Parameters in Two-Parameter Ridge Regression Model

Abstract: Two-parameter ridge regression has been widely used over the last two decades to circumvent the problem of multicollinearity. The ridge parameter k plays an important role in such situations, and several methods are available in the literature for its estimation. Under high multicollinearity, however, the available methods do not perform well in terms of mean squared error. In this article, we propose some new estimators for the ridge parameter. Based on a simulation study, our new estimators generally perform b…
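The estimators the abstract compares are built on the ordinary ridge form β̂(k) = (X'X + kI)⁻¹X'y. The paper's new ridge-parameter estimators are not reproduced in this report, so the sketch below uses the classical Hoerl–Kennard choice of k purely for illustration (an assumption; the paper's own proposals differ):

```python
import numpy as np

def ridge_estimator(X, y, k):
    """Ridge estimate (X'X + kI)^{-1} X'y."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)

def hoerl_kennard_k(X, y):
    """Classical Hoerl-Kennard ridge parameter k = sigma_hat^2 / max(alpha_hat^2),
    where alpha_hat are the OLS coefficients in the eigenvector basis of X'X."""
    n, p = X.shape
    beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]
    resid = y - X @ beta_ols
    sigma2 = resid @ resid / (n - p)          # unbiased error-variance estimate
    _, V = np.linalg.eigh(X.T @ X)
    alpha = V.T @ beta_ols                    # canonical-form OLS coefficients
    return sigma2 / np.max(alpha ** 2)

# Collinear toy data: the second column is nearly a copy of the first.
rng = np.random.default_rng(0)
x1 = rng.normal(size=100)
X = np.column_stack([x1, x1 + 0.01 * rng.normal(size=100), rng.normal(size=100)])
y = X @ np.array([1.0, 1.0, 1.0]) + rng.normal(size=100)

k = hoerl_kennard_k(X, y)
beta_ridge = ridge_estimator(X, y, k)
beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]
```

For any k > 0 the ridge solution has a strictly smaller Euclidean norm than the OLS solution; that shrinkage is what stabilises the estimate when X'X is ill-conditioned.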

Cited by 10 publications (2 citation statements)
References 20 publications
“…Then, the diagonal elements r_jj of this matrix are the variance inflation factors of the ridge estimator. As a rule of thumb, when k is chosen so that all r_jj ⩽ 10, the ridge estimate is relatively stable [22]. In the stable case, the error of the ridge estimate is often greater than that of the least squares estimate [23].…”
Section: Ridge Estimation-SQP Algorithm
confidence: 99%
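The rule of thumb quoted above — pick k so that every variance inflation factor of the ridge estimator is at most 10 — can be sketched numerically. This is a minimal numpy illustration, assuming X is standardized so that X'X is in correlation form; the function names are my own, not from the cited work:

```python
import numpy as np

def ridge_vif(X, k):
    """Diagonal of (X'X + kI)^{-1} X'X (X'X + kI)^{-1}: for standardized X,
    these are the variance inflation factors of the ridge estimator."""
    p = X.shape[1]
    A_inv = np.linalg.inv(X.T @ X + k * np.eye(p))
    return np.diag(A_inv @ (X.T @ X) @ A_inv)

def smallest_stable_k(X, candidates):
    """First candidate k whose ridge VIFs are all <= 10 (the quoted rule of thumb)."""
    for k in candidates:
        if np.all(ridge_vif(X, k) <= 10.0):
            return k
    return None

# Severely collinear design, standardized so X'X is a correlation matrix.
rng = np.random.default_rng(1)
z = rng.normal(size=200)
X = np.column_stack([z, z + 0.05 * rng.normal(size=200), rng.normal(size=200)])
X = (X - X.mean(axis=0)) / (X.std(axis=0) * np.sqrt(len(X)))

k_stable = smallest_stable_k(X, np.linspace(0.0, 1.0, 101))
```

At k = 0 the VIFs reduce to the diagonal of (X'X)⁻¹, which is large under strong collinearity; increasing k drives every VIF down monotonically, so a small positive k already satisfies the threshold here.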
“…Contemporary research has also been presented on two-parameter estimators to cope with the problem of multicollinearity; these authors include References 5–22. Liu [13] identified that, in the case of severe multicollinearity, the RRE and the LE may not fully address the ill-conditioned problem, and suggested a two-parameter LE by extending Equation () as

$$ -k^{-1/2} d \hat{\beta}_{OLS} = \beta + \varepsilon, \qquad -\infty \le d \le \infty, \quad k > 0, $$

from which the following two-parameter LE is obtained:

$$ \hat{\beta}(k,d) = \left(X'X + kI\right)^{-1}\left(X'y - d\beta^{*}\right), \qquad k > 0, \quad -\infty < d < \infty, $$

where $k$ is the ridge parameter that controls the condition number of $X'X + kI$ to the desired level, $d$ is the Liu parameter used to improve the fit, and $\beta^{*}$ can be any estimator of $\beta$.…”
Section: Introduction
confidence: 99%
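The two-parameter estimator quoted above reduces to the ordinary ridge estimator at d = 0, which gives a convenient sanity check. A minimal numpy sketch, taking β* as the OLS estimate (one of the choices the text permits):

```python
import numpy as np

def liu_two_parameter(X, y, k, d, beta_star=None):
    """Two-parameter Liu estimator (X'X + kI)^{-1} (X'y - d * beta_star).
    beta_star defaults to the OLS estimate, one admissible choice of beta*."""
    p = X.shape[1]
    if beta_star is None:
        beta_star = np.linalg.lstsq(X, y, rcond=None)[0]
    return np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y - d * beta_star)

rng = np.random.default_rng(2)
X = rng.normal(size=(50, 3))
y = X @ np.array([2.0, -1.0, 0.5]) + rng.normal(size=50)

b_ridge = liu_two_parameter(X, y, k=0.5, d=0.0)    # d = 0 recovers plain ridge
b_liu = liu_two_parameter(X, y, k=0.5, d=-0.3)     # negative d is permitted
```

Here k tames the condition number of X'X + kI while d adjusts the fit, matching the roles described in the quoted passage.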