2016
DOI: 10.18187/pjsor.v12i2.1188

A Comparative Study on the Performance of New Ridge Estimators

Abstract: Least squares estimators in multiple linear regression become unstable under multicollinearity, as they produce large variances for the estimated regression coefficients. Hoerl and Kennard (1970) developed ridge estimators for cases with a high degree of collinearity. In ridge estimation, the estimation of the ridge parameter (k) is vital. In this article, new methods for estimating the ridge parameter are introduced. The performance of the proposed estimators is investigated through mean squared error (MSE). Monte-Carlo simulation…
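As context for the abstract, the following is a minimal Monte-Carlo sketch (not from the paper; the design matrix, true coefficients, noise level, and replication count are illustrative assumptions) of how a ridge estimator with a given ridge parameter k can be compared by mean squared error:

```python
import numpy as np

def ridge_mse_monte_carlo(X, beta, k, sigma=1.0, reps=1000, seed=0):
    """Estimate the MSE of the ridge estimator beta(k) = (X'X + kI)^{-1} X'y
    around the true coefficients via Monte-Carlo simulation (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    total = 0.0
    for _ in range(reps):
        y = X @ beta + sigma * rng.standard_normal(n)                # simulate a response
        beta_k = np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)   # ridge solution
        total += np.sum((beta_k - beta) ** 2)                        # squared estimation error
    return total / reps
```

Setting k = 0 recovers the least squares estimator, so the same routine also yields the OLS benchmark against which candidate ridge parameters are judged.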

Cited by 13 publications (4 citation statements)
References: 15 publications
“…The results of the simulation are presented in Tables 1–12; the results were analyzed by splitting them into two sections as follows:…”
Section: The Discussion of Simulation Results (mentioning)
confidence: 99%
“…Further, for large n, say n ≥ 50, low error variance σ² (≤ 5), and low and moderate degrees of correlation (ρ), all the estimators considered here have produced unstable estimates for the ridge parameter. The estimator k₁ due to Hoerl et al. [18], and k₇, k₈, and k₉ due to Satish and Vidya [20,21,24], behaved better and yielded more stable estimates of the regression coefficients; but it is observed that these estimators slightly over-shrink the estimates of the regression coefficients compared to the other estimators k₁₀, k₁₁, and k₁₂ due to Dorugade and Kashid [15] and Satish and Vidya [24], and the proposed estimators k₁₃ and k₁₄.…”
Section: Discussion (mentioning)
confidence: 99%
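The estimator k₁ in the excerpt above is attributed to Hoerl et al. [18]; assuming it is the familiar Hoerl–Kennard–Baldwin form k = p·σ̂²/(β̂′β̂) (an assumption, since the excerpt does not spell the formula out), a minimal sketch of its computation and of the resulting ridge fit:

```python
import numpy as np

def hkb_ridge_parameter(X, y):
    """Hoerl-Kennard-Baldwin style ridge parameter k = p * sigma2_hat / (b_ols' b_ols)
    (assumed form of the k_1 estimator referenced above)."""
    n, p = X.shape
    b_ols, *_ = np.linalg.lstsq(X, y, rcond=None)     # OLS coefficients
    resid = y - X @ b_ols
    sigma2_hat = resid @ resid / (n - p)               # error-variance estimate
    return p * sigma2_hat / (b_ols @ b_ols)

def ridge_fit(X, y, k):
    """Ridge coefficients beta(k) = (X'X + kI)^{-1} X'y."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)
```

Larger values of k shrink the coefficients more strongly, which is the over-shrinkage trade-off the excerpt describes.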
“…, is the second largest eigenvalue of the X′X matrix. It is observed that the estimators defined in equations (7) to (12) are verified under a very high degree (ρ ≥ 0.9) of multicollinearity between the predictors, whereas the estimators due to Satish and Vidya [20,21,24] are investigated under various degrees of multicollinearity, viz., low, moderate, and high. Also, Satish and Vidya [20,24] have considered different error distributions, viz., normal and non-normal (t-distribution with 5 d.f.)”
Section: Some Well-known Ridge Estimators (mentioning)
confidence: 99%
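The excerpt above refers to simulations at controlled degrees of multicollinearity with normal versus t(5) errors; a minimal sketch of the commonly used McDonald–Galarneau data-generating scheme follows (an assumption about the cited papers' exact setup, for illustration only):

```python
import numpy as np

def simulate_collinear_design(n, p, rho, error_df=None, seed=None):
    """Generate predictors with correlation governed by rho via
    x_ij = sqrt(1 - rho^2) * z_ij + rho * z_{i,p+1} (McDonald-Galarneau scheme),
    and errors from N(0,1) or a t-distribution (illustrative assumption)."""
    rng = np.random.default_rng(seed)
    z = rng.standard_normal((n, p + 1))
    X = np.sqrt(1.0 - rho**2) * z[:, :p] + rho * z[:, [p]]       # shared component induces collinearity
    eps = rng.standard_normal(n) if error_df is None else rng.standard_t(error_df, n)
    return X, eps

# Example: high multicollinearity (rho = 0.9) with t(5) errors
X, eps = simulate_collinear_design(n=50, p=4, rho=0.9, error_df=5, seed=1)
```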
“…Different techniques for estimating the ridge parameters have been developed in the literature. These include those proposed by Hoerl and Kennard (1970), McDonald and Galarneau (1975), Lawless and Wang (1976), Dempster et al. (1977), Gibbons (1981), Kibria (2003), Khalaf and Shukur (2005), Alkhamisi et al. (2006), Alkhamisi and Shukur (2008), Batach et al. (2008), Muniz and Kibria (2009), Dorugade and Kashid (2010), Mansson et al. (2010), Khalaf (2013), Ghadhan and Mohamed (2014), Kibria and Shipra (2016), Bhat (2016), Lukman and Ayinde (2017), and Fayose and Ayinde (2019). Algama (2018a) proposed methods of selecting biasing parameters in Generalized Linear Models (GLM), and Algama (2018b) proposed some modified versions of ridge parameter estimators for gamma models.…”
Section: Introduction (mentioning)
confidence: 99%