2019
DOI: 10.19113/sdufenbed.484275
A Comparison of Different Ridge Parameters under Both Multicollinearity and Heteroscedasticity

Abstract: One of the major problems in fitting an appropriate linear regression model is multicollinearity, which occurs when regressors are highly correlated. To overcome this problem, the ridge regression estimator, an alternative to the ordinary least squares (OLS) estimator, has been used. Heteroscedasticity, which violates the assumption of constant error variances, is another major problem in regression estimation. To solve this violation problem, weighted least squares estimation is used to fit a more robust …
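As a rough illustration of the idea described in the abstract, the sketch below combines a ridge penalty (for multicollinearity) with observation weights (for heteroscedasticity). It is a minimal, assumed implementation, not the paper's own estimators: the function name weighted_ridge, the toy data, and the fixed ridge parameter k are illustrative choices, whereas the paper compares several data-driven ridge parameters.

```python
import numpy as np

def weighted_ridge(X, y, k, weights=None):
    """Return beta_hat = (X'WX + kI)^{-1} X'Wy.

    With weights=None and k=0 this reduces to OLS; weights alone give
    weighted least squares (heteroscedasticity), k alone gives ordinary
    ridge regression (multicollinearity); both together combine the two.
    """
    n, p = X.shape
    W = np.eye(n) if weights is None else np.diag(weights)
    XtW = X.T @ W
    return np.linalg.solve(XtW @ X + k * np.eye(p), XtW @ y)

# Toy data: two nearly collinear regressors and non-constant error variance.
rng = np.random.default_rng(0)
x1 = rng.normal(size=100)
x2 = x1 + rng.normal(scale=0.01, size=100)   # near-perfect collinearity
sigma = 0.5 + np.abs(x1)                      # heteroscedastic errors
y = 1.0 + 2.0 * x1 - 1.0 * x2 + rng.normal(scale=sigma)

X = np.column_stack([np.ones(100), x1, x2])
beta_ols = weighted_ridge(X, y, k=0.0)                            # unstable under collinearity
beta_wr  = weighted_ridge(X, y, k=1.0, weights=1.0 / sigma**2)    # weighted ridge
print(beta_ols, beta_wr)
```

In practice the ridge parameter k would be chosen by one of the estimators the paper compares rather than fixed by hand; the hard-coded value here is only for demonstration.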

Cited by 3 publications (1 citation statement); references 31 publications (25 reference statements).
“…Ridge Regression decreases the standard errors when a degree of bias is added to the Regression Estimates. Several authors have addressed Ridge Regression as in [1–10].…”
Section: Introduction (mentioning, confidence: 99%)