2023
DOI: 10.1016/j.kjs.2023.02.013
Condition-index based new ridge regression estimator for linear regression model with multicollinearity

Cited by 14 publications (4 citation statements) · References 13 publications
“…Within the domain of scale-related variables, operational conditions and influent characteristics were interconnected because of biochemical treatment mechanisms. During the spatial modeling, ridge regression’s ability to address multicollinearity issues effectively suppressed the adverse effects of input features on predictions…”
Section: Results (mentioning confidence: 99%)
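The suppression effect described in this excerpt can be sketched numerically. The snippet below (hypothetical data; it does not reproduce the cited paper's condition-index based estimator) builds two nearly collinear predictors and compares the closed-form ridge estimator, (X'X + λI)⁻¹X'y, against ordinary least squares (λ = 0): ridge shrinks the coefficient vector and stabilizes it despite the ill-conditioned design.

```python
import numpy as np

# Hypothetical nearly-collinear design to illustrate ridge's stabilizing effect.
rng = np.random.default_rng(0)
n = 100
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.01, size=n)   # almost a copy of x1 -> multicollinearity
X = np.column_stack([x1, x2])
y = X @ np.array([1.0, 1.0]) + rng.normal(scale=0.1, size=n)

def ridge(X, y, lam):
    """Closed-form ridge estimator: (X'X + lam*I)^{-1} X'y."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

b_ols = ridge(X, y, 0.0)     # lam = 0 recovers OLS, unstable under collinearity
b_ridge = ridge(X, y, 1.0)   # penalized estimate with smaller coefficient norm
print("||b_ols||  =", np.linalg.norm(b_ols))
print("||b_ridge|| =", np.linalg.norm(b_ridge))
```

Because each singular direction of X is shrunk by σᵢ²/(σᵢ² + λ), the ridge coefficient norm is never larger than the OLS norm, which is exactly the "suppression" the excerpt refers to.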
“…45 During the spatial modeling, ridge regression's ability to address multicollinearity issues effectively suppressed the adverse effects of input features on predictions. 46 Conversely, for temporal modeling, the situation differed greatly. In this case, scale-related variables had a more normal impact, while quality-related variables carried greater weight compared to the spatial data set.…”
Section: Spatial and Temporal Modeling 3.2.1 Comparison on Combinations… (mentioning confidence: 99%)
“…By driving some coefficients to zero, LASSO regression can effectively identify the most important predictors and discard irrelevant ones. On the other hand, Ridge regression is applied to estimate regression parameters while avoiding multicollinearity among independent variables, 50 expressed as:

$$\hat{\beta}_{\text{ridge}} = \arg\min_{\beta}\left\{\|y - X\beta\|_2^2 + \lambda\|\beta\|_2^2\right\},$$

where $\lambda$ is the regularization weight.…”
Section: Methods (mentioning confidence: 99%)
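The contrast this excerpt draws — LASSO zeroing coefficients versus ridge merely shrinking them — is easiest to see under an orthonormal design, where both estimators have closed forms: ridge divides each OLS coefficient by (1 + λ), while LASSO (for the objective ½‖y − Xβ‖² + λ‖β‖₁) applies soft-thresholding. A minimal sketch, with an illustrative coefficient vector of my own choosing:

```python
import numpy as np

def ridge_orth(b_ols, lam):
    # Orthonormal-design ridge: uniform shrinkage, never exactly zero.
    return b_ols / (1.0 + lam)

def lasso_orth(b_ols, lam):
    # Orthonormal-design LASSO: soft-thresholding sign(b) * max(|b| - lam, 0),
    # which sets coefficients with |b| <= lam exactly to zero.
    return np.sign(b_ols) * np.maximum(np.abs(b_ols) - lam, 0.0)

b_ols = np.array([3.0, 0.4, -2.0, 0.1])   # hypothetical OLS coefficients
print(ridge_orth(b_ols, 1.0))   # every coefficient shrunk, all remain nonzero
print(lasso_orth(b_ols, 0.5))   # the two small coefficients become exactly zero
```

This is why LASSO acts as a variable selector while ridge is preferred purely for stabilizing correlated predictors, as the quoted passage notes.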
“…As λ increases, the intensity of regularization increases and the complexity of the model decreases. Through cross-validation, we can find the best λ value that minimizes the cross-validation error [23][24][25][26]. In this paper, we use five-fold cross-validation to get the best λ value under different datasets, as shown in Table 8.…”
Section: Modeling (mentioning confidence: 99%)
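The λ-selection procedure this excerpt describes can be sketched as follows. The snippet uses synthetic data and a small candidate grid of my own choosing (the paper's datasets and its Table 8 values are not reproduced): for each candidate λ, it computes the five-fold cross-validation mean squared error of a ridge fit and keeps the minimizer.

```python
import numpy as np

# Synthetic regression problem standing in for one of the paper's datasets.
rng = np.random.default_rng(1)
n, p = 100, 5
X = rng.normal(size=(n, p))
beta = np.array([2.0, 0.0, -1.0, 0.5, 0.0])
y = X @ beta + rng.normal(scale=0.5, size=n)

def ridge_fit(X, y, lam):
    """Closed-form ridge estimator: (X'X + lam*I)^{-1} X'y."""
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

def cv_error(X, y, lam, k=5):
    """Mean squared prediction error over k folds (here k = 5)."""
    idx = np.arange(len(y))
    errs = []
    for fold in np.array_split(idx, k):
        train = np.setdiff1d(idx, fold)          # fit on the other k-1 folds
        b = ridge_fit(X[train], y[train], lam)
        errs.append(np.mean((y[fold] - X[fold] @ b) ** 2))
    return float(np.mean(errs))

lams = [0.01, 0.1, 1.0, 10.0, 100.0]             # illustrative candidate grid
best = min(lams, key=lambda lam: cv_error(X, y, lam))
print("best lambda:", best)
```

As the quoted passage says, larger λ means stronger regularization and a simpler model; cross-validation balances that against fit by scoring each candidate on held-out folds.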