2021
DOI: 10.1109/access.2021.3125000
Meta-Optimization of Bias-Variance Trade-Off in Stochastic Model Learning

Cited by 12 publications (4 citation statements) | References 21 publications
“…A high-variance problem arises when a machine learning model is trained on a dataset containing irrelevant features or noise; the model fits the training data very well but produces incorrect results when applied to new data. When the variance of the model is high and the bias is low, the model suffers from the overfitting problem [21], [22]. Figure 3.0 illustrates the high-bias case described in the following.…”
Section: Bias-Variance Tradeoff
confidence: 89%
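As an illustrative aside (not drawn from the cited paper or from this citing work), the high-bias and high-variance regimes described above can be diagnosed by comparing training and validation error across models of increasing complexity. The synthetic data, polynomial models, and error metric below are assumptions made only for this sketch.

```python
# Illustrative sketch (not the cited method): diagnosing high bias vs. high
# variance by comparing training and validation error of polynomial models
# of increasing complexity on a synthetic noisy dataset.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X).ravel() + rng.normal(scale=0.3, size=200)  # noisy targets

X_tr, X_va, y_tr, y_va = train_test_split(X, y, test_size=0.5, random_state=0)

for degree in (1, 3, 15):  # low, moderate, high model complexity
    model = make_pipeline(PolynomialFeatures(degree), LinearRegression())
    model.fit(X_tr, y_tr)
    err_tr = mean_squared_error(y_tr, model.predict(X_tr))
    err_va = mean_squared_error(y_va, model.predict(X_va))
    # High bias (underfitting): both errors large and close together.
    # High variance (overfitting): training error small, validation error large.
    print(f"degree={degree:2d}  train MSE={err_tr:.3f}  val MSE={err_va:.3f}")
```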
“…In fact, in highway-env simulations, we observed cases where other blue cars could not be reconstructed as in the standard VAE, depending on the value of q_2. A framework that is adaptive or robust to this trade-off, such as meta-optimization [32] or ensemble learning with multiple combinations of hyperparameters [33], would be useful.…”
Section: Discussion For Future Work
confidence: 99%
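The ensemble option mentioned in this statement can be sketched generically as follows. This is not the framework of reference [33] or of the cited paper; the ridge regressors, penalty values, and averaging rule are assumptions chosen only to illustrate combining several hyperparameter settings instead of committing to one.

```python
# Illustrative sketch (assumed setup, not the cited method): an ensemble that
# averages predictors trained with several hyperparameter combinations, so no
# single bias-variance setting has to be fixed in advance.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 5))
y = X @ np.array([1.0, -2.0, 0.5, 0.0, 3.0]) + rng.normal(scale=0.5, size=300)

# Candidate hyperparameter combinations (here: ridge penalty strengths).
alphas = [0.01, 0.1, 1.0, 10.0]
members = [Ridge(alpha=a).fit(X, y) for a in alphas]

def ensemble_predict(X_new):
    """Average member predictions; a compromise across the candidate settings."""
    return np.mean([m.predict(X_new) for m in members], axis=0)

X_test = rng.normal(size=(5, 5))
print(ensemble_predict(X_test))
```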
“…Therefore, in order to avoid overfitting, we used complexity regularization to address the bias–variance trade-off. The interested reader is referred to references [41,42,43] for a detailed explanation of the bias–variance trade-off and potential model selection options (e.g., hold-out, k-fold cross-validation, structural risk minimization, complexity regularization, and information criteria).…”
Section: Proposed Approach, Materials and Methods
confidence: 99%
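As a hedged illustration of two of the model-selection options listed above, the sketch below selects a complexity-regularization strength by k-fold cross-validation. The ridge penalty, fold count, and synthetic data are assumptions, not the citing paper's actual procedure.

```python
# Illustrative sketch (assumptions: ridge penalty as the complexity regularizer,
# 5-fold cross-validation as the selection criterion).
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import KFold, cross_val_score

rng = np.random.default_rng(2)
X = rng.normal(size=(200, 10))
y = X[:, 0] - 2.0 * X[:, 1] + rng.normal(scale=0.5, size=200)

cv = KFold(n_splits=5, shuffle=True, random_state=0)
best_alpha, best_score = None, -np.inf
for alpha in [0.001, 0.01, 0.1, 1.0, 10.0, 100.0]:
    # Mean held-out R^2 across folds; larger penalties trade variance for bias.
    score = cross_val_score(Ridge(alpha=alpha), X, y, cv=cv).mean()
    if score > best_score:
        best_alpha, best_score = alpha, score
print(f"selected alpha={best_alpha}  mean CV R^2={best_score:.3f}")
```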