2021
DOI: 10.1007/s42081-021-00121-3

Information criteria and cross validation for Bayesian inference in regular and singular cases

Abstract: In data science, an unknown information source is estimated by a predictive distribution defined from a statistical model and a prior. In an older Bayesian framework, it was explained that the Bayesian predictive distribution should be the best on the assumption that the statistical model is believed to be correct and the prior is given by a subjective belief in a small world. However, such a restricted treatment of Bayesian inference cannot be applied to highly complicated statistical models and learning machine…

Cited by 10 publications (10 citation statements)
References 35 publications

Citation statements:
“…A common model selection approach for ecological models is to use information criteria, notably the Akaike information criterion (AIC) and its variants (including the method of Bengtsson and Cavanaugh (2006) for SSMs, although this is relatively complex and computationally demanding), within classical analyses. The Watanabe-Akaike information criterion (WAIC; Watanabe, 2013, 2021; Watanabe & Opper, 2010) is the preferred criterion for Bayesian analyses. Alternatively, within the Bayesian framework, posterior model probabilities (and the related idea of evidence; Finke et al., 2019) may be calculated to provide a quantitative comparison of competing models and permit model-averaged estimates…”
Section: Model Selection
confidence: 99%
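For readers who want to see what the WAIC computation amounts to in practice, a minimal sketch is given below. It assumes posterior draws are already available and that log_lik is a hypothetical (S × n) array holding log p(y_i | θ_s) for S draws and n observations; the returned value is on the deviance scale, −2 × (lppd − penalty), which differs from Watanabe's per-observation loss only by a rescaling.

```python
import numpy as np
from scipy.special import logsumexp

def waic(log_lik):
    """Deviance-scale WAIC from an (S, n) matrix of pointwise log-likelihoods.

    log_lik[s, i] = log p(y_i | theta_s) for S posterior draws and n
    observations (a hypothetical input produced by any MCMC sampler).
    """
    S = log_lik.shape[0]
    # log pointwise predictive density: log of the posterior-mean likelihood
    lppd = np.sum(logsumexp(log_lik, axis=0) - np.log(S))
    # penalty: posterior variance of the pointwise log-likelihood
    p_waic = np.sum(np.var(log_lik, axis=0, ddof=1))
    return -2.0 * (lppd - p_waic)
```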
“…Eq. (4.57) indicates the proportionality between the response of the learning error and the variance of the estimated parameter at equilibrium, which can be regarded as a fluctuation-response relationship described by GAMP [20].…”
Section: Expression of GDF by GAMP
confidence: 99%
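For orientation, a sketch of a fluctuation-response relation of this kind, assuming a Gaussian observation model with noise variance σ² and model output f_i(θ) at the i-th design point (this notation is an assumption, not taken from [20]): differentiating the posterior mean prediction with respect to an observation yields a posterior covariance, so the generalized degrees of freedom (GDF), a sum of responses, equal a sum of posterior variances, i.e. fluctuations:

\frac{\partial}{\partial y_j}\,\mathbb{E}_{\mathrm{post}}[f_i(\theta)]
  = \frac{1}{\sigma^2}\,\mathrm{Cov}_{\mathrm{post}}\big(f_i(\theta),\, f_j(\theta)\big),
\qquad
\mathrm{GDF} = \sum_{i=1}^{n} \frac{\partial\,\mathbb{E}_{\mathrm{post}}[f_i(\theta)]}{\partial y_i}
  = \frac{1}{\sigma^2}\sum_{i=1}^{n} \mathrm{Var}_{\mathrm{post}}\big(f_i(\theta)\big).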
“…When a leverage sample point is contained in the sample, the importance sampling cross validation of Eq. (20) becomes unstable [13,33], and the difference between LOOCV and WAIC becomes larger [52,55]. An improved numerical calculation of the cross validation was proposed in [39].…”
Section: Renormalized Posterior Distribution
confidence: 99%
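A minimal sketch of the importance-sampling cross validation referred to here, reusing the same hypothetical (S × n) log-likelihood matrix as in the WAIC sketch above: the leave-one-out predictive density is estimated as a harmonic mean of the per-draw likelihoods, and the inverse-likelihood weights are exactly what becomes heavy-tailed at leverage points (Pareto-smoothed importance sampling is one widely used remedy).

```python
import numpy as np
from scipy.special import logsumexp

def is_loocv(log_lik):
    """Importance-sampling LOOCV (harmonic-mean form) from an (S, n) matrix
    of pointwise log-likelihoods log p(y_i | theta_s) (hypothetical input).

    Returns the estimated leave-one-out log predictive density for each
    observation; the negative mean of this vector is the CV loss.
    """
    S = log_lik.shape[0]
    # log of the posterior-sample mean of 1 / p(y_i | theta).
    # The implicit weights 1 / p(y_i | theta_s) can have very heavy tails
    # at leverage points, which is the instability discussed above.
    log_mean_inv_lik = logsumexp(-log_lik, axis=0) - np.log(S)
    return -log_mean_inv_lik
```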
“…For example, in regression problems that estimate the conditional probability distribution q(y|x) of an output Y given an input X, the input samples {X_i} may be dependent or fixed while the {Y_i} are conditionally independent. In such cases, LOOCV does not estimate the conditional generalization loss, whereas WAIC does [52,55].…”
Section: Renormalized Posterior Distribution
confidence: 99%
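To make the distinction concrete, one way to write the generalization loss conditioned on fixed inputs is sketched below, with q(y|x) the true conditional density and p^*(y|x) the Bayesian predictive distribution (the notation is assumed here rather than quoted from [52,55]):

G_n(x_1,\ldots,x_n) = -\frac{1}{n}\sum_{i=1}^{n}\int q(y \mid x_i)\,\log p^{*}(y \mid x_i)\,dy.

Loosely speaking, WAIC is computed pointwise at the same fixed design points x_i and can therefore track this conditional quantity, whereas leave-one-out cross validation scores each x_i under a posterior that has never seen that design point, which corresponds to a different target when the inputs are fixed or dependent.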