2016
DOI: 10.1111/rssb.12182
A Frequentist Approach to Computer Model Calibration

Abstract: This paper considers the computer model calibration problem and provides a general frequentist solution. Under the proposed framework, the data model is semi-parametric with a nonparametric discrepancy function which accounts for any discrepancy between the physical reality and the computer model. In an attempt to solve a fundamentally important (but often ignored) identifiability issue between the computer model parameters and the discrepancy function, this paper proposes a new and identifiable parametrizatio…
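The semi-parametric data model described in the abstract is conventionally written in the Kennedy–O'Hagan form. The following is a sketch of that standard setting, not copied from the paper; the symbols $f$, $\theta$, $\delta$ are illustrative:

```latex
% Standard semi-parametric calibration setting (sketch; symbols illustrative):
% physical observations y_i at inputs x_i, computer model f, calibration
% parameter \theta, nonparametric discrepancy \delta, i.i.d. noise \varepsilon_i.
y_i = f(x_i, \theta) + \delta(x_i) + \varepsilon_i, \qquad i = 1, \dots, n.
% The identifiability issue the abstract refers to: for any \theta', taking
%   \delta'(x) = f(x, \theta) + \delta(x) - f(x, \theta')
% leaves the mean function unchanged, so (\theta, \delta) cannot be recovered
% from the data without further constraints on \delta.
```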

Cited by 82 publications (102 citation statements); references 34 publications. Citing publications span 2017–2024.

Citation statements, ordered by relevance:
“…In contrast to ad hoc, frequentist, or otherwise non-Bayesian calibration frameworks (14), the theory underlying Bayesian approaches provides an axiomatic basis for deciding how to quantify evidence, avoiding arbitrary decisions about the relative weight to be placed on different data sources or the use of heuristics to select well-fitting parameter sets. In contexts where priors, likelihood, and model are all correctly specified, the Bayesian approach can provide a theoretically optimal summary of the evidence (5, 6), allowing a decision-maker to maximize expected utility when paired with a utility function representing his/her preferences for different outcomes (7).…”
Section: Theoretical Framework
confidence: 99%
“…These conditions are considerably looser than those considered in Tuo and Wu () and Wong et al. (), but consistency is a weaker claim than the efficiency claims in those works.…”
Section: Consistency
confidence: 83%
“…(), Tuo and Wu (), Plumlee () and Wong et al. () of $l(g,h)=\int_{\mathbb{X}}\{g(x)-h(x)\}^2\,\mathrm{d}\mu(x)$, where $\mu$ is some measure over $\mathbb{X}$. This is a strictly proper scoring metric that weights the relative inputs by some measure $\mu$.…”
Section: Setting and Notation
confidence: 99%
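The squared-L2 loss quoted above can be estimated numerically. The sketch below (not from the paper; function names are illustrative) approximates $l(g,h)=\int_{\mathbb{X}}\{g(x)-h(x)\}^2\,\mathrm{d}\mu(x)$ by Monte Carlo, taking $\mu$ to be the uniform measure on $\mathbb{X}=[0,1]$:

```python
import numpy as np

def l2_loss(g, h, n_samples=100_000, rng=None):
    """Monte Carlo estimate of l(g, h) = \\int {g(x) - h(x)}^2 dmu(x),
    with mu taken as the uniform measure on [0, 1]."""
    rng = np.random.default_rng(rng)
    x = rng.uniform(0.0, 1.0, n_samples)  # draws from mu
    return np.mean((g(x) - h(x)) ** 2)    # sample mean of squared gap

# Example: g(x) = x, h(x) = 0, so l(g, h) = \int_0^1 x^2 dx = 1/3.
est = l2_loss(lambda x: x, lambda x: np.zeros_like(x), rng=0)
```

In the calibration context, $g$ would play the role of the physical mean response and $h$ the computer model at a candidate parameter; the $\mu$-weighted squared gap is what the cited works minimize to define the calibration target.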
See 1 more Smart Citation
“…As such, we assume some level of familiarity with tools such as noise modeling, parametric bootstrap analyses, and synthetic dataset generation. [20][21][22][23][24] In a few instances we review key ideas, but interested readers should consult the indicated references for detailed information.…”
confidence: 99%