2015
DOI: 10.1002/kin.20906

On the Statistical Calibration of Physical Models

Abstract: We introduce a novel statistical calibration framework for physical models, relying on probabilistic embedding of model discrepancy error within the model. For clarity of illustration, we take the measurement errors out of consideration, calibrating a chemical model of interest with respect to a more detailed model, considered as "truth" for the present purpose. We employ Bayesian statistical methods for such model-to-model calibration and demonstrate their capabilities on simple synthetic models, leading to a…
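The abstract's central idea, embedding the discrepancy term inside the model rather than adding it to the model outputs and then calibrating a simple model against a detailed model treated as "truth", can be illustrated with a small sketch. The toy models, parameter names, and grid-based posterior below are assumptions made for illustration, not the paper's actual formulation or test cases.

```python
# Minimal sketch (hypothetical models, names, and priors) of model-to-model
# calibration with the discrepancy embedded in a model parameter:
#   theta = theta0 + sigma * xi,  xi ~ N(0, 1)
# rather than as an additive error on the model output.
import numpy as np

x = np.linspace(0.1, 2.0, 20)                  # operating conditions
y_truth = 1.0 / (1.0 + x**2)                   # detailed model, treated as "truth"

def simple_model(x, theta):
    """Simpler model to be calibrated against the detailed one."""
    return np.exp(-theta * x)

def predictive_moments(theta0, sigma, n_xi=200):
    """Mean/std of the pushed-forward prediction over the embedded germ xi."""
    xi = np.random.default_rng(0).standard_normal(n_xi)   # common random numbers
    preds = np.array([simple_model(x, theta0 + sigma * s) for s in xi])
    return preds.mean(axis=0), preds.std(axis=0) + 1e-6

def log_like(theta0, sigma):
    """Independent-Gaussian approximation of the data likelihood."""
    mu, sd = predictive_moments(theta0, sigma)
    return -0.5 * np.sum(((y_truth - mu) / sd) ** 2 + 2.0 * np.log(sd))

# Brute-force posterior on a grid (flat priors over the grid), in place of MCMC
theta0_grid = np.linspace(0.2, 2.0, 40)
sigma_grid = np.linspace(0.01, 0.8, 40)
logpost = np.array([[log_like(t, s) for s in sigma_grid] for t in theta0_grid])
i, j = np.unravel_index(logpost.argmax(), logpost.shape)
print(f"MAP estimate: theta0 ≈ {theta0_grid[i]:.2f}, sigma ≈ {sigma_grid[j]:.2f}")
```

A clearly nonzero estimate of sigma indicates that the spread in the embedded parameter is absorbing the structural mismatch between the two models, which is the kind of behavior the paper studies.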

Cited by 107 publications (138 citation statements). References 79 publications (119 reference statements).

“…For optimal utilization of the method, all relevant sources of uncertainties for each of the data types should be considered in the interpretation of the data. These include both “parametric uncertainties” (quantitative uncertainties in the model description due to uncertainties in the model parameters) and “structural uncertainties” (qualitative uncertainties in the model description due to assumptions/limitations of the model itself, also termed “model inadequacy” or “model discrepancy error”).…”
Section: Multiscale Informatics Approach (mentioning; confidence: 99%)
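The excerpt's distinction between parametric and structural uncertainty can be made concrete with a short, hypothetical sketch: however tightly the parameters of a structurally inadequate model are constrained, a systematic residual remains.

```python
# Hypothetical illustration of "parametric" vs. "structural" uncertainty:
# fit a straight line (structurally inadequate) to data from a quadratic truth.
import numpy as np

x = np.linspace(0.0, 1.0, 50)
y_truth = 1.0 + 0.5 * x + 2.0 * x**2          # the "true" process (quadratic)

# Least-squares fit of the inadequate linear model y = a + b*x
A = np.vstack([np.ones_like(x), x]).T
coef, *_ = np.linalg.lstsq(A, y_truth, rcond=None)
residual = y_truth - A @ coef

# More (or cleaner) data shrinks the parametric uncertainty in (a, b),
# but the structured residual, i.e. the model inadequacy, does not vanish.
print("fitted (a, b):", np.round(coef, 3))
print("max |structural residual|:", np.round(np.abs(residual).max(), 3))
```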
“…In this context, the parameters are $\boldsymbol{\theta} = \{C_{\mu\varepsilon}, C_{\varepsilon}\}$, the data $\mathcal{D}$ is represented by $z = \{f_{k,i} \mid i = 1, \ldots, N_t\}$, while $\boldsymbol{x} = \{f\,P, f\,D\}$ are independent variables. In general, the discrepancy between model predictions and the data can be formalized as
$$z = m(\boldsymbol{x}; \boldsymbol{\theta}) + \varepsilon_z.$$
Here, the model is $m(\boldsymbol{x}; \boldsymbol{\theta}) = C_{\mu\varepsilon}\, f\,P - C_{\varepsilon}\, f\,D$ and $\varepsilon_z$ is the discrepancy between the data and the model, that is, a consequence of the model only being an approximation of the true process and any imperfections in the measurement process.…”
Section: Model Calibration (mentioning; confidence: 99%)
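A compact sketch of the calibration setup in this excerpt is given below. The functional form m(x; θ) = C_{με} f P - C_ε f D follows the equation as reconstructed above (the minus sign is an assumption, since the operator was lost in extraction), and the numerical values of f P, f D, and the data are invented placeholders.

```python
# Sketch of the excerpt's setup (placeholder numbers throughout):
#   z = m(x; theta) + eps_z,   m(x; theta) = C_mu_eps * fP - C_eps * fD
import numpy as np

fP = np.array([1.2, 0.9, 1.5, 1.1])        # independent variable f P (placeholder)
fD = np.array([0.8, 1.0, 0.7, 0.9])        # independent variable f D (placeholder)
z  = np.array([0.55, 0.30, 0.85, 0.48])    # observed data f_{k,i}    (placeholder)

def m(fP, fD, C_mu_eps, C_eps):
    """Linear-in-parameters model from the excerpt."""
    return C_mu_eps * fP - C_eps * fD

theta = (0.9, 0.7)                          # candidate values of (C_mu_eps, C_eps)
eps_z = z - m(fP, fD, *theta)               # discrepancy between data and model
print("model prediction:", m(fP, fD, *theta))
print("discrepancy eps_z:", eps_z)
```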
“…The likelihood, expressed as $L_{\mathcal{D}}(\boldsymbol{\alpha}_1, \boldsymbol{\alpha}_2) = p(z \mid \boldsymbol{\alpha}_1, \boldsymbol{\alpha}_2, \boldsymbol{x}_a)$, is the multivariate density for $z = \{f_{k,i} \mid i = 1, \ldots, N_t\}$. Generally, this multivariate density has been shown to be degenerate. Instead, we approximate this density with a product of marginal densities corresponding to each data point,
$$L_{\mathcal{D}}(\boldsymbol{\alpha}_1, \boldsymbol{\alpha}_2) = \prod_{i=1}^{N_t} p(f_{k,i} \mid \boldsymbol{\alpha}_1, \boldsymbol{\alpha}_2, \boldsymbol{x}_a).$$
Given that the germs $\xi_1$ and $\xi_2$ in Equation are normal RVs and that the model, in Equation, is linear, the marginal densities $p(f_{k,i} \mid \boldsymbol{\alpha}_1, \boldsymbol{\alpha}_2, \boldsymbol{x}_a)$ are normal, with mean and variance given by
$$\mu_f = \alpha_{10}\, f\,P_a - \alpha_{11}\, f\,D_a, \qquad \sigma_f^2 = \left(\alpha_{20}\, f\,P_a - \alpha_{21}\, f\,D_a\right)^2 + \left(\alpha_{22}\, f\,D_a\right)^2.$$ …”
Section: Model Calibration (mentioning; confidence: 99%)
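The product-of-marginals likelihood in this excerpt can be sketched directly: each data point contributes an independent normal density whose mean and variance follow the formulas above. The α values and data below are placeholders, and the minus signs in μ_f and σ_f² are taken from the reconstruction above (an assumption, since the operators were lost in extraction).

```python
# Sketch of the product-of-marginals log-likelihood from the excerpt
# (placeholder data and coefficients; formulas as reconstructed above):
#   mu_f      = a10*fPa - a11*fDa
#   sigma_f^2 = (a20*fPa - a21*fDa)**2 + (a22*fDa)**2
import numpy as np

fPa = np.array([1.2, 0.9, 1.5, 1.1])       # f P at conditions x_a (placeholder)
fDa = np.array([0.8, 1.0, 0.7, 0.9])       # f D at conditions x_a (placeholder)
fk  = np.array([0.55, 0.30, 0.85, 0.48])   # data f_{k,i}          (placeholder)

def log_likelihood(alpha1, alpha2):
    a10, a11 = alpha1                       # mean coefficients
    a20, a21, a22 = alpha2                  # coefficients of the germs xi_1, xi_2
    mu = a10 * fPa - a11 * fDa
    var = (a20 * fPa - a21 * fDa) ** 2 + (a22 * fDa) ** 2
    # Product of independent normal marginals, one per data point
    return np.sum(-0.5 * np.log(2.0 * np.pi * var) - 0.5 * (fk - mu) ** 2 / var)

print(log_likelihood(alpha1=(0.9, 0.7), alpha2=(0.1, 0.05, 0.08)))
```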
“…Often, the impact of only one uncertainty source is studied, but making credible predictions requires the greater effort of aggregating multiple sources. Previously at Sandia, many pieces of the full rollup problem have been considered independently: the rollup of information from multiple validation points to the prediction conditions of interest [7], embedding model discrepancy into model parameter uncertainties during calibration to make better extrapolative predictions [8], and calibration of model parameters from multiple levels of model complexity using Bayesian networks [9]. Other methods of integrating uncertainty from different sources (i.e.…”
Section: VVUQ Workflow Applied to Application (mentioning; confidence: 99%)