2014
DOI: 10.1073/pnas.1410183111

Testing for ontological errors in probabilistic forecasting models of natural systems

Abstract: Probabilistic forecasting models describe the aleatory variability of natural systems as well as our epistemic uncertainty about how the systems work. Testing a model against observations exposes ontological errors in the representation of a system and its uncertainties. We clarify several conceptual issues regarding the testing of probabilistic forecasting models for ontological errors: the ambiguity of the aleatory/epistemic dichotomy, the quantification of uncertainties as degrees of belief, the interplay b…

Cited by 80 publications (90 citation statements)
References 64 publications
“…While in some cases this may be more computationally efficient, it may not be strictly correct according to the usual definitions of those terms. Epistemic uncertainty, as defined by, for example, Marzocchi and Jordan (2014), is due to our lack of knowledge of the system, while aleatory uncertainty (or variability) is due to the intrinsic randomness of the system. In theory, epistemic uncertainty can be reduced by more knowledge about the system, but aleatory uncertainty cannot.…”
Section: Burbidge et al.: Tsunami Uncertainty
confidence: 99%
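The reducible/irreducible distinction drawn in this excerpt can be made concrete with a minimal sketch. The example below is purely illustrative (the parameter values and the normal model are assumptions, not taken from the paper): we treat the unknown mean of a process as epistemic uncertainty, which a conjugate Bayesian update shrinks as data accumulate, while the fixed observation spread plays the role of aleatory variability and does not shrink.

```python
import math
import random

random.seed(0)

# Aleatory variability: intrinsic spread of the process (fixed, irreducible).
ALEATORY_SD = 2.0

# Epistemic uncertainty: prior belief about the unknown mean of the process.
prior_mean, prior_sd = 0.0, 5.0

true_mean = 1.5  # hypothetical "true" system parameter

def update(prior_mean, prior_sd, observations, noise_sd):
    """Conjugate normal-normal update: posterior over the unknown mean."""
    n = len(observations)
    prior_prec = 1.0 / prior_sd**2
    like_prec = n / noise_sd**2
    post_prec = prior_prec + like_prec
    post_mean = (prior_prec * prior_mean
                 + like_prec * (sum(observations) / n)) / post_prec
    return post_mean, math.sqrt(1.0 / post_prec)

for n in (1, 10, 100):
    data = [random.gauss(true_mean, ALEATORY_SD) for _ in range(n)]
    m, s = update(prior_mean, prior_sd, data, ALEATORY_SD)
    # The posterior (epistemic) sd shrinks roughly as 1/sqrt(n);
    # the aleatory sd is a property of the system and stays at 2.0.
    print(f"n={n:3d}  epistemic sd = {s:.3f}  aleatory sd = {ALEATORY_SD}")
```

More data sharpens our knowledge of the system (the posterior narrows), but the scatter of individual outcomes remains, which is the sense in which only epistemic uncertainty is reducible.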
“…Indeed, this discussion is deeply rooted in the intrinsic meaning of probability (frequency versus degree of belief) and, more importantly, in the possibility of validating a probabilistic assessment such as the outcome of probabilistic seismic-hazard analysis (PSHA). Marzocchi and Jordan (2014) suggest that a clear and univocal taxonomy of uncertainties is not only a practical convenience but of primary importance for meaningfully validating any probabilistic assessment and, consequently, for keeping PSHA in a scientific domain (see Marzocchi and Jordan, 2014, for a discussion of commonalities and differences with the traditional view of PSHA practitioners, as exemplified in SSHAC, 1997). In particular, Marzocchi and Jordan (2014) show that aleatory variability and epistemic uncertainty can be separated only in the framework of a well-defined experimental concept.…”
Section: Introduction
confidence: 99%
“…1a,b), testing an ensemble forecast requires computing the forecast as the weighted sum of distribution functions, based on the law of total probability. As pointed out by Marzocchi and Jordan (2014), the result will generally differ from that based only on the mean hazard curve. When the abscissa of the aggregated hazard curve is expressed in return periods (e.g., Fig.…
Section: Beyond the Mean Forecast
confidence: 89%
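The law-of-total-probability point in this excerpt can be sketched with a toy two-model ensemble. All numbers here are hypothetical (not from the paper), and the Poisson exceedance model is an assumption for illustration: the exceedance probability of the weighted mixture differs from the probability computed by plugging in the weighted-mean rate, a Jensen's-inequality effect of the exponential.

```python
import math

# Hypothetical two-model ensemble: annual exceedance rates and weights
# (illustrative numbers only).
rates = [0.01, 0.10]
weights = [0.5, 0.5]
T = 50.0  # exposure time in years

# Law of total probability: weighted sum of each model's Poisson
# probability of at least one exceedance in T years.
p_mixture = sum(w * (1.0 - math.exp(-r * T)) for w, r in zip(weights, rates))

# Naive alternative: a single Poisson model at the weighted-mean rate.
mean_rate = sum(w * r for w, r in zip(weights, rates))
p_mean_rate = 1.0 - math.exp(-mean_rate * T)

print(f"mixture forecast   P = {p_mixture:.3f}")    # ~0.693
print(f"mean-rate forecast P = {p_mean_rate:.3f}")  # ~0.936
```

Because the exceedance probability is a nonlinear function of the rate, the two answers disagree, which is why testing an ensemble forecast must aggregate the full distribution functions rather than a single summary curve.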