2013
DOI: 10.1137/130907550

Using Proper Divergence Functions to Evaluate Climate Models

Abstract: It has been argued persuasively that, in order to evaluate climate models, the probability distributions of model output need to be compared to the corresponding empirical distributions of observed data. Distance measures between probability distributions, also called divergence functions, can be used for this purpose. We contend that divergence functions ought to be proper, in the sense that acting on modelers' true beliefs is an optimal strategy. The score divergences introduced in this paper deriv…

Cited by 66 publications (66 citation statements); references 28 publications.
“…For example, Candille and Talagrand (2008) use the quadratic divergence, (f − g)², in the case of forecasting a binary event (also Santos and Ghelli, 2012), Pappenberger et al. (2009) use the Kullback–Leibler divergence (or relative entropy), ∫ g log(g/f), and Friederichs and Thorarinsdottir (2012) propose the integrated quadratic distance, ∫ (f − g)². Thorarinsdottir et al. (2013) list several other divergences, including the sub-class of ‘score divergences’ that are formed from proper scoring rules, s, in the following way: d(f, g) = E_y{s(f, y)} − E_y{s(g, y)}, where y ∼ g. Score divergences thus differ from expected (proper) scoring rules by subtracting an amount, E_y{s(g, y)}, which is independent of the forecast.…”
Section: Other Approaches to Observation Error
Citation type: mentioning (confidence: 99%)
“…Other authors measure the difference between f and g with a divergence, that is, a function d(f, g) for which d(g, g) = 0 and d(f, g) ≥ 0 for all f and g. For example, Candille and Talagrand (2008) use the quadratic divergence, (f − g)², in the case of forecasting a binary event (also Santos and Ghelli, 2012), Pappenberger et al. (2009) use the Kullback–Leibler divergence (or relative entropy), ∫ g log(g/f), and Friederichs and Thorarinsdottir (2012) propose the integrated quadratic distance, ∫ (f − g)². Thorarinsdottir et al. (2013) list several other divergences, including the sub-class of ‘score divergences’ that are formed from proper scoring rules, s, in the following way:…”
Section: Probabilistic Observations
Citation type: mentioning (confidence: 99%)
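
To make the quoted construction concrete, here is a minimal sketch (not taken from any of the cited papers; the Bernoulli setup and all function names are illustrative assumptions). It checks that plugging the Brier score s(f, y) = (f − y)² for a binary event into d(f, g) = E_y{s(f, y)} − E_y{s(g, y)} recovers the quadratic divergence (f − g)² of Candille and Talagrand (2008):

```python
def brier_score(f, y):
    """Brier score s(f, y) = (f - y)^2 for forecast probability f and outcome y in {0, 1}."""
    return (f - y) ** 2

def expected_score(f, g):
    """E_y{s(f, y)} when y ~ Bernoulli(g), i.e. the event occurs with probability g."""
    return g * brier_score(f, 1) + (1 - g) * brier_score(f, 0)

def score_divergence(f, g):
    """Score divergence d(f, g) = E_y{s(f, y)} - E_y{s(g, y)}."""
    return expected_score(f, g) - expected_score(g, g)

f, g = 0.7, 0.4
print(score_divergence(f, g))  # ~0.09
print((f - g) ** 2)            # 0.09: matches the quadratic divergence exactly
```

Because the Brier score is strictly proper, d(f, g) ≥ 0 with equality only at f = g.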
“…The divergence d(P, Q) = S(P, Q) − S(Q, Q) is the non-negative difference between the expected score and the entropy. Note that propriety of the score function implies propriety of the divergence (Thorarinsdottir et al. 2014). d(P, Q) represents a measure of similarity between the probabilistic forecast P and the distribution Q, where smaller values indicate better correspondence.…”
Section: Proper Score Functions
Citation type: mentioning (confidence: 99%)
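
A worked instance of the identity above (standard material, not a quotation from the citing paper; the discrete probability vectors are made-up numbers): with the negatively oriented logarithmic score S(P, y) = −log p(y), the divergence d(P, Q) = S(P, Q) − S(Q, Q) reduces to the Kullback–Leibler divergence, the relative-entropy choice quoted earlier.

```python
import numpy as np

# Hypothetical pmfs over three outcomes: forecast distribution p, data-generating q.
p = np.array([0.2, 0.5, 0.3])
q = np.array([0.1, 0.6, 0.3])

def expected_log_score(p, q):
    """S(P, Q) = E_{y~q}{-log p(y)}, the expected logarithmic score."""
    return float(-np.sum(q * np.log(p)))

# d(P, Q) = S(P, Q) - S(Q, Q): expected score minus the entropy of Q.
d = expected_log_score(p, q) - expected_log_score(q, q)

print(d)                                 # score divergence under the log score
print(float(np.sum(q * np.log(q / p))))  # Kullback-Leibler divergence: same value
```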
“…We compare two marginal distributions F and G using the integrated quadratic distance (IQD; Thorarinsdottir et al, 2013),…”
Section: Evaluation Methods
Citation type: mentioning (confidence: 99%)
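
A minimal numerical sketch of the IQD (the grid, the two Gaussian distributions, and the helper name iqd are illustrative assumptions; the definition on cumulative distribution functions follows Thorarinsdottir et al., 2013):

```python
import numpy as np
from scipy import stats

def iqd(F, G, grid):
    """Integrated quadratic distance IQD(F, G) = integral of (F(x) - G(x))^2 dx,
    approximated on a finite grid by the trapezoidal rule."""
    vals = (F(grid) - G(grid)) ** 2
    return float(np.sum(0.5 * (vals[:-1] + vals[1:]) * np.diff(grid)))

# Hypothetical comparison: a "model" climate versus an "observed" climate, both Gaussian.
grid = np.linspace(-10.0, 10.0, 2001)
model_cdf = stats.norm(loc=0.5, scale=1.2).cdf
obs_cdf = stats.norm(loc=0.0, scale=1.0).cdf

print(iqd(model_cdf, obs_cdf, grid))  # > 0: the two distributions differ
print(iqd(obs_cdf, obs_cdf, grid))    # 0.0: the divergence vanishes when F = G
```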