2011
DOI: 10.1111/j.1365-2427.2011.02592.x

Modelling everything everywhere: a new approach to decision‐making for water management under uncertainty

Abstract: Summary
1. There are increasing demands to predict ecohydrological responses to future changes in catchments but such predictions will be inevitably uncertain because of natural variability and different sources of knowledge (epistemic) uncertainty.
2. Policy setting and decision‐making should therefore reflect these inherent uncertainties in both model predictions and potential consequences.
3. This is the focus of a U.K. Natural Environment Research Council knowledge exchange project called the Catchment Cha…

Cited by 126 publications (92 citation statements)
References 27 publications (36 reference statements)
“…Reasoning can be used to establish constraints and relational rules between parameters, in accordance with relevant organizing principles (this needs to be elaborated via future modeling research), and in accordance with a higher level (global or regional) water balance model. These latter two (conformity with organizing principles and water balance scheme) are particularly relevant when attempting to develop a community hydrological model (Weiler and Beven, 2015) or a hyper-resolution model of everywhere (Beven, 2007;Beven et al, 2015;Beven and Alcock, 2012). Such constraints and relational rules can either be applied manually or by some computer-based procedure (see Gharari et al, 2014;Vidal et al, 2007).…”
Section: Contrasting Parameter Calibration and Parameter Allocation (mentioning)
confidence: 99%
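The statement above notes that relational rules between parameters can be applied either manually or by a computer-based procedure. A minimal sketch of what such an automated screen might look like is given below; the parameter names, bounds and rules are hypothetical illustrations, not taken from the cited studies.

```python
import numpy as np

rng = np.random.default_rng(42)

def satisfies_rules(p):
    """Hypothetical relational rules between soil parameters (illustrative only)."""
    porosity, field_capacity, k_top, k_deep = p
    return (
        field_capacity <= porosity   # retained water cannot exceed pore space
        and k_deep <= k_top          # conductivity assumed to decay with depth
    )

# Sample candidate parameter sets and keep only those consistent with the rules,
# mimicking a computer-based (rather than manual) application of the constraints.
candidates = rng.uniform(
    low=[0.3, 0.05, 1e-6, 1e-8],     # porosity, field capacity, k_top, k_deep
    high=[0.6, 0.45, 1e-3, 1e-4],
    size=(10_000, 4),
)
feasible = [p for p in candidates if satisfies_rules(p)]
print(f"{len(feasible)} of {len(candidates)} candidate sets respect the relational rules")
```

Only the parameter sets that respect the relational rules would then be carried forward into calibration or uncertainty analysis.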
“…These uncertainties are due to (1) data support, particularly with respect to precipitation, actual water use and land-surface characteristics; (2) water demand, supply and allocation algorithms, particularly with respect to irrigation demand estimation, reservoir operation and groundwater withdrawals; as well as (3) host large-scale models, particularly with respect to those calculations that determine surface-water and groundwater availability. It should be noted that here we only focus on epistemic sources of uncertainty, which needs to be addressed, quantified, communicated and possibly reduced (see Beven and Alcock, 2012). Table 3 summarizes various aspects of uncertainty related to data support, algorithmic procedures and host models, identified for estimation of water demand (see Nazemi and Wheater, 2015) as well as water supply and allocation (see Sects.…”
Section: Ideal Representation and Remaining Gaps (mentioning)
confidence: 99%
“…Moreover, population-based algorithms can provide methodological linkage to uncertainty assessment through various diagnostic tests. Guidelines are provided to test and falsify models through various evaluation criteria such as parametric identifiability (e.g., Beven, 2006b), Pareto optimality (Gupta et al, 1998), predictive uncertainty (Wagener et al, 2004) and limits of acceptability (Beven and Alcock, 2012).…”
Section: A Framework To Move Forward (mentioning)
confidence: 99%
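The limits-of-acceptability criterion cited above (Beven and Alcock, 2012) retains a simulation as behavioural only if its outputs fall inside observation-based error bounds at every evaluation point. The sketch below illustrates the idea with invented observations and limits; it is not the authors' implementation.

```python
import numpy as np

# Invented observations with lower/upper limits of acceptability (e.g. flow in m^3/s).
obs_lower = np.array([0.8, 1.5, 2.9, 1.1])
obs_upper = np.array([1.2, 2.1, 3.6, 1.7])

def within_limits(simulated):
    """True if the simulated series lies inside the limits at every evaluation point."""
    return bool(np.all((simulated >= obs_lower) & (simulated <= obs_upper)))

# Evaluate a few hypothetical model runs; only behavioural ones are retained.
runs = {
    "run_a": np.array([1.0, 1.8, 3.2, 1.4]),   # inside all limits -> accepted
    "run_b": np.array([1.0, 2.5, 3.2, 1.4]),   # exceeds the limit at t=1 -> rejected
}
behavioural = {name: sim for name, sim in runs.items() if within_limits(sim)}
print("behavioural runs:", list(behavioural))
```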
“…A benefit of using a logic tree, despite its simplicity, is the transparency in characterising epistemic uncertainties. In this regard, the logic tree approach is similar to the condition tree of analysis assumptions outlined by Beven and Alcock (2012). Nevertheless, difficulties arise because not all models, which analysts wish to apply, are based on consistent data or assumptions, and the probabilities of 5 alternatives in the logic tree are often poorly known, unknown, or unknowable (Bommer, 2012;Stein and Stein, 2013).…”
(mentioning)
confidence: 99%
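The logic-tree approach discussed above combines alternative models or assumptions by giving each branch a weight, with the acknowledged difficulty that those weights are often poorly known. A rough sketch under hypothetical branches, weights and predicted values is given below; it only illustrates how branch weights multiply down the tree and combine into a weighted outcome.

```python
from itertools import product

# Hypothetical logic tree: each level lists (label, weight, contribution) alternatives.
# Weights on each level sum to 1; contributions stand in for, e.g., a predicted flood level (m).
model_structure = [("model_A", 0.6, 2.0), ("model_B", 0.4, 2.6)]
climate_input = [("scenario_low", 0.5, 0.0), ("scenario_high", 0.5, 0.4)]

# Enumerate end branches; each gets the product of the weights along its path.
branches = []
for (m, wm, vm), (c, wc, vc) in product(model_structure, climate_input):
    branches.append((f"{m}/{c}", wm * wc, vm + vc))

for name, weight, value in branches:
    print(f"{name}: weight={weight:.2f}, predicted level={value:.1f} m")

weighted_mean = sum(weight * value for _, weight, value in branches)
print(f"weighted mean prediction: {weighted_mean:.2f} m")
```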