The Intergovernmental Panel on Climate Change (IPCC) assesses information relevant to the understanding of climate change and explores options for adaptation and mitigation. The IPCC reports communicate uncertainty by using a set of probability terms accompanied by global interpretational guidelines. The judgment literature indicates that there are large differences in the way people understand such phrases, and that their use may lead to confusion and errors in communication. We conducted an experiment in which subjects read sentences from the 2007 IPCC report and assigned numerical values to the probability terms. The respondents' judgments deviated significantly from the IPCC guidelines, even when the respondents had access to these guidelines. These results suggest that the method used by the IPCC is likely to convey levels of imprecision that are too high. We propose an alternative form of communicating uncertainty, illustrate its effectiveness, and suggest several additional ways to improve the communication of uncertainty.
The Intergovernmental Panel on Climate Change (IPCC) publishes periodic assessment reports informing policymakers and the public on issues relevant to the understanding of human-induced climate change. The IPCC uses a set of seven verbal descriptions of uncertainty, such as "unlikely" and "very likely," to convey the underlying imprecision of its forecasts and conclusions. We report results of an experiment comparing the effectiveness of communication using these words and their numerical counterparts. We show that the public consistently misinterprets the probabilistic statements in the IPCC report in a regressive fashion, and that there are large individual differences in the interpretation of these statements, which are associated with the respondents' ideology and their views and beliefs about climate change. Most importantly, our results suggest that a dual (verbal-numerical) scale would be superior to the current mode of communication, as it (a) increases the level of differentiation between the various terms, (b) increases the consistency of interpretation of these terms, and (c) increases the level of consistency with the IPCC guidelines. Notably, these positive effects are independent of the respondents' ideological and environmental views.
Effective translation between numerical and verbal representations of uncertainty is a concern shared by researchers in cognitive science and psychology, with applications to real-world risk management and decision support systems. While there is a substantial literature on such translations for point-wise probabilities, this paper contributes to the sparse literature on imprecise probability translations. Reanalysis of Budescu et al.'s [1] data on numerical interpretations of the verbal probability expressions (PEs) in the Intergovernmental Panel on Climate Change [2] fourth report revealed that negative wording has deleterious effects on lay judgments. Budescu et al. asked participants to interpret PEs in IPCC report sentences by providing lower, "best," and upper estimates of the probabilities that they thought the authors intended. There were four experimental conditions, determining whether participants were given any numerical guidelines for translating the PEs into numbers.

The first analysis focuses on 12 sentences in Budescu et al. that used the PE "very likely," "likely," "unlikely," or "very unlikely." A mixed beta regression modelling the lower, "best," and upper estimates revealed a less regressive mean and less dispersion for positive than for negative wording in all three estimates, for both the "very likely" and "likely" sentence sets. The Budescu et al. data also included a task asking for context-free translations of these PEs, and a similar pattern of results was found for that task. Negative wording therefore resulted in more regressive estimates and less consensus regardless of experimental condition.

The second analysis focuses on two statements that were positive-negative duals. Appropriate pairs of responses were assessed for conjugacy and additivity. A large majority of respondents were appropriately super- and sub-additive in their lower and upper probability estimates.
A mixed beta regression model of these three variables revealed that respondents were surprisingly close to obeying the conjugacy relationships for lower and upper probabilities.
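For reference, the conjugacy and additivity relations assessed here are the standard ones from imprecise probability theory (the notation below is ours, not the paper's): writing \underline{P} and \overline{P} for a respondent's lower and upper probability estimates of a statement A and its positive-negative dual \neg A, the relations are

```latex
% Conjugacy: lower and upper estimates of dual statements should mirror each other
\underline{P}(A) = 1 - \overline{P}(\neg A), \qquad
\overline{P}(A) = 1 - \underline{P}(\neg A)

% Superadditivity of lower estimates; subadditivity of upper estimates
\underline{P}(A) + \underline{P}(\neg A) \le 1, \qquad
\overline{P}(A) + \overline{P}(\neg A) \ge 1
```

For example, a respondent whose lower estimate for a "very likely" statement exceeds one minus their upper estimate for its "very unlikely" dual would violate conjugacy.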