Abstract. Projections of future climate change play a fundamental role in improving understanding of the climate system as well as characterizing societal risks and response options. The Scenario Model Intercomparison Project (ScenarioMIP) is the primary activity within Phase 6 of the Coupled Model Intercomparison Project (CMIP6) that will provide multi-model climate projections based on alternative scenarios of future emissions and land use changes produced with integrated assessment models. In this paper, we describe ScenarioMIP's objectives, experimental design, and its relation to other activities within CMIP6. The ScenarioMIP design is one component of a larger scenario process that aims to facilitate a wide range of integrated studies across the climate science, integrated assessment modeling, and impacts, adaptation, and vulnerability communities, and will form an important part of the evidence base in the forthcoming Intergovernmental Panel on Climate Change (IPCC) assessments. At the same time, it will provide the basis for investigating a number of targeted science and policy questions that are especially relevant to scenario-based analysis, including the role of specific forcings such as land use and aerosols, the effect of a peak and decline in forcing, the consequences of scenarios that limit warming to below 2 °C, the relative contributions to uncertainty from scenarios, climate models, and internal variability, and long-term climate system outcomes beyond the 21st century. To serve this wide range of scientific communities and address these questions, a design has been identified consisting of eight alternative 21st century scenarios plus one large initial condition ensemble and a set of long-term extensions, divided into two tiers defined by relative priority. Some of these scenarios will also provide a basis for variants planned to be run in other CMIP6-Endorsed MIPs to investigate questions related to specific forcings.
Harmonized, spatially explicit emissions and land use scenarios generated with integrated assessment models will be provided to participating climate modeling groups by late 2016, with the climate model simulations run within the 2017-2018 time frame, and output from the climate model
Published by Copernicus Publications on behalf of the European Geosciences Union.
The severity of damaging human-induced climate change depends not only on the magnitude of the change but also on the potential for irreversibility. This paper shows that the climate change that takes place due to increases in carbon dioxide concentration is largely irreversible for 1,000 years after emissions stop. Following cessation of emissions, removal of atmospheric carbon dioxide decreases radiative forcing, but is largely compensated by slower loss of heat to the ocean, so that atmospheric temperatures do not drop significantly for at least 1,000 years. Among illustrative irreversible impacts that should be expected if atmospheric carbon dioxide concentrations increase from current levels near 385 parts per million by volume (ppmv) to a peak of 450-600 ppmv over the coming century are irreversible dry-season rainfall reductions in several regions comparable to those of the "dust bowl" era and inexorable sea level rise. Thermal expansion of the warming ocean provides a conservative lower limit to irreversible global average sea level rise of at least 0.4-1.0 m if 21st century CO2 concentrations exceed 600 ppmv and 0.6-1.9 m for peak CO2 concentrations exceeding ≈1,000 ppmv. Additional contributions from glaciers and ice sheets to future sea level rise are uncertain but may equal or exceed several meters over the next millennium or longer.
Keywords: dangerous interference | precipitation | sea level rise | warming
More than 100 countries have adopted a global warming limit of 2 °C or below (relative to pre-industrial levels) as a guiding principle for mitigation efforts to reduce climate change risks, impacts and damages. However, the greenhouse gas (GHG) emissions corresponding to a specified maximum warming are poorly known owing to uncertainties in the carbon cycle and the climate response. Here we provide a comprehensive probabilistic analysis aimed at quantifying GHG emission budgets for the 2000-50 period that would limit warming throughout the twenty-first century to below 2 °C, based on a combination of published distributions of climate system properties and observational constraints. We show that, for the chosen class of emission scenarios, both cumulative emissions up to 2050 and emission levels in 2050 are robust indicators of the probability that twenty-first century warming will not exceed 2 °C relative to pre-industrial temperatures. Limiting cumulative CO2 emissions over 2000-50 to 1,000 Gt CO2 yields a 25% probability of warming exceeding 2 °C (and a limit of 1,440 Gt CO2 yields a 50% probability), given a representative estimate of the distribution of climate system properties. As known 2000-06 CO2 emissions were approximately 234 Gt CO2, less than half the proven economically recoverable oil, gas and coal reserves can still be emitted up to 2050 to achieve such a goal. Recent G8 Communiqués envisage halved global GHG emissions by 2050, for which we estimate a 12-45% probability of exceeding 2 °C, assuming 1990 as emission base year and a range of published climate sensitivity distributions. Emissions levels in 2020 are a less robust indicator, but for the scenarios considered, the probability of exceeding 2 °C rises to 53-87% if global GHG emissions are still more than 25% above 2000 levels in 2020.
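The budget-to-probability logic described above can be illustrated with a toy Monte Carlo sketch. This is not the paper's actual coupled carbon-cycle/climate analysis: the committed-warming term, the linear response assumption, and the lognormal distribution parameters below are all invented for illustration only.

```python
import math
import random

def exceedance_probability(budget_gtco2, n=100_000, seed=0):
    """Toy Monte Carlo estimate of P(warming > 2 degC) for a given
    cumulative 2000-50 CO2 budget (in Gt CO2).

    Assumes, purely for illustration, that peak warming scales linearly
    with cumulative emissions via an uncertain response coefficient,
    on top of warming already committed by pre-2000 emissions.
    """
    rng = random.Random(seed)
    committed = 0.9                     # assumed pre-2000 contribution, degC
    # Lognormal "response to cumulative emissions", degC per 1000 Gt CO2.
    # Median and spread are invented, not taken from the paper.
    mu, sigma = math.log(0.9), 0.4
    exceed = 0
    for _ in range(n):
        response = rng.lognormvariate(mu, sigma)
        warming = committed + response * budget_gtco2 / 1000.0
        if warming > 2.0:
            exceed += 1
    return exceed / n

# A larger cumulative budget always gives a higher exceedance probability.
p_small = exceedance_probability(1000)
p_large = exceedance_probability(1440)
print(p_small, p_large)
```

The qualitative behavior matches the abstract's point: exceedance probability is a monotonic, fairly steep function of the cumulative budget, which is why cumulative emissions are a robust indicator.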
Recent coordinated efforts, in which numerous climate models have been run for a common set of experiments, have produced large datasets of projections of future climate for various scenarios. These multi-model ensembles sample initial-condition, parameter, and structural uncertainties in model design, and they have prompted a variety of approaches to quantifying uncertainty in future climate in a probabilistic way. This paper outlines the motivation for using multi-model ensembles, reviews the methodologies published so far, and compares their results for regional temperature projections. The challenges in interpreting multi-model results, caused by the lack of verification of climate projections, the problems of model dependence, bias, and tuning, as well as the difficulty of making sense of an 'ensemble of opportunity', are discussed in detail.
Recent coordinated efforts, in which numerous general circulation climate models have been run for a common set of experiments, have produced large datasets of projections of future climate for various scenarios. These multimodel ensembles sample initial-condition, parameter, and structural uncertainties in model design, and they have prompted a variety of approaches to quantifying uncertainty in future climate change. International climate change assessments also rely heavily on these models. These assessments often provide equal-weighted averages as best-guess results, assuming that individual model biases will at least partly cancel and that a model-average prediction is more likely to be correct than a prediction from a single model, a view based on the finding that a multimodel average of present-day climate generally outperforms any individual model. This study outlines the motivation for using multimodel ensembles and discusses various challenges in interpreting them. Among these challenges are that the number of models in these ensembles is usually small, that their distribution in the model or parameter space is unclear, and that extreme behavior is often not sampled. Model skill in simulating present-day climate conditions is shown to relate only weakly to the magnitude of predicted change. It is thus unclear by how much confidence in future projections should increase based on improvements in simulating present-day conditions, a reduction of intermodel spread, or a larger number of models. Averaging model output may further lead to a loss of signal; for example, for precipitation change, where the predicted changes are spatially heterogeneous, the true expected change is very likely to be larger than a model average suggests. Last, there is little agreement on metrics to separate "good" from "bad" models, and there is concern that model development, evaluation, and posterior weighting or ranking all use the same datasets.
While the multimodel average appears to still be useful in some situations, these results show that more quantitative methods to evaluate model performance are critical to maximize the value of climate change projections from global models.
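The "loss of signal" point about averaging spatially heterogeneous precipitation changes can be made concrete with a small numeric sketch. The data here are synthetic, not model output: ten hypothetical models each project local changes of similar magnitude, but they disagree on where the sign flips, so grid-point averaging cancels the signal.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic "precipitation change" maps from 10 hypothetical models over
# 1000 grid points: every model projects changes of magnitude ~1 unit,
# but the spatial pattern (where it gets wetter vs. drier) differs.
n_models, n_points = 10, 1000
patterns = rng.standard_normal((n_models, n_points))

# Typical local change magnitude within each individual model.
per_model_magnitude = np.abs(patterns).mean()

# Magnitude of the multimodel-average map: much smaller, because
# disagreeing signs cancel in the grid-point average.
ensemble_mean_map = patterns.mean(axis=0)
ensemble_magnitude = np.abs(ensemble_mean_map).mean()

print(per_model_magnitude, ensemble_magnitude)
```

Because the ten patterns are independent, the averaged map's magnitude shrinks by roughly a factor of sqrt(10), so the ensemble mean badly understates the local change that each individual model actually projects.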
In the context of phase 5 of the Coupled Model Intercomparison Project, most climate simulations use prescribed atmospheric CO2 concentration and therefore do not interactively include the effect of carbon cycle feedbacks. However, the representative concentration pathway 8.5 (RCP8.5) scenario has additionally been run by earth system models with prescribed CO2 emissions. This paper analyzes the climate projections of 11 earth system models (ESMs) that performed both emission-driven and concentration-driven RCP8.5 simulations. When forced by RCP8.5 CO2 emissions, models simulate a large spread in atmospheric CO2; the simulated 2100 concentrations range between 795 and 1145 ppm. Seven out of the 11 ESMs simulate a larger CO2 concentration (on average by 44 ppm; 985 ± 97 ppm by 2100) and hence higher radiative forcing (by 0.25 W m⁻²) when driven by CO2 emissions than for the concentration-driven scenarios (941 ppm). However, most of these models already overestimate the present-day CO2 concentration, with the present-day biases reasonably well correlated with future atmospheric concentrations' departure from the prescribed concentration. The uncertainty in CO2 projections is mainly attributable to uncertainties in the response of the land carbon cycle. As a result of simulated higher CO2 concentrations than in the concentration-driven simulations, temperature projections are generally higher when ESMs are driven with CO2 emissions. Global surface temperature change by 2100 (relative to present day) increased by 3.9 ± 0.9 °C for the emission-driven simulations compared to 3.7 ± 0.7 °C in the concentration-driven simulations. Although the lower ends are comparable in both sets of simulations, the highest climate projections are significantly warmer in the emission-driven simulations because of stronger carbon cycle feedbacks.
We assess evidence relevant to Earth's equilibrium climate sensitivity per doubling of atmospheric CO2, characterized by an effective sensitivity S. This evidence includes feedback process understanding, the historical climate record, and the paleoclimate record. An S value lower than 2 K is difficult to reconcile with any of the three lines of evidence. The amount of cooling during the Last Glacial Maximum provides strong evidence against values of S greater than 4.5 K. Other lines of evidence in combination also show that this is relatively unlikely. We use a Bayesian approach to produce a probability density function (PDF) for S given all the evidence, including tests of robustness to difficult-to-quantify uncertainties and different priors. The 66% range is 2.6-3.9 K for our Baseline calculation and remains within 2.3-4.5 K under the robustness tests; corresponding 5-95% ranges are 2.3-4.7 K, bounded by 2.0-5.7 K (although such high-confidence ranges should be regarded more cautiously). This indicates a stronger constraint on S than reported in past assessments, by lifting the low end of the range. This narrowing occurs because the three lines of evidence agree and are judged to be largely independent and because of greater confidence in understanding feedback processes and in combining evidence. We identify promising avenues for further narrowing the range in S, in particular using comprehensive models and process understanding to address limitations in the traditional forcing-feedback paradigm for interpreting past changes.
Plain Language Summary: Earth's global "climate sensitivity" is a fundamental quantitative measure of the susceptibility of Earth's climate to human influence. A landmark report in 1979 concluded that it probably lies between 1.5 °C and 4.5 °C per doubling of atmospheric carbon dioxide, assuming that other influences on climate remain unchanged. In the 40 years since, it has appeared difficult to reduce this uncertainty range.
In this report we thoroughly assess all lines of evidence, including some new developments. We find that a large volume of consistent evidence now points to a more confident view of a climate sensitivity near the middle or upper part of this range. In particular, it now appears extremely unlikely that the climate sensitivity could be low enough to avoid substantial climate change (well in excess of 2 °C warming) under a high-emission future scenario. We remain unable to rule out that the sensitivity could be above 4.5 °C per doubling of carbon dioxide levels, although this is not likely.
©2020. American Geophysical Union. All Rights Reserved.
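The Bayesian combination described in this abstract can be sketched in miniature as a grid calculation. This is a toy illustration, not the paper's actual analysis: the three Gaussian likelihoods standing in for the evidence lines use invented means and spreads, and the prior is simply uniform on the grid.

```python
import numpy as np

def gaussian(x, mean, sd):
    """Unnormalized Gaussian likelihood on a grid."""
    return np.exp(-0.5 * ((x - mean) / sd) ** 2)

# Grid of candidate sensitivities S (K per CO2 doubling).
S = np.linspace(0.5, 8.0, 2000)
dx = S[1] - S[0]

# Three hypothetical, independent lines of evidence (parameters invented):
process    = gaussian(S, 3.1, 0.9)   # feedback process understanding
historical = gaussian(S, 2.9, 1.2)   # historical climate record
paleo      = gaussian(S, 3.3, 1.0)   # paleoclimate record

prior = np.ones_like(S)              # uniform prior on the grid

# Posterior is proportional to prior times the product of the
# independent likelihoods; normalize it to integrate to 1.
posterior = prior * process * historical * paleo
posterior /= posterior.sum() * dx

# Central 66% credible interval read off the posterior CDF.
cdf = np.cumsum(posterior) * dx
lo = S[np.searchsorted(cdf, 0.17)]
hi = S[np.searchsorted(cdf, 0.83)]
print(lo, hi)
```

The sketch shows the mechanism behind the narrowing the abstract reports: when largely independent likelihoods agree, their product is tighter than any single line of evidence on its own.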