Palaeoclimatology provides our only means of assessing climatic variations before the beginning of instrumental records. The various proxy variables used, however, have a number of limitations which must be adequately addressed and understood. Besides their obvious spatial and seasonal limitations, different proxies are also potentially limited in their ability to represent climatic variations over a range of different timescales. Simple correlations with instrumental data over the period since AD 1881 give some guide to which are the better proxies, indicating that coral- and ice-core-based reconstructions are poorer than tree-ring and historical ones. However, the quality of many proxy time series can deteriorate during earlier times. Suggestions are made for assessing proxy quality over longer periods than the last century by intercomparing neighbouring proxies and by comparison with less temporally resolved proxies such as borehole temperatures. We have averaged 17 temperature reconstructions (representing various seasons of the year), all extending back at least to the mid-seventeenth century, to form two annually resolved hemispheric series (NH10 and SH7). Over the 1901–91 period, NH10 has 36% variance in common with average NH summer (June to August) temperatures and 70% on decadal timescales. SH7 has 16% variance in common with average SH summer (December to February) temperatures and 49% on decadal timescales, markedly poorer than the reconstructed NH series. The coldest year of the millennium over the NH is AD 1601, the coldest decade 1691–1700 and the coldest century the seventeenth. A Principal Components Analysis (PCA) is performed on yearly values of the 17 reconstructions over the period AD 1660–1970. The correlation between PC1 and NH10 is 0.92, even though PC1 explains only 13.6% of the total variance of all 17 series. A similar PCA is performed on thousand-year-long General Circulation Model (GCM) data from the Geophysical Fluid Dynamics Laboratory (GFDL) and the Hadley Centre (HADCM2), sampling these for the same locations and seasons as the proxy data. For GFDL, the correlation between its PC1 and its NH10 is 0.89, while for HADCM2 the PCs group markedly differently. Cross-spectral analyses are performed on the proxy data and the GFDL model data in two frequency bands (0.02 and 0.03 cycles per year). Both analyses suggest that there is no large-scale coherency in the series on these timescales. This implies that, if the proxy data are meaningful, it should be relatively straightforward to detect a coherent near-global anthropogenic signal in surface temperature data.
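The hemispheric averaging, shared-variance, and PCA calculations described above can be illustrated with a short sketch. This is only a minimal, hypothetical reconstruction of that kind of workflow: the array names, the use of NumPy and scikit-learn, the synthetic placeholder data, and the simple 10-year block averaging for the decadal comparison are assumptions made for illustration, not the authors' actual code or data.

```python
# Minimal sketch (assumed workflow): average 17 proxy reconstructions into a
# hemispheric composite, run a PCA on the yearly values, and compare PC1 with
# that composite; also compute shared variance on annual and decadal timescales.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
years = np.arange(1660, 1971)                      # AD 1660-1970, as in the text
proxies = rng.standard_normal((years.size, 17))    # placeholder for the 17 series

# Hemispheric composite: unweighted average of the standardized series
z = (proxies - proxies.mean(axis=0)) / proxies.std(axis=0)
composite = z.mean(axis=1)                         # stands in for NH10

# PCA on the yearly values of all 17 series
pca = PCA()
pcs = pca.fit_transform(z)
pc1 = pcs[:, 0]
print("variance explained by PC1: %.1f%%" % (100 * pca.explained_variance_ratio_[0]))
print("corr(PC1, composite): %.2f" % np.corrcoef(pc1, composite)[0, 1])

# 'Variance in common' = squared correlation; decadal version from 10-yr means
def decadal_means(x):
    n = (x.size // 10) * 10
    return x[:n].reshape(-1, 10).mean(axis=1)

obs = rng.standard_normal(years.size)              # placeholder instrumental series
r_annual = np.corrcoef(composite, obs)[0, 1]
r_decadal = np.corrcoef(decadal_means(composite), decadal_means(obs))[0, 1]
print("shared variance: annual %.0f%%, decadal %.0f%%" % (100 * r_annual**2,
                                                          100 * r_decadal**2))
```

Squared correlation is used here as "variance in common"; note that the sign of PC1 is arbitrary, so its correlation with the composite should be read in absolute value.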
As climate change research becomes increasingly applied, the need for actionable information is growing rapidly. A key aspect of this requirement is the representation of uncertainties. The conventional approach to representing uncertainty in physical aspects of climate change is probabilistic, based on ensembles of climate model simulations. In the face of deep uncertainties, the known limitations of this approach are becoming increasingly apparent. An alternative is thus emerging which may be called a ‘storyline’ approach. We define a storyline as a physically self-consistent unfolding of past events, or of plausible future events or pathways. No a priori probability of the storyline is assessed; emphasis is placed instead on understanding the driving factors involved, and the plausibility of those factors. We introduce a typology of four reasons for using storylines to represent uncertainty in physical aspects of climate change: (i) improving risk awareness by framing risk in an event-oriented rather than a probabilistic manner, which corresponds more directly to how people perceive and respond to risk; (ii) strengthening decision-making by allowing one to work backward from a particular vulnerability or decision point, combining climate change information with other relevant factors to address compound risk and develop appropriate stress tests; (iii) providing a physical basis for partitioning uncertainty, thereby allowing the use of more credible regional models in a conditioned manner; and (iv) exploring the boundaries of plausibility, thereby guarding against false precision and surprise. Storylines also offer a powerful way of linking physical with human aspects of climate change.
We analyse possible causes of twentieth century near-surface temperature change. We use an ‘optimal detection’ methodology to compare seasonal and annual data from the coupled atmosphere-ocean general circulation model HadCM2 with observations averaged over a range of spatial and temporal scales. The results indicate that the increases in temperature observed in the latter half of the century have been caused by warming from anthropogenic increases in greenhouse gases offset by cooling from tropospheric sulfate aerosols, rather than by natural variability, either internal or externally forced. We also find that greenhouse gases are likely to have contributed significantly to the warming in the first half of the century. In addition, natural effects may have contributed to this warming. Assuming one particular reconstruction of total solar irradiance to be correct implies, when we take the seasonal cycle into account, that solar effects have contributed significantly to the warming observed in the early part of the century, regardless of any relative error in the amplitudes of the anthropogenic forcings prescribed in the model. However, this is not the case with an alternative reconstruction of total solar irradiance, based more on the amplitude than the length of the solar cycle. We also find evidence for volcanic influences on twentieth century near-surface temperatures. The signature of the eruption of Mount Pinatubo is detected using annual-mean data. We also find evidence for a volcanic influence on warming in the first half of the century, associated with a reduction in mid-century volcanism.
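‘Optimal detection’ is, in essence, a generalised least-squares regression of observations onto model-simulated response patterns, with the noise covariance estimated from control-run variability; detection corresponds to a scaling factor significantly greater than zero. The single-fingerprint sketch below is a simplified illustration on synthetic data; the array sizes, the diagonal regularisation of the covariance, and the Gaussian confidence interval are assumptions for the example, not the HadCM2 study's implementation.

```python
# Sketch of single-pattern optimal fingerprinting:
#   y = beta * x + noise, with noise covariance C estimated from a control run.
# A scaling factor beta whose confidence interval excludes zero indicates detection.
import numpy as np

rng = np.random.default_rng(1)
n_space = 50          # number of spatio-temporal elements (illustrative)
n_control = 300       # control-run samples used to estimate internal variability

x = rng.standard_normal(n_space)                      # model-simulated response pattern
control = rng.standard_normal((n_control, n_space))   # control-run noise realisations
y = 0.8 * x + control[0]                              # synthetic "observations"

# Noise covariance from the control run, lightly regularised so it is invertible
C = np.cov(control[1:], rowvar=False) + 0.1 * np.eye(n_space)
Cinv = np.linalg.inv(C)

# Generalised least-squares estimate of the scaling factor beta and its uncertainty
beta_hat = (x @ Cinv @ y) / (x @ Cinv @ x)
ci = 1.96 * np.sqrt(1.0 / (x @ Cinv @ x))

print("beta = %.2f +/- %.2f" % (beta_hat, ci))
print("signal detected" if beta_hat - ci > 0 else "not detected at the 5% level")
```

In multi-signal applications the same regression is carried out with several fingerprints at once (e.g. greenhouse gas, sulfate aerosol, solar, volcanic), so that their amplitudes are estimated jointly rather than one at a time.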
This paper addresses the question of where we now stand with respect to detection and attribution of an anthropogenic climate signal. Our ability to estimate natural climate variability, against which claims of anthropogenic signal detection must be made, is reviewed. The current situation suggests that control runs of global climate models may give the best estimates of natural variability on a global basis, estimates that appear to be accurate to within a factor of 2 or 3 at the multidecadal timescales used in detection work. Present uncertainties in both observations and model-simulated anthropogenic signals in near-surface air temperature are estimated. The uncertainty in model-simulated signals is, in places, as large as the signal to be detected. Two different, but complementary, approaches to detection and attribution are discussed in the context of these uncertainties. Applying one of the detection strategies, it is found that the change in the near-surface, June through August air temperature field over the last 50 years is generally different, at a significance level of 5%, from that expected from model-based estimates of natural variability. Greenhouse gases alone cannot explain the observed change. Two of four climate models forced by greenhouse gases and direct sulfate aerosols produce results consistent with the current climate change observations, while the consistency of the other two depends on which model's anthropogenic fingerprints are used. A recent integration with additional anthropogenic forcings (the indirect effects of sulfate aerosols and tropospheric ozone) and more complete tropospheric chemistry produced results whose signal amplitude and pattern were consistent with current observations, provided the model's fingerprint is used and detection is carried out over only the last 30 years of annually averaged data. This single integration currently cannot be corroborated and provides no opportunity to estimate the uncertainties inherent in the results, uncertainties that are thought to be large and poorly known. These results illustrate the current large uncertainty in the magnitude and spatial pattern of the direct and indirect sulfate forcing and climate response. They also show that detection statements depend on model-specific fingerprints, time period, and the seasonal character of the signal, dependencies that have not been well explored. Most, but not all, results suggest that recent changes in global climate inferred from surface air temperature are likely not due solely to natural causes. At present it is not possible to make a very confident statement about the relative contributions of specific natural and anthropogenic forcings to observed climate change. One of the main reasons is that fully realistic simulations of climate change due to the combined effects of all anthropogenic and natural forcing mechanisms have yet to be computed. A list of recommendations for reducing some of the uncertainties that currently hamper detection and attribution studies is presented.
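The 5% significance comparison against model-based natural variability can be made concrete by comparing an observed multi-decadal trend with the distribution of equal-length trends drawn from a long control run. The sketch below is schematic: it collapses the June through August temperature field to a single area-averaged series and uses synthetic placeholder data, so every name and number in it is an assumption rather than part of the study.

```python
# Schematic significance test: is the observed 50-yr trend outside the range
# expected from a model control run containing internal variability only?
import numpy as np

rng = np.random.default_rng(2)

def trend_per_decade(series):
    """Least-squares linear trend, expressed per decade."""
    t = np.arange(series.size)
    return 10.0 * np.polyfit(t, series, 1)[0]

obs = 0.01 * np.arange(50) + 0.1 * rng.standard_normal(50)   # synthetic JJA means
obs_trend = trend_per_decade(obs)

# 1000-year control run chopped into overlapping 50-year segments
control = 0.1 * rng.standard_normal(1000)
null_trends = np.array([trend_per_decade(control[i:i + 50])
                        for i in range(control.size - 50)])

# Two-sided p-value: fraction of control trends at least as large in magnitude
p = np.mean(np.abs(null_trends) >= abs(obs_trend))
print("observed trend %.3f K/decade, p = %.3f" % (obs_trend, p))
print("different from natural variability at the 5% level" if p < 0.05
      else "not distinguishable from natural variability")
```

A full pattern-based detection strategy would apply the same logic to the spatial field (or its projection onto a model fingerprint) rather than to a single area average, but the structure of the test is the same.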