We present measurements of the stable carbon isotope ratio in air extracted from Antarctic ice core and firn samples. The same samples were previously used by Etheridge and co-workers to construct a high-precision 1000-year record of atmospheric CO2 concentration, featuring a close link between the ice and modern records and high time resolution. Here, we start by confirming the trend in the Cape Grim in situ δ13C record from 1982 to 1996, and extend it back to 1978 using the Cape Grim Air Archive. The firn air δ13C agrees with the Cape Grim record, but only after correction for gravitational separation at depth, for diffusion effects associated with disequilibrium between the atmosphere and firn, and allowance for a latitudinal gradient in δ13C between Cape Grim and the Antarctic coast. Complex calibration strategies are required to cope with several additional systematic influences on the ice core δ13C record. Errors are assigned to each ice core value to reflect statistical and systematic biases (between ±0.025‰ and ±0.07‰); uncertainties (of up to ±0.05‰) in core-versus-core, ice-versus-firn and firn-versus-troposphere comparisons are described separately. An almost continuous atmospheric history of δ13C over 1000 years results, exhibiting significant decadal-to-century-scale variability unlike that from earlier proxy records. The decrease in δ13C from 1860 to 1960 involves a series of steps, confirming enhanced sensitivity of δ13C to decadal-timescale forcing compared to the CO2 record. Synchronous with a "Little Ice Age" CO2 decrease, an enhancement of δ13C implies a terrestrial response to cooler temperatures. Between 1200 AD and 1600 AD, the atmospheric δ13C appears stable.
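The gravitational correction mentioned above can be illustrated with the standard barometric approximation for isotope enrichment in stagnant firn air, Δδ ≈ Δm·g·z/(R·T) (in per mil). This is a minimal sketch of the well-known approximation, not necessarily the exact correction procedure used in the paper; the 70 m depth and 245 K temperature are assumed illustrative values for a cold Antarctic site.

```python
# Gravitational separation enrichment of 13CO2 relative to 12CO2 in firn air.
# Sketch of the barometric approximation: delta_grav ≈ Δm·g·z / (R·T), in per mil.
def grav_enrichment_permil(depth_m, temp_k, delta_mass_kg_mol=1.0e-3,
                           g=9.81, r_gas=8.314):
    """Per-mil enrichment of the heavier isotopologue at a given depth.

    delta_mass_kg_mol: mass difference between isotopologues (1 g/mol for 13C vs 12C).
    """
    return delta_mass_kg_mol * g * depth_m / (r_gas * temp_k) * 1000.0

# Example: ~70 m firn column at ~245 K (assumed values)
print(round(grav_enrichment_permil(70.0, 245.0), 3))  # ≈ 0.337 per mil
```

An enrichment of a few tenths of a per mil at the firn close-off depth is comparable to the century-scale atmospheric δ13C signal itself, which is why the correction matters.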
Abstract. Atmospheric greenhouse gas (GHG) concentrations are at unprecedented, record-high levels compared to the last 800 000 years. Those elevated GHG concentrations warm the planet and – partially offset by the net cooling effect of aerosols – are largely responsible for the observed warming over the past 150 years. An accurate representation of GHG concentrations is hence important for understanding and modelling recent climate change. So far, community efforts to create composite datasets of GHG concentrations with seasonal and latitudinal information have focused on marine boundary layer conditions and recent trends since the 1980s. Here, we provide consolidated datasets of historical atmospheric concentrations (mole fractions) of 43 GHGs to be used in the Coupled Model Intercomparison Project – Phase 6 (CMIP6) experiments. The presented datasets are based on the AGAGE and NOAA networks, firn and ice core data, archived air data, and a large set of published studies. In contrast to previous intercomparisons, the new datasets are latitudinally resolved and include seasonality. We focus on the period 1850–2014 for historical CMIP6 runs, but data are also provided for the last 2000 years. We provide consolidated datasets at various spatiotemporal resolutions for carbon dioxide (CO2), methane (CH4) and nitrous oxide (N2O), as well as 40 other GHGs, namely 17 ozone-depleting substances, 11 hydrofluorocarbons (HFCs), 9 perfluorocarbons (PFCs), sulfur hexafluoride (SF6), nitrogen trifluoride (NF3) and sulfuryl fluoride (SO2F2). In addition, we provide three equivalence species that aggregate concentrations of GHGs other than CO2, CH4 and N2O, weighted by their radiative forcing efficiencies. For the year 1850, which is used for pre-industrial control runs, we estimate annual global-mean surface concentrations of CO2 at 284.3 ppm, CH4 at 808.2 ppb and N2O at 273.0 ppb.
The data are available at https://esgf-node.llnl.gov/search/input4mips/ and http://www.climatecollege.unimelb.edu.au/cmip6. While the minimum CMIP6 recommendation is to use the global- and annual-mean time series, modelling groups can also choose our monthly and latitudinally resolved concentrations, which imply a stronger radiative forcing in the Northern Hemisphere winter (due to the latitudinal gradient and seasonality).
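The radiative-efficiency weighting behind the equivalence species can be sketched as follows. This is an illustrative toy only: the three-gas inventory and the rounded radiative efficiencies (AR5-era values, W m⁻² ppb⁻¹) are assumptions, not the dataset's actual 40-gas aggregation or coefficients.

```python
# Aggregating minor GHGs into a single equivalence concentration,
# weighted by each gas's radiative efficiency. Illustrative values only.
RAD_EFF = {            # approximate radiative efficiencies, W m^-2 ppb^-1
    "CFC-11": 0.26,
    "CFC-12": 0.32,
    "SF6":    0.57,
}

def equivalent_ppb(concs_ppb, ref_species="CFC-12"):
    """Concentration of ref_species exerting the same total forcing
    as the efficiency-weighted sum of the input gases."""
    total_forcing = sum(RAD_EFF[gas] * c for gas, c in concs_ppb.items())
    return total_forcing / RAD_EFF[ref_species]

# e.g. 0.23 ppb CFC-11 + 0.52 ppb CFC-12 + 0.009 ppb SF6, as "CFC-12-eq"
print(round(equivalent_ppb({"CFC-11": 0.23, "CFC-12": 0.52, "SF6": 0.009}), 3))
```

Expressing many trace gases as one equivalence concentration lets climate models that carry only a few radiation-code species still represent the aggregate forcing.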
Abstract. There is a growing consensus that land surface models (LSMs) that simulate terrestrial biosphere exchanges of matter and energy must be better constrained with data to quantify and address their uncertainties. FLUXNET, an international network of sites that measure the land surface exchanges of carbon, water and energy using the eddy covariance technique, is a prime source of data for model improvement. Here we outline a multi-stage process for "fusing" (i.e. linking) LSMs with FLUXNET data to generate better models with quantifiable uncertainty. First, we describe FLUXNET data availability, and its random and systematic biases. We then introduce methods for assessing LSM model runs against FLUXNET observations in temporal and spatial domains. These assessments are a prelude to more formal model-data fusion (MDF). MDF links model to data, based on error weightings. In theory, MDF produces optimal analyses of the modelled system, but there are practical problems. We first discuss how to set model errors and initial conditions. In both cases incorrect assumptions will affect the outcome of the MDF. We then review the problem of equifinality, whereby multiple combinations of parameters can produce similar model output. Fusing multiple independent and orthogonal data provides a means to limit equifinality. We then show how parameter probability density functions (PDFs) from MDF can be used to interpret model validity, and to propagate errors into model outputs. Posterior parameter distributions are a useful way to assess the success of MDF, combined with a determination of whether model residuals are Gaussian. If the MDF scheme provides evidence for temporal variation in parameters, then that is indicative of a critical missing dynamic process.
A comparison of parameter PDFs generated with the same model from multiple FLUXNET sites can provide insights into the concept and validity of plant functional types (PFTs): we would expect similar parameter estimates among sites sharing a single PFT. We conclude by identifying five major model-data fusion challenges for the FLUXNET and LSM communities: (1) to determine appropriate use of current data and to explore the information gained in using longer time series; (2) to avoid confounding effects of missing process representation on parameter estimation; (3) to assimilate more data types, including those from earth observation; (4) to fully quantify uncertainties arising from data bias, model structure, and initial conditions problems; and (5) to carefully test current model concepts (e.g. PFTs) and guide development of new concepts.
Published by Copernicus Publications on behalf of the European Geosciences Union.
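The error-weighted linking of model to data that MDF performs can be made concrete with a toy example. Everything here is assumed for illustration: a one-parameter exponential "model", four synthetic "observations", and a fixed observation error; real LSM–FLUXNET fusion involves many parameters and far richer cost functions.

```python
import math

# Minimal sketch of error-weighted model-data fusion:
# minimise a chi-square cost (residuals weighted by observation error)
# over a one-dimensional parameter grid. Toy values throughout.
def model(p, t):
    return p * math.exp(-0.3 * t)          # single free parameter p

obs = [(0, 2.1), (1, 1.4), (2, 1.15), (3, 0.8)]  # (time, observed flux)
sigma = 0.2                                       # assumed observation error

def cost(p):
    # chi-square: sum of squared, error-weighted residuals
    return sum(((model(p, t) - y) / sigma) ** 2 for t, y in obs)

best_p = min((0.01 * i for i in range(1, 400)), key=cost)
print(round(best_p, 2))  # best-fit parameter on the grid
```

Repeating such a fit with perturbed observations (or with a sampler instead of a grid) yields the parameter PDFs discussed above; inspecting whether the residuals at `best_p` look Gaussian is the corresponding diagnostic.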