Historical reanalyses that span more than a century are needed for a wide range of studies, from understanding large‐scale climate trends to diagnosing the impacts of individual historical extreme weather events. The Twentieth Century Reanalysis (20CR) Project is an effort to fill this need. It is supported by the National Oceanic and Atmospheric Administration (NOAA), the Cooperative Institute for Research in Environmental Sciences (CIRES), and the U.S. Department of Energy (DOE), and is facilitated by collaboration with the international Atmospheric Circulation Reconstructions over the Earth initiative. 20CR is the first ensemble of sub‐daily global atmospheric conditions spanning over 100 years. This provides a best estimate of the weather at any given place and time as well as an estimate of its confidence and uncertainty. While extremely useful, version 2c of this dataset (20CRv2c) has several significant issues, including inaccurate estimates of confidence and a global sea level pressure bias in the mid‐19th century. These and other issues can reduce its effectiveness for studies at many spatial and temporal scales. Therefore, the 20CR system underwent a series of developments to generate a significant new version of the reanalysis. The version 3 system (NOAA‐CIRES‐DOE 20CRv3) uses upgraded data assimilation methods including an adaptive inflation algorithm; has a newer, higher‐resolution forecast model that specifies dry air mass; and assimilates a larger set of pressure observations. These changes have improved the ensemble‐based estimates of confidence, removed spin‐up effects in the precipitation fields, and diminished the sea‐level pressure bias. Other improvements include more accurate representations of storm intensity, smaller errors, and large‐scale reductions in model bias. The 20CRv3 system is comprehensively reviewed, focusing on the aspects that have ameliorated issues in 20CRv2c. 
Despite the many improvements, some challenges remain, including a systematic bias in tropical precipitation and time‐varying biases in southern high‐latitude pressure fields.
The performance of a new historical reanalysis, the NOAA-CIRES-DOE 20th Century Reanalysis Version 3 (20CRv3), is evaluated via comparisons with other reanalyses and independent observations. This dataset provides global, 3-hourly estimates of the atmosphere from 1806 to 2015 by assimilating only surface pressure observations and prescribing sea surface temperature, sea ice concentration, and radiative forcings. Comparisons with independent observations, other reanalyses, and satellite products suggest that 20CRv3 can reliably produce atmospheric estimates on scales ranging from weather events to long-term climatic trends. Not only does 20CRv3 recreate a “best estimate” of the weather, including extreme events, it also provides an estimate of its confidence through the use of an ensemble. Surface pressure statistics suggest that these confidence estimates are reliable. Comparisons with independent upper-air observations in the Northern Hemisphere demonstrate that 20CRv3 has skill throughout the 20th century. Upper-air fields from 20CRv3 in the late 20th century and early 21st century correlate well with full-input reanalyses, and the correlation is predicted by the confidence fields from 20CRv3. The skill of analyzed 500-hPa geopotential heights from 20CRv3 for 1979-2015 is comparable to that of modern operational 3- to 4-day forecasts. Finally, 20CRv3 performs well on climate timescales. Long time series and multidecadal averages of mass, circulation, and precipitation fields agree well with modern reanalyses and station- and satellite-based products. 20CRv3 is also able to capture trends in tropospheric layer temperatures that correlate well with independent products in the 20th century, placing recent trends in a longer historical context.
Observations are the foundation for understanding the climate system. Yet, currently available land meteorological data are highly fractured into various global, regional, and national holdings for different variables and time scales, from a variety of sources, and in a mixture of formats. Added to this, many data are still inaccessible for analysis and usage. To meet modern scientific and societal demands as well as emerging needs such as the provision of climate services, it is essential that we improve the management and curation of available land-based meteorological holdings. We need a comprehensive global set of data holdings, of known provenance, that is truly integrated both across essential climate variables (ECVs) and across time scales to meet the broad range of stakeholder needs. These holdings must be easily discoverable, made available in accessible formats, and backed up by multitiered user support. The present paper provides a high-level overview, based upon broad community input, of the steps that are required to bring about this integration. The significant challenge is to find a sustained means to realize this vision. This requires a long-term international program. The database that results will transform our collective ability to provide societally relevant research, analysis, and predictions in many weather- and climate-related application areas across much of the globe.
Lagrangian measurements from passive ocean instruments provide a useful source of data for estimating and forecasting the ocean's state (velocity field, salinity field, etc.). However, trajectories from these instruments are often highly nonlinear, leading to difficulties with widely used data assimilation algorithms such as the ensemble Kalman filter (EnKF). Additionally, the velocity field is often modeled as a high-dimensional variable, which precludes the use of more accurate methods such as the particle filter (PF). Here, a hybrid particle-ensemble Kalman filter is developed that applies the EnKF update to the potentially high-dimensional velocity variables, and the PF update to the relatively low-dimensional, highly nonlinear drifter position variable. This algorithm is tested with twin experiments on the linear shallow water equations. In experiments with infrequent observations, the hybrid filter consistently outperformed the EnKF, both by better capturing the Bayesian posterior and by better tracking the truth.
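The split update described in this abstract can be illustrated with a toy sketch. This is not the paper's implementation: it assumes an Ne-member ensemble, a scalar drifter position, a perturbed-observation EnKF form, and multinomial resampling, all chosen here for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

def hybrid_update(U, X, y, R):
    """One hybrid particle-EnKF analysis step (toy sketch).

    U : (Ne, Nu) ensemble of the high-dimensional velocity variables
    X : (Ne,)    ensemble of a scalar drifter position (nonlinear part)
    y : observed drifter position;  R : observation-error variance
    """
    Ne = X.size
    # EnKF step: update the velocities through their cross-covariance
    # with the observed position (perturbed-observation form).
    Xp = X - X.mean()
    Up = U - U.mean(axis=0)
    P_ux = Up.T @ Xp / (Ne - 1)        # cross-covariance, shape (Nu,)
    P_xx = Xp @ Xp / (Ne - 1)          # position ensemble variance
    K = P_ux / (P_xx + R)              # Kalman gain, shape (Nu,)
    innov = y + np.sqrt(R) * rng.standard_normal(Ne) - X
    U_a = U + np.outer(innov, K)
    # PF step: weight positions by the Gaussian observation likelihood,
    # then resample; the same indices are applied to the updated
    # velocities so that ensemble members stay paired.
    w = np.exp(-0.5 * (y - X) ** 2 / R)
    w /= w.sum()
    idx = rng.choice(Ne, size=Ne, p=w)
    return U_a[idx], X[idx]
```

The design intent follows the abstract: the PF step corrects the non-Gaussian, strongly nonlinear position marginal, while the EnKF step keeps the update of the high-dimensional velocity field tractable.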
Four state-of-the-art satellite-based estimates of ocean surface latent heat fluxes (LHFs) extending over three decades are analyzed, focusing on the interannual variability and trends of near-global averages and regional patterns. Detailed inter-comparisons are made with other datasets including: (i) reduced observation reanalyses (RedObs), whose exclusion of satellite data renders them an important independent diagnostic tool; (ii) a moisture budget residual LHF estimate using reanalysis moisture transport, atmospheric storage, and satellite precipitation; (iii) the ECMWF Reanalysis 5 (ERA5); (iv) Remote Sensing Systems (RSS) single-sensor passive microwave and scatterometer wind speed retrievals; and (v) several sea-surface temperature (SST) datasets. Large disparities remain in near-global satellite LHF trends and their regional expression over the 1990-2010 period, during which time the Interdecadal Pacific Oscillation changed sign. The budget residual diagnostics support the smaller RedObs LHF trends. The satellites, ERA5, and RedObs are reasonably consistent in identifying contributions by the 10m wind speed variations to the LHF trend patterns. However, contributions by the near-surface vertical humidity gradient from satellites and ERA5 trend upward in time with respect to the RedObs ensemble and show less agreement in trend patterns. Problems with wind speed retrievals from Special Sensor Microwave Imager / Sounder satellite sensors, excessive upward trends in Optimal Interpolation Sea Surface Temperature (OISST AVHRR-Only) data used in most satellite LHF estimates, and uncertainties associated with poor satellite coverage before the mid-1990s are noted. Possibly erroneous trends are also identified in ERA5 LHF associated with the onset of scatterometer wind data assimilation in the early 1990s.
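For context, satellite LHF estimates of the kind compared here are typically built on a bulk-aerodynamic formula, LHF = ρ·Lv·CE·U·(qs − qa), which combines the 10m wind speed and the near-surface vertical humidity gradient whose trend contributions the abstract discusses. The sketch below uses illustrative coefficient values that are assumptions for this example, not values from any of the datasets above.

```python
def bulk_lhf(u10, q_sat_surface, q_air, rho=1.2, lv=2.5e6, ce=1.2e-3):
    """Bulk-aerodynamic latent heat flux, in W m^-2.

    u10           : 10m wind speed (m/s)
    q_sat_surface : saturation specific humidity at the SST (kg/kg)
    q_air         : near-surface air specific humidity (kg/kg)
    rho, lv, ce   : air density, latent heat of vaporization, and
                    transfer coefficient -- illustrative values only.
    """
    return rho * lv * ce * u10 * (q_sat_surface - q_air)
```

The formula makes the attribution question in the abstract concrete: a trend in LHF can come from the wind speed factor, from the humidity-gradient factor, or from both, and the datasets compared here disagree mainly on the humidity-gradient contribution.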
Numerical models of ocean circulation often depend on parameters that must be tuned to match either results from laboratory experiments or field observations. This study demonstrates that an initial, suboptimal estimate of a parameter in a model of a small bay can be improved by assimilating observations of trajectories of passive drifters. The parameter of interest is the Manning's n coefficient of friction in a small inlet of the bay, which had been tuned to match velocity observations from 2011. In 2013, the geometry of the inlet had changed, and the friction parameter was no longer optimal. Results from synthetic experiments demonstrate that assimilation of drifter trajectories improves the estimate of n, both when the drifters are located in the same region as the parameter of interest and when the drifters are located in a different region of the bay. Real drifter trajectories from field experiments in 2013 also are assimilated, and results are compared with velocity observations. When the real drifters are located away from the region of interest, the results depend on the time interval (with respect to the full available trajectories) over which assimilation is performed. When the drifters are in the same region as the parameter of interest, the value of n estimated with assimilation yields improved estimates of velocity throughout the bay.
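A common way to estimate a parameter such as Manning's n by assimilation is state augmentation: the parameter is appended to the state vector and updated through its ensemble covariance with the observed quantities, here drifter positions. The sketch below is a hedged toy version (scalar parameter, scalar observed displacement, perturbed-observation EnKF), not the study's actual system.

```python
import numpy as np

rng = np.random.default_rng(1)

def enkf_parameter_update(n_ens, x_ens, y_obs, r_var):
    """Augmented-state EnKF step: update an ensemble of friction
    parameters n_ens from an observed drifter displacement y_obs.

    n_ens : (Ne,) parameter ensemble
    x_ens : (Ne,) model-predicted drifter displacements for each member
    r_var : observation-error variance
    """
    Ne = n_ens.size
    dn = n_ens - n_ens.mean()
    dx = x_ens - x_ens.mean()
    cov_nx = np.sum(dn * dx) / (Ne - 1)   # parameter-observation covariance
    var_x = np.var(x_ens, ddof=1)
    gain = cov_nx / (var_x + r_var)
    # Perturbed observations keep the analysis ensemble spread consistent.
    y_pert = y_obs + np.sqrt(r_var) * rng.standard_normal(Ne)
    return n_ens + gain * (y_pert - x_ens)
```

Because the cross-covariance is estimated directly from the ensemble, no tangent-linear model is needed: the circulation model simply maps each candidate n to predicted drifter positions, and the update pulls the parameter ensemble toward values whose predictions match the observed trajectories.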
Particle filtering methods for data assimilation may suffer from the "curse of dimensionality," where the required ensemble size grows rapidly as the dimension increases. It would, therefore, be useful to know a priori whether a particle filter is feasible to implement in a given system. Previous work provides an asymptotic relation between the necessary ensemble size and an exponential function of t², a statistic that depends on observation-space quantities and that is related to the system dimension when the number of observations is large; for linear, Gaussian systems, the statistic t² can be computed from eigenvalues of an appropriately normalized covariance matrix. Tests with a low-dimensional system show that these asymptotic results remain useful when the system is nonlinear, with either the standard or optimal proposal implementation of the particle filter. This study explores approximations to the covariance matrices that facilitate computation in high-dimensional systems, as well as different methods to estimate the accumulated system noise covariance for the optimal proposal. Since t² may be approximated using an ensemble from a simpler data assimilation scheme, such as the ensemble Kalman filter, the asymptotic relations thus allow an estimate of the ensemble size required for a particle filter before its implementation. Finally, the improved performance of particle filters with the optimal proposal, relative to those using the standard proposal, in the same low-dimensional system is demonstrated.
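The abstract notes that t² may be approximated using an ensemble from a simpler scheme such as an EnKF. A hedged sketch of that idea: here t² is taken as the ensemble variance of the observation log-likelihood (the log of the standard-proposal particle weight), a common proxy in this literature; a diagonal observation-error covariance and a linear observation operator H are assumed for simplicity.

```python
import numpy as np

def t2_estimate(ens, y, H, r_diag):
    """Approximate the degeneracy statistic t^2 from an existing ensemble.

    ens    : (Ne, Nx) ensemble from a simpler scheme (e.g. an EnKF)
    y      : (Ny,) observation vector
    H      : (Ny, Nx) linear observation operator
    r_diag : (Ny,) diagonal of the observation-error covariance
    """
    innov = y - ens @ H.T                         # (Ne, Ny) innovations
    loglik = -0.5 * np.sum(innov ** 2 / r_diag, axis=1)
    # One commonly quoted asymptotic scaling is that the required
    # particle-filter ensemble size grows roughly like exp(t2 / 2).
    return np.var(loglik, ddof=1)
```

Because the log-likelihood variance accumulates one contribution per effective observation-space direction, t² grows with the number of informative observations, which is why particle filters degenerate as that number increases.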