We present high‐resolution near‐term ensemble projections of hydroclimatic changes over the contiguous U.S. using a regional climate model (RegCM4) that dynamically downscales 11 global climate models from phase five of the Coupled Model Intercomparison Project (CMIP5) at 18 km horizontal grid spacing. All model integrations span 41 years in the historical period (1965–2005) and 41 years in the near‐term future period (2010–2050) under Representative Concentration Pathway 8.5 and cover a domain that includes the contiguous U.S. and parts of Canada and Mexico. Should emissions continue to rise, surface temperatures in every region of the U.S. will reach a new climate norm well before the mid‐21st century, regardless of the magnitude of regional warming. Significant warming will likely intensify the regional hydrological cycle by accelerating the historical trends in cold, warm, and wet extremes. The future temperature response will be partly regulated by changes in snow hydrology over regions that historically receive a major portion of cold‐season precipitation as snow. Our results indicate that Clausius‐Clapeyron scaling holds at regional scales, with each degree Celsius of surface warming leading to a 7.4% increase in precipitation from extremes. More importantly, both winter (snow) and summer (liquid) extremes are projected to increase across the U.S. These changes in precipitation characteristics will be driven by a shift toward shorter and wetter seasons. Overall, the projected changes in the regional hydroclimate can have substantial impacts on natural and human systems across the U.S.
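The 7.4%‐per‐degree figure above can be illustrated with a minimal sketch. Note the function name and the choice to compound the rate over multi‐degree warming are illustrative assumptions, not details from the study:

```python
def scaled_extreme_precip(p_hist_mm, delta_t_c, rate=0.074):
    """Scale a historical extreme-precipitation depth by a
    Clausius-Clapeyron-like rate (7.4% per degree C, per the ensemble
    result above), compounded over delta_t_c degrees of warming.
    Compounding (rather than linear scaling) is an assumption here."""
    return p_hist_mm * (1.0 + rate) ** delta_t_c

# A hypothetical 100 mm historical extreme under 2 C of regional warming:
print(round(scaled_extreme_precip(100.0, 2.0), 1))  # 115.3
```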
Probable maximum precipitation (PMP), defined as the largest rainfall depth that could physically occur under a series of adverse atmospheric conditions, has been an important design criterion for critical infrastructure such as dams and nuclear power plants. To understand how PMP may respond to projected future climate forcings, we used a physics‐based numerical weather simulation model to estimate PMP across various durations and areas over the Alabama‐Coosa‐Tallapoosa (ACT) River Basin in the southeastern United States. Six sets of Weather Research and Forecasting (WRF) model experiments driven by both reanalysis and global climate model projections, with a total of 120 storms, were conducted. The depth‐area‐duration relationship was derived for each set of WRF simulations and compared with the conventional PMP estimates. Our results showed that PMP driven by projected future climate forcings is higher than the 1981–2010 baseline values by around 20% in the 2021–2050 near‐future period and by 44% in the 2071–2100 far‐future period. Additional sensitivity simulations of background air temperature warming also showed an enhancement of PMP, suggesting that atmospheric warming could be one important factor controlling the increase in PMP. In light of the projected increase in precipitation extremes under a warming environment, the reasonableness and role of PMP deserve more in‐depth examination.
Hydrologic predictions at rural watersheds are important but challenging due to data shortage. Long Short-Term Memory (LSTM) networks are a promising machine learning approach and have demonstrated good performance in streamflow prediction. However, owing to their data-hungry nature, most LSTM applications have focused on well-monitored catchments with abundant, high-quality observations. In this work, we investigate the predictive capabilities of LSTM in poorly monitored watersheds with short observation records. To address three main challenges of LSTM applications in data-scarce locations, i.e., overfitting, uncertainty quantification (UQ), and out-of-distribution prediction, we evaluate different regularization techniques to prevent overfitting, apply a Bayesian LSTM for UQ, and introduce a physics-informed hybrid LSTM to enhance out-of-distribution prediction. Through case studies in two diverse sets of catchments, with and without snow influence, we demonstrate that: (1) when hydrologic variability in the prediction period is similar to that in the calibration period, LSTM models can reasonably predict daily streamflow with a Nash-Sutcliffe efficiency above 0.8, even with only two years of calibration data; (2) when hydrologic variability in the prediction and calibration periods differs dramatically, LSTM alone does not predict well, but the hybrid model improves out-of-distribution prediction with acceptable generalization accuracy; (3) the L2-norm penalty and dropout mitigate overfitting, and the Bayesian and hybrid LSTMs show no overfitting; and (4) the Bayesian LSTM provides useful uncertainty information that improves the understanding and credibility of predictions. These insights have vital implications for streamflow simulation in watersheds where data quality and availability are critical issues.
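The Nash-Sutcliffe efficiency used as the skill metric above is a standard formula and can be computed in a few lines. This is a generic sketch of the metric, not code from the study:

```python
def nash_sutcliffe(obs, sim):
    """Nash-Sutcliffe efficiency (NSE): 1 minus the ratio of the
    simulation error variance to the variance of the observations.
    NSE = 1 is a perfect fit; NSE = 0 means the model is no better
    than predicting the observed mean. The study above treats
    NSE > 0.8 as reasonable daily-streamflow skill."""
    mean_obs = sum(obs) / len(obs)
    err = sum((o - s) ** 2 for o, s in zip(obs, sim))
    var = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - err / var
```

A simulation that merely outputs the observed mean scores exactly 0, which is why NSE well above zero (here, above 0.8) is required before a model is considered skillful.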
Using the 2017 Hurricane Harvey flood event as a test case, this study sets up a series of sensitivity analyses to highlight three challenges associated with large‐scale flood inundation modeling: (a) model parameterization, (b) errors in digital elevation models, and (c) effects of reservoir retention. Driven by radar‐based hourly rainfall data, a chain of hydrologic‐hydraulic models, including the VIC hydrologic model, the RAPID routing model, and the Flood2D‐GPU hydrodynamic model, is set up over Harris County, Texas, to simulate flood inundation and hazards. The results demonstrate the importance of hydrologic parameters in improving flood modeling. For a large flood event such as Hurricane Harvey, the effect of the initial water depths is insignificant. The Manning's n values may increase the peak water depth by ~1%, the flood extent by 65 km2, and the high‐danger zone by ~6%. In contrast, the bathymetry correction factors may reduce the flood extent by ~1.4% and the high‐danger zone by ~4%. Reducing the reservoir storage capacity to 1% may increase the flood extent by ~4% and the high‐danger zone by ~17%. This study may provide supporting information to guide and prioritize the development of future high‐performance‐computing large‐scale hydrodynamic flood simulations.
With likely increases in probable maximum precipitation (PMP) in a changing environment, critical infrastructure such as major reservoirs and nuclear power plants is subject to elevated risk. To understand how factors such as PMP variability, climate change, land use land cover (LULC) change, antecedent soil moisture conditions, and reservoir storage may individually or jointly affect the magnitude of the probable maximum flood (PMF), we conducted integrated hydrometeorological simulations involving both the Weather Research and Forecasting (WRF) model and the Distributed Hydrology Soil Vegetation Model (DHSVM) over the Alabama‐Coosa‐Tallapoosa (ACT) River Basin in the southeastern United States. A total of 120 relative humidity‐maximized PMP storms under historical and projected future climate conditions were used to drive DHSVM under current and projected future LULC conditions. Overall, PMP and PMF are projected to increase significantly over the ACT River Basin. The source of the meteorological forcing data sets and climate change were found to be the most sensitive factors affecting PMF, followed by antecedent soil moisture, reservoir storage, and then LULC change. The ensemble of PMP and PMF simulations, along with their sensitivity, allows us to better quantify the potential risks that hydroclimatic extreme events pose to critical infrastructure for energy‐water security.