This study reports a new and significantly enhanced analysis of US flood hazard at 30 m spatial resolution. Specific improvements include updated hydrography data, new methods to determine channel depth, more rigorous flood frequency analysis, output downscaling to property tract level, and inclusion of the impact of local interventions in the flooding system. For the first time, we consider pluvial, fluvial, and coastal flood hazards within the same framework and provide projections for both current (rather than historic average) conditions and for future time periods centered on 2035 and 2050 under the RCP4.5 emissions pathway. Validation against high-quality local models and the entire catalog of FEMA 1% annual probability flood maps yielded Critical Success Index values in the range 0.69-0.82. Significant improvements over a previous pluvial/fluvial model version are shown for high-frequency events and coastal zones, along with minor improvements in areas where model performance was already good. The result is the first comprehensive and consistent national-scale analysis of flood hazard for the conterminous US for both current and future conditions. Even though we consider a stabilization emissions scenario and a near-future time horizon, we project clear patterns of changing flood hazard (3σ changes in 100-year inundated area of −3.8% to +16% at 1° scale) that are significant when considered as a proportion of the land area where human use is possible, or in terms of the currently protected land area where the standard of flood defense protection may become compromised by this time.

Plain Language Summary: We develop a method to estimate past, present, and future flood risk for all properties in the conterminous United States, whether affected by river, coastal, or rainfall flooding.
The analysis accounts for variability within environmental factors including changes in sea level rise, hurricane intensity and landfall locations, precipitation patterns, and river discharge. We show that even for a conservative climate change trajectory we can expect locally significant changes in the land area at risk from floods by 2050, and by this time defenses protecting 2,200 km² of land may be compromised. The complete dataset has been made available via a website (https://floodfactor.com/) created by the First Street Foundation in order to increase public awareness of the threat posed by flooding to safety and livelihoods.
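The Critical Success Index used in this validation compares binary wet/dry grids from the model against a benchmark map. A minimal sketch of the standard fit statistic, assuming simple boolean inundation arrays (the function and toy grids are illustrative, not the study's code):

```python
import numpy as np

def critical_success_index(model_wet, benchmark_wet):
    """CSI = hits / (hits + misses + false alarms) over binary wet/dry grids."""
    model_wet = np.asarray(model_wet, dtype=bool)
    benchmark_wet = np.asarray(benchmark_wet, dtype=bool)
    hits = np.sum(model_wet & benchmark_wet)           # wet in both maps
    misses = np.sum(~model_wet & benchmark_wet)        # wet in benchmark only
    false_alarms = np.sum(model_wet & ~benchmark_wet)  # wet in model only
    return hits / (hits + misses + false_alarms)

# Toy 1-D example: 3 hits, 1 miss, 0 false alarms -> CSI = 3/4
model = [1, 1, 1, 0, 0]
bench = [1, 1, 1, 1, 0]
print(critical_success_index(model, bench))  # 0.75
```

A CSI of 1 indicates perfect overlap of predicted and observed flood extent; the 0.69-0.82 range reported above is typical of well-performing large-scale flood models.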
Abstract. Advances in remote sensing have enabled hydraulic models to run at fine spatial resolutions, producing precise flood inundation predictions. However, running models at finer resolutions increases their computational expense, reducing the feasibility of running the multiple model realizations required to undertake uncertainty analysis. Furthermore, it is possible that the precision gained by running fine-scale models is smoothed out when treating models probabilistically. The aim of this paper is to determine the level of spatial complexity that is required when making probabilistic flood inundation predictions. The Imera basin, Sicily, is used as a case study to assess how changing the spatial resolution of the hydraulic model LISFLOOD-FP affects the skill of conditional probabilistic flood inundation maps given model parameter and boundary condition uncertainties. We find that model performance deteriorates at resolutions coarser than 50 m. This is predominantly caused by changes in flow pathways at coarser resolutions, which lead to non-stationarity in the optimum model parameters at different spatial resolutions. However, although it is still possible to produce probabilistic flood maps that contain a coherent outline of the flood extent at coarser resolutions, the reliability of these maps deteriorates at resolutions coarser than 100 m. Additionally, although the rejection of non-behavioural models reduces the uncertainty in probabilistic flood maps, the reliability of these maps is also reduced. Models with resolutions finer than 50 m offer little gain in performance yet are more than an order of magnitude more computationally expensive, which can become infeasible when undertaking probabilistic analysis. Furthermore, we show that using deterministic, high-resolution flood maps can lead to a spurious precision that would be misleading and not representative of the overall uncertainties that are inherent in making inundation predictions.
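Conditional probabilistic flood maps of the kind described here are commonly built by weighting ensemble realizations by a skill score and averaging their binary wet/dry outputs (a GLUE-style treatment). A minimal sketch under that assumption; the member maps and weights below are invented for illustration:

```python
import numpy as np

def probabilistic_flood_map(ensemble_wet, weights):
    """Per-cell probability of inundation from an ensemble of binary wet/dry
    maps, GLUE-style: P(wet) = sum_i w_i * wet_i / sum_i w_i."""
    ensemble_wet = np.asarray(ensemble_wet, dtype=float)  # (n_members, ny, nx)
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()                     # normalize likelihoods
    # Weighted sum over the member axis gives a probability surface
    return np.tensordot(weights, ensemble_wet, axes=1)

# Three 2x2 member maps; weights could be rescaled fit scores (e.g. CSI)
members = [[[1, 1], [0, 0]],
           [[1, 0], [0, 0]],
           [[1, 1], [1, 0]]]
w = [0.5, 0.25, 0.25]
print(probabilistic_flood_map(members, w))
```

Rejecting "non-behavioural" members corresponds to setting their weights to zero before normalizing, which narrows the probability map but, as the abstract notes, can also reduce its reliability.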
Where high-resolution topographic data are available, modelers are faced with the decision of whether it is better to spend computational resources on resolving topography at finer resolutions or on running more simulations to account for various uncertain input factors (e.g., model parameters). In this paper we apply global sensitivity analysis to explore how influential the choice of spatial resolution is when compared to uncertainties in the Manning's friction coefficient parameters, the inflow hydrograph, and those stemming from the coarsening of topographic data used to produce Digital Elevation Models (DEMs). We apply the hydraulic model LISFLOOD-FP to produce several temporally and spatially variable model outputs that represent different aspects of flood inundation processes, including flood extent, water depth, and time of inundation. We find that the most influential input factor for flood extent predictions changes during the flood event, starting with the inflow hydrograph during the rising limb, before switching to the channel friction parameter during peak flood inundation, and finally to the floodplain friction parameter during the drying phase of the flood event. Spatial resolution and the uncertainty introduced by resampling topographic data to coarser resolutions are much more important for water depth predictions, which are also sensitive to different input factors spatially and temporally. Our findings indicate that the sensitivity of LISFLOOD-FP predictions is more complex than previously thought. Consequently, the input factors that modelers should prioritize will differ depending on the model output assessed, and on the location and time at which that output is most relevant.
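Global sensitivity analyses of this kind typically estimate variance-based (Sobol) indices from Monte Carlo samples. A minimal first-order estimator in the Saltelli style, demonstrated on a toy additive model rather than LISFLOOD-FP itself (the function names and the test model are illustrative assumptions):

```python
import numpy as np

def sobol_first_order(model, n_params, n_samples=2**16, rng=None):
    """Monte Carlo estimate of first-order Sobol indices for a model
    with independent U(0,1) inputs, using the Saltelli (2010) estimator."""
    rng = np.random.default_rng(rng)
    A = rng.random((n_samples, n_params))
    B = rng.random((n_samples, n_params))
    yA, yB = model(A), model(B)
    total_var = np.var(np.concatenate([yA, yB]))
    S = np.empty(n_params)
    for i in range(n_params):
        ABi = A.copy()
        ABi[:, i] = B[:, i]                # swap column i with B's samples
        yABi = model(ABi)
        # First-order effect: fraction of output variance due to input i alone
        S[i] = np.mean(yB * (yABi - yA)) / total_var
    return S

# Additive toy model Y = 4*X1 + X2 -> analytically S1 = 16/17, S2 = 1/17
model = lambda X: 4.0 * X[:, 0] + 1.0 * X[:, 1]
print(sobol_first_order(model, 2, rng=0))
```

In a real application the "model" call would wrap a full hydraulic simulation, which is exactly why the resolution-versus-ensemble-size trade-off in the abstract matters: each index requires many model evaluations.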
Elevation data are fundamental to many applications, especially in geosciences. The latest global elevation data contain forest and building artifacts that limit their usefulness for applications that require precise terrain heights, in particular flood simulation. Here, we use machine learning to remove buildings and forests from the Copernicus Digital Elevation Model to produce, for the first time, a global map of elevation with buildings and forests removed at 1 arc second (∼30 m) grid spacing. We train our correction algorithm on a unique set of reference elevation data from 12 countries, covering a wide range of climate zones and urban extents. Hence, this approach has much wider applicability compared to previous DEMs trained on data from a single country. Our method reduces the mean absolute vertical error in built-up areas from 1.61 m to 1.12 m, and in forests from 5.15 m to 2.88 m. The new elevation map is more accurate than existing global elevation maps and will strengthen applications and models where high-quality global terrain information is required.
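The underlying correction idea (learn the surface-minus-terrain artifact from auxiliary predictors, then subtract the prediction from the raw DEM) can be sketched with a simple stand-in: synthetic data and a linear least-squares fit here replace the paper's actual machine-learning method and predictors, which this sketch does not reproduce.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 5000
canopy_height = rng.uniform(0, 30, n)     # hypothetical predictor (m)
built_fraction = rng.uniform(0, 1, n)     # hypothetical predictor
true_terrain = rng.uniform(0, 100, n)     # reference "bare earth" heights (m)
# Synthetic artifact: taller canopy and denser buildings inflate the surface
artifact = 0.6 * canopy_height + 3.0 * built_fraction + rng.normal(0, 0.3, n)
raw_dem = true_terrain + artifact         # surface heights with artifacts

# Linear stand-in for the learned correction: fit artifact from predictors
X = np.column_stack([np.ones(n), canopy_height, built_fraction])
coef, *_ = np.linalg.lstsq(X, raw_dem - true_terrain, rcond=None)
corrected = raw_dem - X @ coef            # subtract predicted artifact

mae_before = np.mean(np.abs(raw_dem - true_terrain))
mae_after = np.mean(np.abs(corrected - true_terrain))
print(f"MAE before: {mae_before:.2f} m, after: {mae_after:.2f} m")
```

The evaluation here is in-sample for brevity; the paper's headline numbers (e.g. 5.15 m to 2.88 m in forests) come from held-out reference data.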
Abstract. The understanding of the nature and behavior of ice sheets in past warm periods is important for constraining the potential impacts of future climate change. The Pliocene warm period (between 3.264 and 3.025 Ma) saw global temperatures similar to those projected for future climates; nevertheless, Pliocene ice locations and extents are still poorly constrained. We present results from efforts to simulate mid-Pliocene Greenland ice sheets by means of the international Pliocene Ice Sheet Modeling Intercomparison Project (PLISMIP). We compare the performance of existing numerical ice sheet models in simulating modern control and mid-Pliocene ice sheets with a suite of sensitivity experiments guided by available proxy records. We quantify equilibrated ice sheet volume on Greenland, identifying a potential range in sea level contributions from warm Pliocene scenarios. A series of statistical measures is applied to quantify the confidence of simulations, with a focus on inter-model and inter-scenario differences. We find that Pliocene Greenland ice sheets are less sensitive to differences in ice sheet model configurations and internal physical quantities than to changes in imposed climate forcing. We conclude that Pliocene ice was most likely limited to the highest elevations in eastern and southern Greenland, as simulated with the highest confidence and by synthesizing available regional proxies; however, the extent of those ice caps needs to be further constrained by using a range of general circulation model (GCM) climate forcings.
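Converting a simulated ice sheet volume into a sea-level contribution, as done when quantifying the range of Pliocene scenarios above, uses a standard back-of-envelope relation. The constants below are common approximate values, not figures from the paper:

```python
RHO_ICE = 917.0      # density of glacier ice, kg m^-3
RHO_OCEAN = 1027.0   # density of seawater, kg m^-3
OCEAN_AREA = 3.62e14 # global ocean surface area, m^2

def sea_level_equivalent(ice_volume_km3):
    """Sea-level rise (m) if the given grounded ice volume melted,
    ignoring second-order effects (isostasy, ocean-area change)."""
    volume_m3 = ice_volume_km3 * 1e9
    water_volume = volume_m3 * RHO_ICE / RHO_OCEAN  # convert ice to ocean water
    return water_volume / OCEAN_AREA

# Present-day Greenland Ice Sheet (~2.99e6 km^3) -> roughly 7.4 m SLE
print(f"{sea_level_equivalent(2.99e6):.2f} m")
```

A Pliocene Greenland restricted to high-elevation ice caps in the east and south therefore implies most of this roughly 7 m potential contribution was realized as sea level at the time.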
Abstract. We present a transparent and validated climate-conditioned catastrophe flood model for the UK that simulates pluvial, fluvial and coastal flood risks at 1 arcsec spatial resolution (∼ 20–25 m). Hazard layers for 10 different return periods are produced over the whole UK for historic, 2020, 2030, 2050 and 2070 conditions using the UK Climate Projections 2018 (UKCP18) climate simulations. From these, monetary losses are computed for five specific global warming levels above pre-industrial values (0.6, 1.1, 1.8, 2.5 and 3.3 °C). The analysis contains a greater level of detail and nuance compared to previous work, and represents our current best understanding of the UK's changing flood risk landscape. Validation against historical national return period flood maps yielded critical success index values of 0.65 and 0.76 for England and Wales, respectively, and maximum water levels for the Carlisle 2005 flood were replicated to a root mean square error (RMSE) of 0.41 m without calibration. This level of skill is similar to local modelling with site-specific data. Expected annual damage in 2020 was GBP 730 million, which compares favourably to the observed value of GBP 714 million reported by the Association of British Insurers. Previous UK flood loss estimates based on government data are ∼ 3× higher, and lie well outside our modelled loss distribution, which is plausibly centred on the observations. We estimate that UK 1 % annual probability flood losses were ∼ 6 % greater for the average climate conditions of 2020 (∼ 1.1 °C of warming) compared to those of 1990 (∼ 0.6 °C of warming), and this increase can be kept to around ∼ 8 % if all countries' COP26 2030 carbon emission reduction pledges and “net zero” commitments are implemented in full.
Implementing only the COP26 pledges increases UK 1 % annual probability flood losses by 23 % above average 1990 values, and potentially 37 % in a “worst case” scenario where carbon reduction targets are missed and climate sensitivity is high.
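Expected annual damage figures like those quoted above come from integrating a loss-exceedance curve over annual exceedance probability. A minimal sketch with an invented loss curve; the return periods and loss values below are illustrative only, not the study's data:

```python
import numpy as np

def expected_annual_damage(return_periods, losses):
    """Expected annual damage: trapezoidal integration of the loss curve
    over annual exceedance probability p = 1 / return period.
    Losses outside the sampled return-period range are ignored."""
    p = 1.0 / np.asarray(return_periods, dtype=float)
    L = np.asarray(losses, dtype=float)
    order = np.argsort(p)                 # integrate in increasing probability
    p, L = p[order], L[order]
    return float(np.sum(0.5 * (L[1:] + L[:-1]) * np.diff(p)))

# Hypothetical loss curve (GBP millions) at illustrative return periods
rps = [2, 5, 10, 50, 100, 1000]
losses = [0, 100, 400, 2000, 4000, 12000]
print(expected_annual_damage(rps, losses))
```

Because rare events carry small probability weight, the integral is dominated by frequent, moderate floods, which is why validating high-frequency events (as the US model abstract above emphasizes) matters so much for credible annual loss estimates.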