We investigate the usefulness of complex flood damage models for predicting relative damage to residential buildings in a spatial and temporal transfer context. We apply eight different flood damage models to predict relative building damage for five historic flood events in two different regions of Germany. Model complexity is measured as the number of explanatory variables, which ranges from 1 to 10 variables singled out from 28 candidate variables. Model validation is based on empirical damage data, whereby observation uncertainty is taken into consideration. The comparison of model predictive performance shows that additional explanatory variables besides water depth improve the predictive capability in a spatial and temporal transfer context, i.e., when the models are transferred to different regions and different flood events. Concerning the trade-off between predictive capability and reliability, the model structure seems more important than the number of explanatory variables. Among the models considered, Bayesian network-based predictions are more reliable in space-time transfer than those of the remaining models, and they reflect the uncertainties associated with damage predictions more completely.
Abstract. The summer flood of 2013 set a new record for large-scale floods in Germany for at least the last 60 years. In this paper we analyse the key hydro-meteorological factors using extreme value statistics as well as aggregated severity indices. For the long-term classification of the recent flood we draw comparisons to a set of past large-scale flood events in Germany, notably the high-impact summer floods of August 2002 and July 1954. Our analysis shows that the combination of extreme initial wetness at the national scale, caused by a pronounced precipitation anomaly in May 2013, and strong but not extraordinary event precipitation were the key drivers of this exceptional flood event. This provides additional insights into the importance of catchment wetness for high return period floods on a large scale. The database compiled and the methodological developments provide a consistent framework for the rapid evaluation of future floods.
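The return-period classification of flood peaks via extreme value statistics, as used in the analysis above, can be sketched as follows. This is a minimal illustration assuming a generalized extreme value (GEV) fit to an annual-maximum discharge series; the data below are synthetic, whereas a real analysis would use long gauge records.

```python
import numpy as np
from scipy.stats import genextreme

# Synthetic annual-maximum discharge series (m^3/s); illustrative only,
# a real analysis would use observed gauge records.
rng = np.random.default_rng(42)
annual_maxima = genextreme.rvs(c=-0.1, loc=1200, scale=300,
                               size=60, random_state=rng)

# Fit the GEV distribution to the annual maxima
c, loc, scale = genextreme.fit(annual_maxima)

# Discharge level of the 100-year flood under the fitted distribution
q100 = genextreme.isf(1.0 / 100.0, c, loc=loc, scale=scale)
print(f"Estimated 100-year flood discharge: {q100:.0f} m^3/s")
```

An observed event peak can then be classified by comparing it against such fitted quantiles, which is one way to express a severity index across gauges.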
Risk-based approaches have been increasingly accepted and operationalized in flood risk management during recent decades. For instance, commercial flood risk models are used by the insurance industry to assess potential losses, establish the pricing of policies and determine reinsurance needs. Despite considerable progress in the development of loss estimation tools since the 1980s, loss estimates still reflect high uncertainties and disparities that often lead to questioning their quality. This requires an assessment of the validity and robustness of loss models, as it affects prioritization and investment decisions in flood risk management as well as regulatory requirements and business decisions in the insurance industry. Hence, more effort is needed to quantify uncertainties and undertake validations. Due to a lack of detailed and reliable flood loss data, first-order validations are difficult to accomplish, so that model comparisons in terms of benchmarking are essential. Benchmarking checks whether the models are informed by existing data and knowledge, and whether the assumptions made in the models are aligned with that knowledge. When this alignment is confirmed through validation or benchmarking exercises, the user gains confidence in the models. Before such benchmarking exercises are feasible, however, a cohesive survey of existing knowledge needs to be undertaken. With that aim, this work presents a review of flood loss (or flood vulnerability) relationships collected from the public domain and some professional sources. Our survey analyses 61 sources consisting of publications or software packages, of which 47 are reviewed in detail. This exercise results in probably the most complete review of flood loss models to date, containing nearly a thousand vulnerability functions. These functions are highly heterogeneous, and only about half of the loss models are found to be accompanied by explicit validation at the time of their proposal.
This paper presents, by way of example, an approach for the quantitative comparison of disparate models by reducing them to the joint input variables of all models. Harmonizing models for benchmarking and comparison requires profound insight into the model structures, mechanisms and underlying assumptions. We discuss the possibilities and challenges of model harmonization and of applying the inventory in a benchmarking framework.
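The reduction to joint input variables described above amounts to intersecting the predictor sets of all models under comparison. A minimal sketch, with illustrative model names and variable sets that are not the actual inventory entries:

```python
# Hypothetical input-variable sets for three loss models (illustrative names)
model_inputs = {
    "ModelA": {"water_depth", "duration", "building_type"},
    "ModelB": {"water_depth", "building_type", "flow_velocity"},
    "ModelC": {"water_depth", "building_type", "contamination"},
}

# The harmonized comparison is restricted to variables every model accepts
joint = set.intersection(*model_inputs.values())
print(sorted(joint))  # → ['building_type', 'water_depth']
```

Only on this shared variable subset can the disparate models be driven with identical inputs and their outputs benchmarked against one another.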
As flood impacts are increasing in large parts of the world, understanding the primary drivers of changes in risk is essential for effective adaptation. To gain more knowledge on the basis of empirical case studies, we analyze eight paired floods, that is, consecutive flood events that occurred in the same region, with the second flood causing significantly lower damage. These success stories of risk reduction were selected across different socioeconomic and hydro-climatic contexts. The potential of societies to adapt is uncovered by describing triggered societal changes, as well as formal measures and spontaneous processes that reduced flood risk. This novel approach has the potential to build the basis for an international data collection and analysis effort to better understand and attribute changes in risk due to hydrological extremes in the framework of the IAHS Panta Rhei initiative. Across all case studies, we find that the lower damage caused by the second event was mainly due to significant reductions in vulnerability, for example, via raised risk awareness, preparedness, and improvements in organizational emergency management. Thus, vulnerability reduction plays an essential role in successful adaptation. Our work shows that there is a high potential to adapt, but there remains the challenge of stimulating measures that reduce vulnerability and risk in periods in which extreme events do not occur.
ABSTRACT. Widespread flooding in June 2013 caused damage costs of €6 to 8 billion in Germany and revived memories of the floods of August 2002, which resulted in total damage of €11.6 billion and was thus the most expensive natural hazard event in Germany to date. The 2002 event does, however, also mark a reorientation toward an integrated flood risk management system in Germany. The flood of 2013 therefore offered the opportunity to review how the measures that politics, administration, and civil society have implemented since 2002 helped to cope with the flood, and what still needs to be done to achieve effective and more integrated flood risk management. The review highlights considerable improvements on many levels, in particular (1) an increased consideration of flood hazards in spatial planning and urban development, (2) comprehensive property-level mitigation and preparedness measures, (3) more effective flood warnings and improved coordination of disaster response, and (4) a more targeted maintenance of flood defense systems. In 2013, this led to more effective flood management and to a reduction of damage. Nevertheless, important aspects remain unclear and need to be clarified. This particularly holds for balanced and coordinated strategies for reducing and overcoming the impacts of flooding in large catchments, cross-border and interdisciplinary cooperation, the role of the general public in the different phases of flood risk management, and a transparent risk transfer system. Recurring flood events reveal that flood risk management is a continuous task. Hence, risk drivers such as climate change, land-use changes, economic developments, or demographic change, and the resulting risks, must be investigated at regular intervals, and risk reduction strategies and processes must be reassessed as well as adapted and implemented in a dialogue with all stakeholders.
Reliable flood damage assessment is important for decision-making in flood risk management. Flood damage assessment is often done with damage curves based only on water depth. These depth-damage curves are usually developed from data for a specific location and specific flood conditions. Such depth-damage curves tend to be applied outside the scope of their validity. Validation studies show that in such cases depth-damage curves are not very reliable, probably due to excluded influencing variables. The expectation is that the inclusion of more variables in a damage function will improve its transferability. We compare multi-variable models based on Bayesian Networks and Random Forests developed on the basis of flood damage data sets from Germany and The Netherlands. The performance of the models is tested on a validation sub-set of both countries' data. The models are also updated with data from the other country and then tested again. The results show that the German models (BN/RF-FLEMOps) perform better in the Netherlands than the Dutch models (BN/RF-Meuse) perform in Germany. This is probably because the FLEMOps models are based on more heterogeneous data than the Meuse models. The FLEMOps models, therefore, are better able to capture damage processes from other events and in other locations. Model performance improves by updating the models with data from the location to which the model is transferred. The results show that there is high potential to develop improved damage models by training multi-variable models with heterogeneous data, for example from multiple flood events and locations.
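The multi-variable approach above can be sketched with a Random Forest regressor trained on tabular damage data. This is a minimal illustration on synthetic data; the predictor names are assumptions for demonstration, not the actual FLEMOps or Meuse model inputs.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 500
# Synthetic predictors (illustrative): water depth (m), flow velocity (m/s),
# building footprint (m^2), household precaution level (ordinal 0-3)
X = np.column_stack([
    rng.uniform(0, 3, n),
    rng.uniform(0, 2, n),
    rng.uniform(50, 300, n),
    rng.integers(0, 4, n),
])
# Synthetic relative damage in [0, 1], driven mainly by water depth
y = np.clip(0.2 * X[:, 0] + 0.05 * X[:, 1] - 0.02 * X[:, 3]
            + rng.normal(0, 0.05, n), 0, 1)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
mae = mean_absolute_error(y_te, rf.predict(X_te))
print(f"Hold-out MAE of relative damage: {mae:.3f}")
```

Updating such a model with data from the transfer region, as in the study, would amount to refitting on the combined data set before evaluating on the target region's hold-out sample.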
Abstract. During and shortly after a disaster, data about the hazard and its consequences are scarce and not readily available. Information provided by eyewitnesses via social media is a valuable information source, which should be exploited more effectively. This research proposes a methodology that leverages social media content to support rapid inundation mapping, including inundation extent and water depth in the case of floods. The novelty of this approach is the utilization of quantitative data derived from eyewitness photos extracted from social media posts and their integration with established data. Due to the rapid availability of these posts compared to traditional data sources such as remote sensing data, areas affected by a flood, for example, can be determined quickly. The challenge is to filter the large number of posts down to a manageable amount of potentially useful inundation-related information, as well as to interpret and integrate the posts into mapping procedures in a timely manner. To support rapid inundation mapping we propose a methodology and develop "PostDistiller", a tool to filter geolocated posts from social media services which include links to photos. This spatially distributed, contextualized in situ information is then explored manually. In an application case study during the June 2013 flood in central Europe we evaluate the utilization of this approach to infer spatial flood patterns and inundation depths in the city of Dresden.
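The automated filtering step can be sketched as follows. This is a simplified illustration, not PostDistiller's actual implementation: the post structure and the bounding box around Dresden are assumptions for demonstration.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Post:
    lat: Optional[float]
    lon: Optional[float]
    photo_url: Optional[str]
    text: str

# Approximate bounding box around Dresden (illustrative values):
# lat_min, lat_max, lon_min, lon_max
BBOX = (50.97, 51.15, 13.58, 13.95)

def is_candidate(post: Post) -> bool:
    """A post is potentially useful for inundation mapping if it is
    geolocated, carries a photo link, and lies inside the area of interest."""
    if post.lat is None or post.lon is None or not post.photo_url:
        return False
    lat_min, lat_max, lon_min, lon_max = BBOX
    return lat_min <= post.lat <= lat_max and lon_min <= post.lon <= lon_max

posts = [
    Post(51.05, 13.74, "http://example.com/p1.jpg", "Elbe rising fast"),
    Post(51.05, 13.74, None, "No photo attached"),
    Post(48.14, 11.58, "http://example.com/p2.jpg", "Outside the box"),
    Post(None, None, "http://example.com/p3.jpg", "Not geolocated"),
]
candidates = [p for p in posts if is_candidate(p)]
print(len(candidates))  # → 1
```

The surviving candidates would then be inspected manually to read water marks off the photos and assign inundation depths at the posts' locations.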