Tsunami fragility curves are statistical models that form a key component of tsunami risk models, as they provide a probabilistic link between a tsunami intensity measure (TIM) and building damage. Existing studies apply different TIMs (e.g. depth, velocity, force) with conflicting recommendations on which to use. This paper presents a rigorous methodology, using advanced statistical methods, for selecting the optimal TIM for fragility function derivation for any given dataset. This methodology is demonstrated using a unique, detailed, disaggregated damage dataset from the 2011 Great East Japan earthquake and tsunami (67,125 buildings in total), identifying the optimum TIM for describing observed damage for the case study locations. This paper first presents the proposed methodology, which is broken into three steps: (1) exploratory analysis, (2) statistical model selection and trend analysis, and (3) comparison and selection of TIMs. The case study dataset is then presented, and the methodology is applied to it. In Step 1, exploratory analysis of the case study dataset suggests that fragility curves should be constructed for the sub-categories of engineered (RC and steel) and nonengineered (wood and masonry) construction materials. It is shown that excluding buildings of unknown construction material (common practice in existing studies) may bias the results; hence, these buildings are classified as engineered or nonengineered through multiple imputation (MI) techniques. In Step 2, a sensitivity analysis of several statistical methods for fragility curve derivation is conducted in order to select multiple statistical models with which to conduct further exploratory analysis and the TIM comparison (so that the conclusions drawn are not model-specific).
Methods of data aggregation and ordinary least squares parameter estimation (both used in existing studies) are rejected, as they are quantitatively shown to reduce fragility curve accuracy and increase uncertainty. Partially ordered probit models and generalised additive models (GAMs) are selected for the TIM comparison of Step 3. In Step 3, fragility curves are constructed for a number of TIMs, obtained from numerical simulation of the tsunami inundation of the 2011 GEJE. These fragility curves are compared using K-fold cross-validation (KFCV), and it is found that for the case study dataset a force-based measure that considers different flow regimes (indicated by Froude number) proves the most efficient TIM. It is recommended that the methodology proposed in this paper be applied when defining future fragility functions based on optimum TIMs. With the introduction of several concepts novel to the field of fragility assessment (MI, GAMs, KFCV for model optimisation and comparison), this study has significant implications for the future generation of empirical and analytical fragility functions. (Nat Hazards (2016) 84:1257-1285, DOI 10.1007/s11069-016-2485)
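The K-fold cross-validation comparison of TIMs can be sketched in outline. The following is a minimal illustration on synthetic data: it uses a simple two-parameter lognormal-CDF fragility model with a binary damage outcome in place of the paper's partially ordered probit and GAM fits, and all variable names, parameter values, and the data-generating process are invented for illustration only.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(0)

# Synthetic building-level data: two candidate TIMs and a binary
# damaged / not-damaged outcome driven mainly by the first TIM.
n = 2000
depth = rng.lognormal(mean=0.5, sigma=0.6, size=n)    # inundation depth (m)
velocity = depth * rng.lognormal(0.0, 0.4, size=n)    # noisier correlated proxy
true_p = norm.cdf((np.log(depth) - 0.8) / 0.5)
damaged = rng.random(n) < true_p

def neg_loglik(params, tim, y):
    """Negative log-likelihood of a lognormal-CDF fragility curve."""
    mu, log_sigma = params
    p = norm.cdf((np.log(tim) - mu) / np.exp(log_sigma))
    p = np.clip(p, 1e-9, 1 - 1e-9)   # guard against log(0)
    return -np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))

def kfold_score(tim, y, k=5):
    """Mean held-out log-likelihood per observation over k folds (higher is better)."""
    idx = rng.permutation(len(y))
    folds = np.array_split(idx, k)
    scores = []
    for i in range(k):
        test = folds[i]
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        res = minimize(neg_loglik, x0=[0.0, 0.0], args=(tim[train], y[train]))
        scores.append(-neg_loglik(res.x, tim[test], y[test]) / len(test))
    return np.mean(scores)

# Rank candidate TIMs by out-of-sample predictive performance.
scores = {"depth": kfold_score(depth, damaged),
          "velocity": kfold_score(velocity, damaged)}
best_tim = max(scores, key=scores.get)
```

Because the synthetic damage probabilities are generated from depth alone, the cross-validated score identifies depth as the more efficient TIM here; the same scoring logic extends to any number of candidate intensity measures.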
Tsunami damage, fragility, and vulnerability functions are statistical models that provide an estimate of expected damage or losses due to tsunami. They allow for quantification of risk, and so are a vital component of catastrophe models used for human and financial loss estimation, and for land-use and emergency planning. This paper collates and reviews the currently available tsunami fragility functions in order to highlight the current limitations, outline significant advances in this field, make recommendations for model derivation, and propose key areas for further research. Existing functions are first presented, and then key issues are identified in the current literature for each of the model components: building damage data (the response variable of the statistical model), tsunami intensity data (the explanatory variable), and the statistical model that links the two. Finally, recommendations are made regarding areas for future research and current best practices in deriving tsunami fragility functions (see Discussion, Recommendations, and Future Research). The information presented in this paper may be used to assess the quality of current estimations (both based on the quality of the data, and the quality of the models and methods adopted) and to adopt best practice when developing new fragility functions.
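As background to the model components discussed above, a tsunami fragility function is commonly parameterised as a lognormal CDF of the intensity measure, giving the probability of reaching or exceeding a damage state. A minimal sketch follows; the median and dispersion values are illustrative placeholders, not taken from any cited study.

```python
import numpy as np
from scipy.stats import norm

def fragility(tim, median, beta):
    """P(damage state reached or exceeded | TIM = tim) for a lognormal
    fragility curve: Phi((ln(tim) - ln(median)) / beta)."""
    return norm.cdf((np.log(tim) - np.log(median)) / beta)

# Illustrative parameters: median capacity 2.0 m inundation depth,
# lognormal dispersion (beta) of 0.5.
p = fragility(np.array([0.5, 2.0, 6.0]), median=2.0, beta=0.5)
```

By construction the exceedance probability is 0.5 at the median intensity and increases monotonically with the TIM, which is the basic shape that the damage data and statistical model together must constrain.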
Catastrophe models quantify potential losses from disasters, and are used in the insurance, disaster-risk management, and engineering industries. Tsunami fragility and vulnerability curves are key components of catastrophe models, providing probabilistic links between Tsunami Intensity Measures (TIMs), damage, and loss. Building damage in tsunamis can be caused by fluid forces or debris impact, two effects which have different implications for building damage levels and failure mechanisms. However, existing fragility functions are generally derived using all available damage data for a location, regardless of whether damage was caused by fluid or debris effects. It is therefore not clear whether the inclusion of debris-induced damage introduces bias in existing functions. Furthermore, when modelling areas likely to be affected by debris (e.g., adjacent to ports), it is not possible to account for this increased likelihood of debris-induced damage using existing functions. This paper proposes a methodology to quantify the effect that debris-induced damage has on fragility and vulnerability function derivation, and on subsequent loss estimates. A building-by-building damage dataset from the 2011 Great East Japan Earthquake and Tsunami is used, together with several statistical techniques recently advanced in the field of fragility analysis. First, buildings are identified which are most likely to have been affected by debris from nearby 'washed away' buildings. Fragility functions are then derived incorporating this debris indicator parameter. The debris parameter is shown to be significant for all but the lowest damage state ("minor damage"), and functions which incorporate the debris parameter are shown to have a statistically significantly better fit to the observed damage data than models which omit debris information.
Finally, for a case study scenario simulated economic loss is compared for estimates from vulnerability functions which do and do not incorporate a debris term. This comparison suggests that biases in loss estimation may be introduced if not explicitly modelling debris. The proposed methodology provides a step towards allowing catastrophe models to more reliably predict the expected damage and losses in areas with increased likelihood of debris, which is of relevance for the engineering, disaster risk-reduction and insurance sectors.
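The idea of adding a debris indicator to a fragility model and testing its significance can be sketched as follows. This is a simplified binary-probit illustration on synthetic data, with a likelihood-ratio test between nested models; the coefficients, sample size, and debris-exposure rate are invented, and this is not the paper's actual model specification.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm, chi2

rng = np.random.default_rng(1)

# Synthetic data: binary damage outcome depending on log-depth and on a
# binary debris-exposure indicator (e.g. proximity to washed-away buildings).
n = 3000
depth = rng.lognormal(0.5, 0.6, n)
debris = rng.random(n) < 0.3
eta = -1.5 + 1.2 * np.log(depth) + 0.8 * debris
y = rng.random(n) < norm.cdf(eta)

def fit_probit(X, y):
    """Fit a probit model by maximum likelihood; return the maximised log-likelihood."""
    def nll(b):
        p = np.clip(norm.cdf(X @ b), 1e-9, 1 - 1e-9)
        return -np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))
    res = minimize(nll, np.zeros(X.shape[1]))
    return -res.fun

X_base = np.column_stack([np.ones(n), np.log(depth)])        # TIM only
X_debris = np.column_stack([X_base, debris.astype(float)])   # TIM + debris term

ll_base = fit_probit(X_base, y)
ll_debris = fit_probit(X_debris, y)

# Likelihood-ratio test: does the debris term significantly improve the fit?
lr_stat = 2 * (ll_debris - ll_base)
p_value = chi2.sf(lr_stat, df=1)
```

When the debris term genuinely affects damage probability, as in this synthetic setup, the likelihood-ratio statistic is large and the nested-model test rejects the debris-free specification, mirroring the kind of significance evidence described in the abstract.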
One of the greatest causes of casualties in major earthquakes around the world is the collapse of non-engineered masonry buildings (those built without engineering input). Yet by definition non-engineered structures remain largely outside the scope of modern engineering research, meaning that the majority of those at risk often remain so. A further barrier to realising research in this field is the significant social and economic challenge of implementation in low-income communities, where non-engineered housing is prevalent. This paper introduces a retrofitting technique aimed at preventing or prolonging the collapse of adobe (mud brick) houses under strong earthquakes. This technique uses common polypropylene packaging straps to form a mesh, which is then used to encase structural walls. The aim of this paper is to give an overview of the retrofitting technique's development and implementation. The key development stages of static, dynamic and numerical testing are presented, showing that the proposed technique effectively prevents brittle masonry collapse and the loss of debris. An implementation project is then discussed, involving a training programme for rural masons in Nepal, a public shake-table demonstration and the retrofit of a real house. The implementation project proved effective at reaching rural communities but highlighted that government subsidies are required to incentivise the safeguarding of homes among low-income communities.
Currently, 8 out of the 10 most populous megacities in the world are vulnerable to severe earthquake damage, while 6 out of 10 are at risk of being severely affected by tsunami. To mitigate ground shaking and tsunami risks for coastal communities, reliable tools for assessing the effects of these hazards on coastal structures are needed. Methods for assessing the seismic performance of buildings and infrastructure are well established, allowing seismic risk assessments to be performed with some degree of confidence. In the case of tsunami, structural assessment methodologies are much less developed. This stems partly from a general lack of understanding of tsunami inundation processes and flow interaction with the built environment. This chapter brings together novel numerical and experimental work being carried out at UCL EPICentre and highlights advances made in defining tsunami loads for use in structural analysis, and in the assessment of buildings for tsunami loads. The results of this work, however, demonstrate a conflict in the design targets for seismic versus tsunami-resistant structures, which raises questions on how to provide appropriate building resilience in coastal areas subjected to both these hazards. The chapter therefore concludes by summarising studies carried out to assess building response under successive earthquakes and tsunami that are starting to address this question.