Abstract. This paper analyses how the current loss modelling framework, developed in the 1990s in response to the market crisis triggered by Hurricane Andrew, falls short in dealing with today's complexity. Indeed, beyond reflecting and supporting the current understanding of risks, data and models are used to assess situations that have not yet been experienced. To address this question, we considered the
(re)insurance market's current body of knowledge on natural hazard loss
modelling, the fruit of over 30 years of research conducted by (re)insurers,
brokers, modelling firms, and other private companies, as well as by academics in the atmospheric sciences, geosciences, civil engineering, and data sciences, among others. Our study shows that to successfully manage the
complexity of the interactions between natural elements and the customer
ecosystem, it is essential that both private companies in the insurance
sector and academia continue working together to co-develop and share common data collection and modelling practices. This paper (i) demonstrates the need for an in-depth review of the existing loss modelling framework and (ii) makes it
clear that only a transdisciplinary effort will be up to the challenge of
building global loss models. Both steps are essential to capture the interactions and increasing complexity of the three risk drivers (exposure, hazard, and vulnerability), thus enabling insurers to anticipate and be equipped to face the far-ranging impacts of climate change and other natural events.