The Seismic Hazard Harmonization in Europe (SHARE) project, which began in June 2009, aims to establish new standards for probabilistic seismic hazard assessment in the Euro-Mediterranean region. In this context, a logic tree for ground-motion prediction in Europe has been constructed. Ground-motion prediction equations (GMPEs) and weights have been determined so that the logic tree captures the epistemic uncertainty in ground-motion prediction for six different tectonic regimes in Europe. Here we present the strategy adopted to build this logic tree. Its distinctive feature is the combination of two complementary and independent approaches: expert judgment and data testing. A panel of six experts was asked to weight pre-selected GMPEs, while the ability of these GMPEs to predict available data was evaluated with the method of Scherbaum et al. (Bull Seismol Soc Am 99:3234-3247, 2009). The results of both approaches were then combined to select the smallest set of GMPEs that captures the uncertainty in ground-motion prediction in Europe. For stable continental regions, two models, both from eastern North America, were selected for shields, and three GMPEs from active shallow crustal regions were added for continental crust. For subduction zones, four models, all non-European, were chosen. Finally, for active shallow crustal regions, we selected four models, each from a different host region, but only two of them were kept for long periods. In most cases, agreement was also reached on the weights. Where opinions diverged, a sensitivity analysis of the weights on the seismic hazard was conducted, showing that once the GMPEs have been selected, the associated set of weights has a smaller influence on the hazard.
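The data-testing half of this strategy can be made concrete with a short sketch. Scherbaum et al. (2009) score each GMPE by the average sample log-likelihood (LLH) of observed ground motions under the model's predictive distribution, and the 2^(-LLH) values can be normalized into data-driven weights. The Python sketch below, with synthetic residuals, illustrates that scoring; it is not the SHARE implementation.

    import numpy as np
    from scipy.stats import norm

    def llh_score(log_obs, log_pred_mean, log_pred_sigma):
        """Average negative log2-likelihood (LLH) of Scherbaum et al. (2009).

        log_obs        : observed log ground motions (e.g., ln PGA)
        log_pred_mean  : GMPE median predictions in log units
        log_pred_sigma : GMPE total standard deviations in log units
        """
        # Density of each observation under the GMPE's (log-)normal
        # predictive distribution; smaller LLH means a better fit.
        pdf = norm.pdf(log_obs, loc=log_pred_mean, scale=log_pred_sigma)
        return -np.mean(np.log2(pdf))

    def data_driven_weights(llh_values):
        """Turn LLH scores into normalized weights: w_i proportional to 2**(-LLH_i)."""
        raw = 2.0 ** (-np.asarray(llh_values))
        return raw / raw.sum()

    # Toy example: three hypothetical GMPEs scored against the same 500 records.
    rng = np.random.default_rng(0)
    obs = rng.normal(0.0, 0.7, size=500)           # synthetic ln(PGA) "observations"
    models = [(0.0, 0.7), (0.2, 0.7), (0.0, 1.2)]  # (median bias, sigma) per GMPE
    llh = [llh_score(obs, m, s) for m, s in models]
    print("LLH:", np.round(llh, 3))
    print("weights:", np.round(data_driven_weights(llh), 3))

The unbiased, correctly scaled model receives the lowest LLH and therefore the largest data-driven weight, which is the behavior the selection procedure relies on.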
The 2016–2017 central Italy seismic sequence occurred on an 80-km-long normal-fault system. The sequence initiated with the Mw 6.0 Amatrice event on 24 August 2016, followed by the Mw 5.9 Visso event on 26 October and the Mw 6.5 Norcia event on 30 October. We analyze continuous data from a dense network of 139 seismic stations to build a high-precision catalog of ∼900,000 earthquakes spanning a 1-year period, based on arrival times derived using a deep-neural-network-based picker. Our catalog contains an order of magnitude more events than the catalog routinely produced by the local earthquake monitoring agency. Aftershock activity reveals the geometry of complex fault structures activated during the earthquake sequence and provides additional insights into the potential factors controlling the development of the largest events. Activated fault structures in the northern and southern regions appear complementary to faults activated during the 1997 Colfiorito and 2009 L’Aquila sequences, suggesting that earthquake triggering primarily occurs on critically stressed faults. Delineated major fault zones are relatively thick compared to estimated earthquake location uncertainties, and a large number of kilometer-long faults and diffuse seismicity were activated during the sequence. These properties might be related to fault age, roughness, and the complexity of inherited structures. The rich details resolvable in this catalog will facilitate continued investigation of this energetic and well-recorded earthquake sequence.
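A workflow of this kind can be approximated with open-source tools. The following sketch uses ObsPy and the SeisBench model zoo to extract deep-learning phase picks for one station near the sequence; the pretrained weights, station choice, and time window are illustrative assumptions, and this is not the picker used in the study (API details may also vary across SeisBench versions).

    # Sketch of deep-neural-network phase picking with ObsPy + SeisBench.
    # Illustrative only: not the study's own picker or configuration.
    from obspy import UTCDateTime
    from obspy.clients.fdsn import Client
    import seisbench.models as sbm

    client = Client("INGV")  # Italian FDSN node serving central Italy data
    t0 = UTCDateTime("2016-08-24T01:00:00")  # shortly before the Amatrice event
    stream = client.get_waveforms(network="IV", station="NRCA", location="*",
                                  channel="HH?", starttime=t0, endtime=t0 + 3600)

    # Pretrained PhaseNet-style picker; the "instance" weights were trained
    # on an Italian data set (an assumption worth verifying for your use case).
    picker = sbm.PhaseNet.from_pretrained("instance")
    output = picker.classify(stream)  # returns P and S picks with probabilities

    for pick in output.picks:
        print(pick.trace_id, pick.phase, pick.peak_time, round(pick.peak_value, 2))

Run over a full network and a full year of continuous data, picks like these feed the association and relocation steps that produce a catalog of the size described above.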
The 2016–2017 Central Italy seismic sequence ruptured overlapping normal faults of the Apennines mountain chain, with nine Mw > 5 earthquakes within a few months. Here we investigate the structure of the fault system using an extensive aftershock data set from combined permanent and temporary seismic networks, together with 3-D Vp and Vp/Vs velocity models. We show that the mainshocks nucleated on gently west-dipping planes that we interpret as inverted steep ramps inherited from the late Pliocene compression. The two largest shocks, the 24 August Mw 6.0 Amatrice and the 30 October Mw 6.5 Norcia earthquakes, occurred on distinct faults reactivated by high pore pressure in the footwall, as indicated by positive Vp/Vs anomalies. The lateral extent of the overpressurized volume includes the fault patch of the Norcia earthquake. The irregular geometry of the normal faults, together with the reactivated ramps, explains the kinematic complexity observed during the coseismic ruptures and the spatial distribution of aftershocks.
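The Vp/Vs structure invoked here is ultimately derived from arrival-time data. As a minimal illustration of where such estimates come from (a Wadati diagram for a single event, not the 3-D tomography used in the study), the slope of the S-P delay versus P arrival time yields the average Vp/Vs along the ray paths:

    import numpy as np

    # Wadati diagram: for a common origin time t0,
    #   ts - tp = (Vp/Vs - 1) * (tp - t0),
    # so regressing (ts - tp) against tp gives a slope of Vp/Vs - 1.
    # Synthetic picks below stand in for real P and S arrival times.
    rng = np.random.default_rng(1)
    t0 = 10.0                                   # origin time (s), unknown in practice
    tp = t0 + rng.uniform(1.0, 8.0, 30)         # P arrivals at 30 stations
    ts = t0 + (tp - t0) * 1.85 + rng.normal(0, 0.03, 30)  # S arrivals, true Vp/Vs = 1.85

    slope, intercept = np.polyfit(tp, ts - tp, 1)
    print("estimated Vp/Vs:", round(slope + 1.0, 3))  # recovers ~1.85

Elevated Vp/Vs of this kind, mapped in 3-D, is the signature interpreted in the abstract as evidence for high pore pressure in the footwall.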
The 2016–2017 Central Apennines earthquake sequence is a recent example of how damage from subsequent aftershocks can exceed that caused by the initial mainshock. Recent studies reveal that physics-based aftershock forecasts can perform comparably to their statistical counterparts, but their performance remains a controversial subject. Here we employ physics-based models that combine elasto-static stress transfer with rate-and-state friction laws, together with short-term statistical Epidemic Type Aftershock Sequence (ETAS) models, to describe the spatiotemporal evolution of the earthquake cascade. We then track the absolute and relative model performance using log-likelihood statistics over a 1-year horizon after the 24 August 2016 Mw = 6.0 Amatrice earthquake. We perform a series of pseudoprospective experiments by producing seven classes of Coulomb rate-and-state (CRS) forecasts with gradually increasing input-data quality and model complexity. Our goal is to investigate the influence of data quality on the predictive power of physics-based models and to assess the comparative performance of the forecasts in critical time windows, such as the period following the 26 October Visso earthquakes leading to the 30 October Mw = 6.5 Norcia mainshock. We find that (1) the spatiotemporal performance of the basic CRS models is poor and progressively improves as more refined data are used, and (2) CRS forecasts are about as informative as ETAS when secondary triggering effects from M3+ earthquakes are included together with spatially variable slip models, spatially heterogeneous receiver faults, and optimized rate-and-state parameters. After the Visso earthquakes, the more elaborate CRS model outperforms ETAS, highlighting the importance of static stress transfer for operational earthquake forecasting.
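The rate-and-state ingredient of these CRS models is commonly implemented through Dieterich's (1994) closed-form response of seismicity rate to a Coulomb stress step. A minimal sketch follows; the parameter values (background rate, A-sigma, tectonic stressing rate) are illustrative assumptions, not values calibrated in the study.

    import numpy as np

    def dieterich_rate(t, dcff, r_bg=1.0, a_sigma=0.04, stressing_rate=1e-4):
        """Seismicity rate after a Coulomb stress step (Dieterich, 1994).

        t              : time since the stress step (days)
        dcff           : Coulomb stress change in MPa (positive promotes failure)
        r_bg           : background event rate (events/day)       -- assumed
        a_sigma        : A*sigma constitutive parameter (MPa)     -- assumed
        stressing_rate : tectonic stressing rate (MPa/day)        -- assumed
        """
        t_a = a_sigma / stressing_rate           # aftershock relaxation time
        gamma = (np.exp(-dcff / a_sigma) - 1.0) * np.exp(-t / t_a) + 1.0
        return r_bg / gamma

    # A +0.5 MPa stress step boosts the rate strongly at first,
    # then lets it decay back toward background over roughly t_a.
    for t in (0.1, 1.0, 10.0, 100.0):
        print(f"t = {t:6.1f} d  rate = {dieterich_rate(t, 0.5):9.2f} events/day")

A CRS forecast evaluates this response on a spatial grid, with dcff computed from a slip model and receiver-fault geometry, which is where the slip-model and receiver-fault refinements discussed above enter.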
A new generation of earthquake catalogs developed through supervised machine learning illuminates earthquake activity with unprecedented detail. Application of unsupervised machine learning to analyze the more complete expression of seismicity in these catalogs may be the fastest route to improving earthquake forecasting.

The past 5 years have seen a rapidly accelerating effort in applying machine learning to seismological problems. The serial components of earthquake monitoring workflows include detection, arrival-time measurement, phase association, location, and characterization. All of these tasks have seen rapid progress due to effective implementation of machine-learning approaches. They have proven opportune targets for machine learning in seismology mainly because of the large, labeled data sets, often publicly available, that were constructed through decades of dedicated work by skilled analysts; such data sets are the essential ingredient for building complex supervised models. Progress has so far been realized in research mode, analyzing the details of seismicity well after the earthquakes being studied have occurred, and machine-learning techniques are poised to be implemented in operational mode for real-time monitoring. We will soon have a next generation of earthquake catalogs that contain much more information. How much more? These more complete catalogs typically feature at least a factor of ten more earthquakes (Fig. 1) and provide a higher-resolution picture of seismically active faults.

This next generation of earthquake catalogs will not be the single, static objects seismologists are accustomed to working with. For example, less than 2 years after the 2019 Ridgecrest, California earthquake sequence, there already exist four next-generation catalogs, each of which was developed with a different enhanced detection technique. Now, and in the future, this will be the norm, and earthquake catalogs will be updated and improved, potentially dramatically, with time. Second-generation deep-learning models [1], specifically designed around earthquake signal characteristics and mimicking the manual processing of analysts, can deliver performance gains beyond those offered by earlier models that adapted neural-network architectures from other fields. Those interested in using earthquake catalogs for forecasting can anticipate a shifting landscape with continuing improvements.

While these improvements are impressive, the value of the extra information they provide is less clear. What will we learn about earthquake behavior from these deeper catalogs, and how might it improve the prospects for the stubbornly difficult problem of earthquake forecasting? Short-term deterministic earthquake prediction remains elusive and is perhaps impossible; however, probabilistic earthquake forecasting is another matter. It remains the subject of focused and sustained attention, and it informs earthquake hazard characterization [2] and thus both policy and earthquake risk reduction. A key assumption is that what we learn from...
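The serial workflow named above (detection, arrival-time measurement, phase association, location, characterization) maps naturally onto a staged pipeline. The skeleton below illustrates that architecture only; the stage functions and data containers are hypothetical and would be backed by concrete detectors, associators, and locators in a real monitoring system.

    from dataclasses import dataclass, field
    from typing import Optional

    # Hypothetical data containers for a staged earthquake-monitoring pipeline.
    @dataclass
    class Pick:
        station: str
        phase: str      # "P" or "S"
        time: float     # epoch seconds

    @dataclass
    class Event:
        origin_time: float
        latitude: float
        longitude: float
        depth_km: float
        magnitude: Optional[float] = None
        picks: list = field(default_factory=list)

    def detect_and_pick(waveforms):
        """Stages 1-2: run a detector/picker (e.g., a deep-learning model)."""
        raise NotImplementedError  # plug in a picker such as PhaseNet here

    def associate(picks):
        """Stage 3: group picks consistent with a common seismic source."""
        raise NotImplementedError

    def locate(pick_group):
        """Stage 4: invert arrival times for origin time and hypocenter."""
        raise NotImplementedError

    def characterize(event):
        """Stage 5: estimate magnitude (and possibly the source mechanism)."""
        raise NotImplementedError

    def run_pipeline(waveforms):
        picks = detect_and_pick(waveforms)
        return [characterize(locate(group)) for group in associate(picks)]

Because the stages are independent, any one of them can be swapped for an improved model without disturbing the rest, which is precisely why catalogs can now be regenerated and improved repeatedly, as described above.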
Earthquakes occur as the result of long-term strain accumulation on active faults and complex transient triggering mechanisms. Although laboratory experiments show accelerating deformation patterns before failure conditions are met, imaging similar preparatory phases in nature remains difficult because it requires dense monitoring in advance. The 2016 Amatrice-Visso-Norcia (central Italy) earthquake cascade, captured by an unprecedented seismic network, provided a unique testing ground for imaging the preparatory phase of a large event. The crustal volume of the incipient Norcia fault was densely illuminated by seismic rays from more than 13,000 earthquakes that occurred within the three months before the mainshock nucleation. We performed seismic tomography in distinct time windows, which revealed precursory changes in elastic wave speed signaling (1) the final locked state of the fault and (2) rapid fault-stiffness alterations near the hypocenter just a few weeks before the event. These results are the first instance in which short-lived, hard-to-catch crustal properties shed light on evolving earthquake cascades.
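The time-lapse logic, holding the measurement fixed while sliding the data window toward the mainshock, can be illustrated without full 3-D tomography. The toy sketch below tracks a windowed Wadati-style Vp/Vs estimate through a synthetic 3-month catalog with a built-in change in the final weeks; all numbers are fabricated for illustration.

    import numpy as np

    # Toy time-lapse monitoring: split a synthetic pick catalog into windows
    # and track the Vp/Vs estimate per window. Real time-lapse tomography
    # inverts travel times in 3-D; this only illustrates the windowing idea.
    rng = np.random.default_rng(2)
    n = 2000
    days = np.sort(rng.uniform(0, 90, n))          # 3 months of events
    true_ratio = 1.80 + 0.08 * (days > 75)         # step change in the final weeks
    tp = rng.uniform(1.0, 8.0, n)                  # P travel times (s)
    ts_minus_tp = (true_ratio - 1.0) * tp + rng.normal(0, 0.05, n)

    for lo in range(0, 90, 15):                    # 15-day windows
        sel = (days >= lo) & (days < lo + 15)
        slope = np.polyfit(tp[sel], ts_minus_tp[sel], 1)[0]
        print(f"days {lo:2d}-{lo + 15:2d}: Vp/Vs = {slope + 1:.3f}")

The final window stands out from the stable earlier estimates, mimicking in one dimension the kind of late-stage property change the tomography resolves in the crustal volume around the hypocenter.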
Operational earthquake forecasting protocols commonly use statistical models for their recognized ease of implementation and robustness in describing the short-term spatiotemporal patterns of triggered seismicity. However, recent advances in physics-based aftershock forecasting reveal performance comparable to the standard statistical counterparts, with significantly improved predictive skill when fault and stress-field heterogeneities are considered. Here, we perform a pseudoprospective forecasting experiment during the first month of the 2019 Ridgecrest (California) earthquake sequence. We develop seven Coulomb rate-and-state models that couple static stress-change estimates with continuum mechanics as expressed by the rate-and-state friction laws. Our model parameterization supports gradually increasing complexity: we start from a preliminary implementation with simplified slip distributions and spatially homogeneous receiver faults and progress to an enhanced one featuring optimized fault constitutive parameters, finite-fault slip models, secondary triggering effects, and spatially heterogeneous receiver planes informed by pre-existing ruptures. The data-rich environment of southern California allows us to test whether incorporating data collected in near-real time during an unfolding earthquake sequence boosts predictive power. We assess the absolute and relative performance of the forecasts by means of statistical tests used within the Collaboratory for the Study of Earthquake Predictability and compare their skill against a benchmark epidemic-type aftershock sequence (ETAS) model for the short term (24 hr after each of the two Ridgecrest mainshocks) and the intermediate term (one month). Stress-based forecasts predict heightened rates along the whole near-fault region and increased expected seismicity rates on the central Garlock fault. Our comparative model evaluation not only supports the view that faulting heterogeneities coupled with secondary triggering effects are the most critical components behind successful physics-based forecasts, but also underlines the importance of model updates incorporating aftershock data available in near-real time, which achieve better performance than the standard ETAS benchmark. We explore the physical basis behind our results by investigating the localized shutdown of pre-existing normal faults in the Ridgecrest near-source area.
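The ETAS benchmark referenced here has a compact core: a conditional intensity in which every past event adds an Omori-decaying, magnitude-scaled contribution to the current rate. A minimal temporal sketch follows; the parameter values are illustrative assumptions, not the calibrated values from this study.

    import numpy as np

    def etas_rate(t, event_times, event_mags, mu=0.2, K=0.05,
                  alpha=1.8, c=0.01, p=1.1, m_c=3.0):
        """Temporal ETAS conditional intensity (events/day) at time t.

        lambda(t) = mu + sum over past events i of
                    K * exp(alpha * (M_i - m_c)) * (t - t_i + c) ** (-p)
        Parameter values are illustrative, not calibrated to Ridgecrest.
        """
        past = event_times < t
        dt = t - event_times[past]
        kernel = K * np.exp(alpha * (event_mags[past] - m_c)) * (dt + c) ** (-p)
        return mu + kernel.sum()

    # Two "mainshocks" echoing the Ridgecrest foreshock-mainshock pair,
    # which occurred about 1.4 days apart.
    times = np.array([0.0, 1.4])    # days (Mw 6.4 foreshock, Mw 7.1 mainshock)
    mags = np.array([6.4, 7.1])
    for t in (0.5, 1.5, 5.0, 30.0):
        print(f"t = {t:5.1f} d  lambda = {etas_rate(t, times, mags):10.1f} /day")

Unlike the CRS models, this intensity is blind to fault geometry and stress transfer, which is exactly the contrast the pseudoprospective experiment is designed to measure.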