The 2014 Working Group on California Earthquake Probabilities (WGCEP14) presents the time-independent component of the Uniform California Earthquake Rupture Forecast, Version 3 (UCERF3), which provides authoritative estimates of the magnitude, location, and time-averaged frequency of potentially damaging earthquakes in California. The primary achievements have been to relax fault segmentation and include multifault ruptures, both limitations of UCERF2. The rates of all earthquakes are solved for simultaneously and from a broader range of data, using a system-level inversion that is both conceptually simple and extensible. The inverse problem is large and underdetermined, so a range of models is sampled using an efficient simulated annealing algorithm. The approach is more derivative than prescriptive (e.g., magnitude-frequency distributions are no longer assumed), so new analysis tools were developed for exploring solutions. Epistemic uncertainties were also accounted for using 1440 alternative logic-tree branches, necessitating access to supercomputers. The most influential uncertainties include alternative deformation models (fault slip rates), a new smoothed seismicity algorithm, alternative values for the total rate of Mw ≥ 5 events, and different scaling relationships, virtually all of which are new. As a notable first, three deformation models are based on kinematically consistent inversions of geodetic and geologic data, also providing slip-rate constraints on faults previously excluded due to lack of geologic data. The grand inversion constitutes a system-level framework for testing hypotheses and balancing the influence of different experts. For example, we demonstrate serious challenges with the Gutenberg-Richter hypothesis for individual faults. UCERF3 is still an approximation of the system, however, and the range of models is limited (e.g., constrained to stay close to UCERF2). Nevertheless, UCERF3 removes the apparent UCERF2 overprediction of M 6.5-7 earthquake rates and also includes types of multifault ruptures seen in nature. Although UCERF3 fits the data better than UCERF2 overall, there may be areas that warrant further site-specific investigation. Supporting products may be of general interest, and we list key assumptions and avenues for future model improvements.
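To make the inversion approach concrete, the sketch below solves a toy underdetermined, non-negative rate problem with simulated annealing. The design matrix, data, move size, and cooling schedule are all illustrative assumptions, not the UCERF3 "grand inversion" implementation, which includes many additional constraints.

```python
import numpy as np

# Toy underdetermined rate inversion A @ x ~ d solved by simulated annealing.
# A, d, the move size, and the cooling schedule are illustrative assumptions.
rng = np.random.default_rng(0)
n_ruptures, n_data = 50, 20            # more unknowns than data: underdetermined
A = rng.random((n_data, n_ruptures))   # each datum is a weighted sum of rupture rates
d = A @ rng.exponential(0.1, n_ruptures)  # synthetic "observed" data

def energy(x):
    """Data misfit; a real inversion adds regularization and further constraints."""
    r = A @ x - d
    return r @ r

x = np.full(n_ruptures, 0.1)           # initial non-negative rates
e = energy(x)
for step in range(100_000):
    T = 1.0 / (1 + step)               # simple 1/t cooling schedule (assumed)
    i = rng.integers(n_ruptures)
    x_new = x.copy()
    x_new[i] = max(0.0, x[i] + rng.normal(scale=0.05))  # perturb one rate, keep >= 0
    e_new = energy(x_new)
    # Always accept improvements; accept uphill moves with Boltzmann probability.
    if e_new < e or rng.random() < np.exp((e - e_new) / T):
        x, e = x_new, e_new

print(f"final misfit: {e:.3e}")
```

Because the problem is underdetermined, different random seeds yield different acceptable models, which is the same property the report exploits to sample a range of solutions.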
Summary Many problems in physical science involve the estimation of a number of unknown parameters which bear a linear or quasi‐linear relationship to a set of experimental data. The data may be contaminated by random errors, insufficient to determine the unknowns, redundant, or all of the above. This paper presents a method of optimizing the conclusions from such a data set. The problem is formulated as an ill‐posed matrix equation, and general criteria are established for constructing an ‘inverse’ matrix. The ‘solution’ to the problem is defined in terms of a set of generalized eigenvectors of the matrix, and may be chosen to optimize the resolution provided by the data, the expected error in the solution, the fit to the data, the proximity of the solution to an arbitrary function, or any combination of the above. The classical ‘least‐squares’ solution is discussed as a special case.
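As a rough illustration of the eigenvector-expansion idea, the sketch below builds a truncated generalized inverse from the singular value decomposition of a nearly rank-deficient system. The matrix, noise level, and truncation cutoff are assumed for demonstration; the paper's trade-off between resolution and expected error corresponds to the choice of cutoff.

```python
import numpy as np

# Truncated generalized inverse of a nearly rank-deficient system G @ m = d.
# G, the noise level, and the truncation cutoff are assumed for illustration.
rng = np.random.default_rng(1)
G = rng.random((30, 30))
G[:, -5:] *= 1e-8                       # make several directions poorly constrained
m_true = rng.normal(size=30)
d = G @ m_true + rng.normal(scale=1e-3, size=30)  # data contaminated by random errors

U, s, Vt = np.linalg.svd(G, full_matrices=False)
k = int(np.sum(s > 1e-6 * s[0]))        # keep only well-resolved singular directions
m_est = Vt[:k].T @ ((U[:, :k].T @ d) / s[:k])     # truncated generalized inverse

# Resolution matrix R = V_k V_k^T: identity means perfect resolution; keeping
# more directions sharpens resolution but amplifies the expected solution error.
R = Vt[:k].T @ Vt[:k]
print("directions kept:", k, " data misfit:", np.linalg.norm(G @ m_est - d))
```

Retaining all singular values recovers the classical least-squares solution as the special case the summary mentions.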
We have initially developed a time-independent forecast for southern California by smoothing the locations of magnitude 2 and larger earthquakes. We show that using small m ≥ 2 earthquakes gives a reasonably good prediction of m ≥ 5 earthquakes. Our forecast outperforms other time-independent models (Kagan and Jackson, 1994; Frankel et al., 1997), mostly because it has higher spatial resolution. We have then developed a method to estimate daily earthquake probabilities in southern California by using the Epidemic Type Earthquake Sequence model (Kagan and Knopoff, 1987; Ogata, 1988; Kagan and Jackson, 2000). The forecasted seismicity rate is the sum of a constant background seismicity, proportional to our time-independent model, and of the aftershocks of all past earthquakes. Each earthquake triggers aftershocks with a rate that increases exponentially with its magnitude and decreases with time following Omori's law. We use an isotropic kernel to model the spatial distribution of aftershocks for small (m ≤ 5.5) mainshocks. For larger events, we smooth the density of early aftershocks to model the density of future aftershocks. The model also assumes that all earthquake magnitudes follow the Gutenberg-Richter law with a uniform b-value. We use a maximum likelihood method to estimate the model parameters and test the short-term and time-independent forecasts. A retrospective test using a daily update of the forecasts between 1 January 1985 and 10 March 2004 shows that the short-term model increases the average probability of an earthquake occurrence by a factor of 11.5 compared with the time-independent forecast.
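The sketch below illustrates the kind of rate calculation such a model implies: a constant background plus Omori-decaying aftershock terms whose productivity grows exponentially with mainshock magnitude. The functional form is the standard ETAS-style rate; all parameter values are illustrative assumptions, not the calibrated values from the paper, and the spatial kernel is omitted.

```python
# Minimal sketch of an ETAS-style short-term earthquake rate: background plus
# Omori-decaying aftershock contributions. Parameter values are illustrative.
mu = 0.05            # background rate (events/day), assumed
K, alpha = 0.02, 1.0 # productivity constant and magnitude scaling, assumed
c, p = 0.01, 1.2     # Omori-law parameters (days, exponent), assumed
m_min = 2.0          # reference (minimum) magnitude

def etas_rate(t, catalog):
    """Total rate at time t (days) given (time, magnitude) pairs of past events."""
    rate = mu
    for t_i, m_i in catalog:
        if t_i < t:
            # Each past event adds K * 10^(alpha*(m_i - m_min)) / (t - t_i + c)^p,
            # i.e., exponential growth with magnitude, power-law decay with time.
            rate += K * 10 ** (alpha * (m_i - m_min)) / (t - t_i + c) ** p
    return rate

catalog = [(0.0, 6.5), (1.0, 4.2), (3.5, 5.0)]   # hypothetical past events
print(f"rate one day after the M6.5: {etas_rate(1.0, catalog):.1f} events/day")
```

In the full model the parameters are fit by maximum likelihood, and each event's contribution is additionally spread in space by the isotropic or smoothed-aftershock kernels described above.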
We present long‐term and short‐term forecasts for magnitude 5.8 and larger earthquakes. We discuss a method for optimizing both procedures and testing their forecasting effectiveness using the likelihood function. Our forecasts are expressed as the rate density (that is, the probability per unit area and time) anywhere on the Earth. Our forecasts are for scientific testing only; they are not to be construed as earthquake predictions or warnings, and they carry no official endorsement. For our long‐term forecast we assume that the rate density is proportional to a smoothed version of past seismicity (using the Harvard CMT catalogue). This is in some ways antithetical to the seismic gap model, which assumes that recent earthquakes deter future ones. The estimated rate density depends linearly on the magnitude of past earthquakes and approximately on a negative power of the epicentral distance out to a few hundred kilometres. We assume no explicit time dependence, although the estimated rate density will vary slightly from day to day as earthquakes enter the catalogue. The forecast applies to the ensemble of earthquakes during the test period. It is not meant to predict any single earthquake, and no single earthquake or lack of one is adequate to evaluate such a hypothesis. We assume that 1 per cent of all earthquakes are surprises, assumed uniformly likely to occur in those areas with no earthquakes since 1977. We have made specific forecasts for the calendar year 1999 for the Northwest Pacific and Southwest Pacific regions, and we plan to expand the forecast to the whole Earth. We test the forecast against the earthquake catalogue using a likelihood test and present the results. Our short‐term forecast, updated daily, makes explicit use of statistical models describing earthquake clustering. Like the long‐term forecast, the short‐term version is expressed as a rate density in location, magnitude and time. However, the short‐term forecasts will change significantly from day to day in response to recent earthquakes. The forecast applies to main shocks, aftershocks, aftershocks of aftershocks, and main shocks preceded by foreshocks. However, there is no need to label each event, and the method is completely automatic. According to the model, nearly 10 per cent of moderately sized earthquakes will be followed by larger ones within a few weeks.
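As a rough illustration of the long-term smoothing, the sketch below sums a kernel over past events that is linear in magnitude and falls off as a negative power of epicentral distance, as the abstract describes. The kernel constants, the flat-earth distance approximation, and the example events are assumptions for demonstration only.

```python
import numpy as np

# Sketch of a smoothed-seismicity long-term rate density: each past event
# contributes a kernel linear in magnitude with power-law distance decay.
# Constants and functional details are assumed, not taken from the paper.
def rate_density(lon, lat, catalog, r0=10.0, power=1.5):
    """Relative rate density at (lon, lat) from past (lon, lat, mag) events."""
    dens = 0.0
    for ev_lon, ev_lat, mag in catalog:
        # Flat-earth approximation for epicentral distance in km (assumed adequate).
        dx = (lon - ev_lon) * 111.0 * np.cos(np.radians(lat))
        dy = (lat - ev_lat) * 111.0
        r = np.hypot(dx, dy)
        dens += mag / (r + r0) ** power   # linear in magnitude, power-law in distance
    return dens

catalog = [(142.0, 38.0, 7.1), (143.5, 39.0, 6.0)]  # hypothetical catalogue events
print(f"relative density: {rate_density(142.5, 38.5, catalog):.4f}")
```

A small uniform term, analogous to the 1 per cent "surprise" allowance, would be added in quiescent areas before normalizing the map to a probability per unit area and time.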