A Prototype Operational Earthquake Loss Model for California Based on UCERF3-ETAS – A First Look at Valuation

2017 ◽  
Vol 33 (4) ◽  
pp. 1279-1299 ◽  
Author(s):  
Edward Field ◽  
Keith Porter ◽  
Kevin Milner

We present a prototype operational loss model based on UCERF3-ETAS, which is the third Uniform California Earthquake Rupture Forecast with an Epidemic Type Aftershock Sequence (ETAS) component. As such, UCERF3-ETAS represents the first earthquake forecast to relax fault segmentation assumptions and to include multi-fault ruptures, elastic rebound, and spatiotemporal clustering, all of which seem important for generating realistic and useful aftershock statistics. UCERF3-ETAS is nevertheless an approximation of the system, so usefulness will vary and potential value needs to be ascertained in the context of each application. We examine this question with respect to statewide loss estimates, exemplifying how risk can be elevated by orders of magnitude due to triggered events following various scenario earthquakes. Two important considerations are the probability gains relative to loss likelihoods in the absence of main shocks, and the rapid decay of those gains with time. Significant uncertainties and model limitations remain, so we hope this paper will inspire similar analyses with respect to other risk metrics to help ascertain whether operationalization of UCERF3-ETAS would be worth the considerable resources required.
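
To make the notion of probability gain concrete, here is a minimal sketch (our illustration, not the authors' model) that compares the Poisson chance of at least one damaging event in a one-day window under an Omori-type elevated rate against the background rate alone. The background rate and the Omori-Utsu parameters are assumed placeholder values.

```python
import math

# A minimal sketch of "probability gain": the ratio of the chance of at least
# one damaging event in a short window after a main shock to the same chance
# under the long-term background rate alone. All parameter values below are
# assumed for illustration; they are not taken from the paper.

background_rate = 0.01          # damaging events per day (assumed)
k, c, p = 5.0, 0.05, 1.1        # Omori-Utsu parameters (assumed, p != 1)

def omori_integral(t1, t2):
    """Expected number of triggered events between t1 and t2 days (analytic)."""
    return k / (1.0 - p) * ((t2 + c) ** (1.0 - p) - (t1 + c) ** (1.0 - p))

def prob_at_least_one(expected_count):
    """Poisson probability of one or more events given an expected count."""
    return 1.0 - math.exp(-expected_count)

window = 1.0                    # one-day forecast window
for start in (0.0, 1.0, 7.0, 30.0):
    n_triggered = omori_integral(start, start + window)
    p_clustered = prob_at_least_one(background_rate * window + n_triggered)
    p_background = prob_at_least_one(background_rate * window)
    print(f"day {start:4.0f}: probability gain = {p_clustered / p_background:7.1f}")
```

With these placeholder numbers the gain is roughly a hundredfold in the first days and drops by an order of magnitude within a month, mirroring the rapid decay described above.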

2011 ◽  
Vol 18 (4) ◽  
pp. 477-487 ◽  
Author(s):  
A. Jiménez ◽  
F. Luzón

Abstract. On 18 September 2004, an earthquake of magnitude mbLg = 4.6 was recorded near the Itoiz dam (Northern Spain). It occurred after the first impoundment of the reservoir and has been catalogued by some authors as induced seismicity. We analyzed the seismicity in the region as weighted complex networks and tried to differentiate this event from others that occurred nearby. We calculated the main topological features of the networks formed by the seismic clusters, compared them with one another, and tested them against a series of simulations, showing that the clusters were better modelled by the Epidemic-Type Aftershock Sequence (ETAS) model than by random models. We found that the properties of the different clusters are grouped according to the magnitude of the main shocks and the number of events in each cluster, and that no distinct feature could be obtained for the 18 September 2004 series. We found that the nodes with the highest strength are the most important in the networks' traffic, and are associated with the events with the highest magnitude within the clusters.
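
As a concrete illustration of the strength measure used here, the toy sketch below (hypothetical events and edge weights, not the Itoiz data, using the networkx library) builds a small weighted network and ranks nodes by strength, that is, the sum of the weights of their edges.

```python
import networkx as nx

# Toy sketch of the node-strength measure used to rank events in a weighted
# seismicity network. Nodes stand for events in one cluster; the edge weights
# here are hypothetical and stand in for whatever linkage weighting is used.
G = nx.Graph()
G.add_weighted_edges_from([
    ("ev1", "ev2", 0.9),   # main shock strongly linked to early aftershocks
    ("ev1", "ev3", 0.7),
    ("ev1", "ev4", 0.4),
    ("ev2", "ev3", 0.2),
    ("ev3", "ev4", 0.1),
])

# Strength = weighted degree: the sum of the weights of a node's edges.
strength = dict(G.degree(weight="weight"))
hub = max(strength, key=strength.get)
print(strength)                         # {'ev1': 2.0, 'ev2': 1.1, ...}
print("highest-strength node:", hub)    # the largest-magnitude event, here ev1
```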


2011 ◽  
Vol 18 (6) ◽  
pp. 955-966 ◽  
Author(s):  
M. B. Yıkılmaz ◽  
E. M. Heien ◽  
D. L. Turcotte ◽  
J. B. Rundle ◽  
L. H. Kellogg

Abstract. We generate synthetic catalogs of seismicity in northern California using a composite simulation. The basis of the simulation is the fault-based "Virtual California" (VC) earthquake simulator. Back-slip velocities and mean recurrence intervals are specified on model strike-slip faults. A catalog of characteristic earthquakes is generated for a period of 100 000 yr. These earthquakes are predominantly in the range M = 6 to M = 8, but do not follow Gutenberg-Richter (GR) scaling at lower magnitudes. In order to model seismicity on unmapped faults we introduce background seismicity which occurs randomly in time with GR scaling and is spatially associated with the VC model faults. These earthquakes fill in the GR scaling down to M = 4 (the smallest earthquakes modeled). The rate of background seismicity is constrained by the observed rate of occurrence of M > 4 earthquakes in northern California. These earthquakes are then used to drive the BASS (branching aftershock sequence) model of aftershock occurrence. The BASS model is the self-similar limit of the ETAS (epidemic type aftershock sequence) model. Families of aftershocks are generated following each Virtual California and background main shock. In the simulations the rate of occurrence of aftershocks is essentially equal to the rate of occurrence of main shocks in the magnitude range 4 < M < 7. We generate frequency-magnitude and recurrence interval statistics both regionally and for individual faults. We compare our modeled rates of seismicity and spatial variability with observations.
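
A sketch of how such GR-scaling background magnitudes can be drawn by inverse-transform sampling is given below; the b-value and the truncation magnitude are assumed for illustration, with the lower bound of M = 4 matching the smallest modeled earthquakes.

```python
import math
import random

# A sketch of drawing background-event magnitudes from a truncated
# Gutenberg-Richter (GR) distribution by inverse-transform sampling.
# The b-value and upper truncation are assumed illustrative values.
b = 1.0          # GR b-value (assumed)
m_min = 4.0      # smallest modeled magnitude, as in the simulation
m_max = 7.0      # truncation magnitude (assumed)

def sample_gr_magnitude():
    """Draw one magnitude from a truncated GR (exponential) distribution."""
    beta = b * math.log(10.0)
    c = 1.0 - math.exp(-beta * (m_max - m_min))   # truncation normalization
    u = random.random()
    # Inverted CDF of the truncated exponential form of GR.
    return m_min - math.log(1.0 - u * c) / beta

catalog = [sample_gr_magnitude() for _ in range(10000)]
ratio = (sum(1 for m in catalog if m >= 5.0)
         / sum(1 for m in catalog if m >= 4.0))
print(ratio)   # should be close to 10**(-b) = 0.1 for b = 1
```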


2020 ◽  
pp. 875529302093881
Author(s):  
Athanasios N Papadopoulos ◽  
Paolo Bazzurro

In current practice, most earthquake risk models adopt a "declustered" view of seismicity, that is, they disregard foreshock, aftershock, and triggered earthquakes and model seismicity as a series of independent mainshock events, whose occurrence (typically) conforms to a Poisson process. This practice is certainly disputable but has been justified by the false notion that earthquakes of smaller magnitude than the mainshock cannot cause damage beyond that already inflicted by the mainshock. A companion paper makes use of the epidemic-type aftershock sequence (ETAS) model fitted to Central Italy seismicity data to describe the full earthquake occurrence process, including "dependent" earthquakes. Herein, loss estimates for the region of Umbria in Central Italy are derived using stochastic event catalogs generated by means of the ETAS model and damage-dependent fragility functions to track damage accumulation. The results are then compared with estimates obtained with a conventional Poisson-based model. The potential gains of utilizing a model capable of capturing the spatiotemporal clustering features of seismicity are illustrated, along with a discussion of the details and challenges of such considerations.
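
The sketch below illustrates the damage-accumulation idea in schematic form: the intensity needed to push a building into the next damage state shrinks as damage accumulates over a simulated sequence. The fragility medians, dispersion, and intensities are hypothetical placeholders, not values from the study.

```python
import math
import random
from statistics import NormalDist

# Schematic of damage-dependent fragility: each event in a simulated cluster
# can advance a building's damage state, and the median intensity needed to
# advance shrinks once damage has accumulated. All numbers are hypothetical.

# Median intensity (e.g., PGA in g) to advance from state i to state i + 1.
median_next = [0.30, 0.25, 0.20]   # intact->DS1, DS1->DS2, DS2->DS3 (assumed)
beta = 0.5                          # lognormal dispersion (assumed)

def p_advance(im, median):
    """Lognormal fragility: P(advance one damage state | intensity im)."""
    return NormalDist().cdf(math.log(im / median) / beta)

random.seed(1)
state = 0
for im in [0.22, 0.18, 0.35]:   # shaking from successive events in one cluster
    # One Bernoulli trial per possible state transition per event (schematic).
    while state < 3 and random.random() < p_advance(im, median_next[state]):
        state += 1
print("damage state after the sequence:", state)
```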


Author(s):  
Christian Grimm ◽  
Martin Käser ◽  
Sebastian Hainzl ◽  
Marco Pagani ◽  
Helmut Küchenhoff

ABSTRACT Earthquake sequences add a substantial hazard beyond the solely declustered perspective of common probabilistic seismic hazard analysis. A particularly strong driver of both social and economic losses is so-called earthquake doublets (more generally, multiplets), that is, sequences of two (or more) comparatively large events in spatial and temporal proximity. Without differentiating between foreshocks and aftershocks, we hypothesize three main influencing factors of doublet occurrence: (1) the number of direct and secondary aftershocks triggered by an earthquake; (2) the occurrence of independent clusters and seismic background events in the same time–space window; and (3) the magnitude size distribution of triggered events (in contrast to independent events). We tested synthetic catalogs simulated by a standard epidemic-type aftershock sequence (ETAS) model for both Japan and southern California. Our findings show that the common ETAS approach significantly underestimates doublet frequencies compared with observations in historical catalogs, and that the simulated catalogs show smoother spatiotemporal clustering than their observed counterparts. Focusing on the impact on direct aftershock productivity and total cluster sizes, we propose two modifications of the ETAS spatial kernel to improve doublet rate predictions: (a) a restriction of the spatial function to a maximum distance of 2.5 estimated rupture lengths, and (b) an anisotropic function with contour lines constructed by a box with two semicircular ends around the estimated rupture segment. These modifications shift the triggering potential from weaker to stronger events and consequently improve doublet rate predictions for larger events, although they still underestimate historical doublet occurrence rates. In addition, the restricted spatial functions better satisfy the empirical Båth's law for the maximum aftershock magnitude. The tested clustering properties of strong events are not sufficiently captured by the global, catalog-scale measures typically used for model evaluation, such as log-likelihood values, which would favor the conventional, unrestricted models.
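
The following sketch illustrates modification (a), the restriction of the spatial kernel to 2.5 estimated rupture lengths. The power-law kernel shape and the magnitude-length scaling (Wells and Coppersmith, 1994, subsurface rupture length) are standard choices assumed here for illustration and may differ from the paper's exact parameterization.

```python
# Sketch of a restricted isotropic ETAS spatial kernel: power-law distance
# decay, cut off beyond 2.5 estimated rupture lengths of the trigger.
# Kernel parameters d_km and q are assumed illustrative values.

def rupture_length_km(magnitude):
    """Estimated rupture length, log10(L) = -2.44 + 0.59 M (WC94, all types)."""
    return 10.0 ** (-2.44 + 0.59 * magnitude)

def spatial_kernel(r_km, magnitude, d_km=1.0, q=1.5):
    """Unnormalized power-law decay, zero beyond 2.5 rupture lengths."""
    if r_km > 2.5 * rupture_length_km(magnitude):
        return 0.0                 # restriction (a): no far-field triggering
    return (r_km ** 2 + d_km ** 2) ** (-q)

for m in (5.0, 6.5):
    cutoff = 2.5 * rupture_length_km(m)
    print(f"M{m}: cutoff at {cutoff:.1f} km; "
          f"kernel at 10 km = {spatial_kernel(10.0, m):.2e}")
```

A weak event's kernel vanishes at 10 km while a strong event's does not, which is how the restriction shifts triggering potential from weaker to stronger events.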


2018 ◽  
Vol 17 (4) ◽  
pp. 1795-1823 ◽  
Author(s):  
Hooman Motamed ◽  
Alejandro Calderon ◽  
Vitor Silva ◽  
Catarina Costa

2015 ◽  
Vol 57 (6) ◽  
Author(s):  
Maura Murru ◽  
Jiancang Zhuang ◽  
Rodolfo Console ◽  
Giuseppe Falcone

In this paper, we compare the forecasting performance of several statistical models, which are used to describe the occurrence process of earthquakes, in forecasting the short-term earthquake probabilities during the L'Aquila earthquake sequence in central Italy in 2009. These models include the Proximity to Past Earthquakes (PPE) model and two versions of the Epidemic Type Aftershock Sequence (ETAS) model. We used the information gains corresponding to the Poisson and binomial scores to evaluate the performance of these models. It is shown that both ETAS models work better than the PPE model. However, in comparing the two types of ETAS models, the one with the same fixed exponent coefficient (α = 2.3) for both the productivity function and the scaling factor in the spatial response function (ETAS I) performs better in forecasting the active aftershock sequence than the model with different exponent coefficients (ETAS II), when the Poisson score is adopted. ETAS II performs better when a lower magnitude threshold of 2.0 and the binomial score are used. The reason is found to be that the catalog does not have an event of similar magnitude to the L'Aquila mainshock (Mw 6.3) in the training period (April 16, 2005 to March 15, 2009), and the α-value is underestimated, thus the forecast seismicity is underestimated when the productivity function is extrapolated to high magnitudes. We also investigate the effect of the inclusion of small events in forecasting larger events. These results suggest that the training catalog used for estimating the model parameters should include earthquakes of magnitudes similar to the mainshock when forecasting seismicity during an aftershock sequence.
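
For readers unfamiliar with the scoring, the sketch below computes a Poisson-score information gain per earthquake between two forecasts over a handful of space-time bins; the forecast rates and observed counts are made-up toy values, not the L'Aquila data.

```python
import math

# Sketch of the Poisson-score comparison: the information gain per earthquake
# of model A over model B is the difference of their Poisson log-likelihoods
# over space-time bins, divided by the number of observed events.
# Rates and observations below are toy values for illustration only.

def poisson_loglik(forecast, observed):
    """Sum over bins of log P(n_i | lambda_i) for a Poisson distribution."""
    return sum(-lam + n * math.log(lam) - math.lgamma(n + 1)
               for lam, n in zip(forecast, observed))

observed = [0, 2, 1, 0, 3]               # events per bin (toy data)
model_a  = [0.1, 1.8, 0.9, 0.2, 2.5]     # e.g., an ETAS forecast (toy)
model_b  = [0.6, 0.6, 0.6, 0.6, 0.6]     # e.g., a time-independent forecast

n_total = sum(observed)
gain = (poisson_loglik(model_a, observed)
        - poisson_loglik(model_b, observed)) / n_total
print(f"information gain per earthquake: {gain:.3f}")   # > 0 favors model A
```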


Author(s):  
Edward H. Field ◽  
Kevin R. Milner ◽  
Nicolas Luco

ABSTRACT We use the Third Uniform California Earthquake Rupture Forecast (UCERF3) epidemic-type aftershock sequence (ETAS) model (UCERF3-ETAS) to evaluate the effects of declustering and Poisson assumptions on seismic hazard estimates. Although declustering is necessary to infer the long-term spatial distribution of earthquake rates, the question is whether it is also necessary to honor the Poisson assumption in classic probabilistic seismic hazard assessment. We use 500,000 yr, M ≥ 2.5 synthetic catalogs to address this question, for which UCERF3-ETAS exhibits realistic spatiotemporal clustering effects (e.g., aftershocks). We find that Gardner and Knopoff (1974) declustering, used in the U.S. Geological Survey seismic hazard models, lowers 2%-in-50 yr and risk-targeted ground-motion hazard metrics by about 4% on average (compared with the full time-dependent [TD] model), with the reduction being 5% for 40%-in-50 yr ground motions. Keeping all earthquakes and treating them as a Poisson process increases these same hazard metrics by about 3%–12%, on average, due to the removal of relatively quiet time periods in the full TD model. In the interest of model simplification, bias minimization, and consideration of the probabilities of multiple exceedances, we agree with others (Marzocchi and Taroni, 2014) that we are better off keeping aftershocks and treating them as a Poisson process rather than removing them from hazard consideration via declustering. Honoring the true time dependence, however, will likely be important for other hazard and risk metrics, and this study further exemplifies how this can now be evaluated more extensively.
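
For reference, a sketch of the space-time windows behind Gardner and Knopoff (1974) declustering is given below; the window formulas follow one common parameterization (e.g., as tabulated by van Stiphout et al., 2012), and implementations vary.

```python
# Sketch of Gardner-Knopoff (1974) declustering windows: any event falling
# within the space-time window of a larger shock is flagged as dependent and
# removed. Formulas follow one common parameterization; implementations vary.

def gk_window(magnitude):
    """Return (distance in km, time in days) of the GK74 window."""
    distance_km = 10.0 ** (0.1238 * magnitude + 0.983)
    if magnitude >= 6.5:
        time_days = 10.0 ** (0.032 * magnitude + 2.7389)
    else:
        time_days = 10.0 ** (0.5409 * magnitude - 0.547)
    return distance_km, time_days

for m in (5.0, 6.0, 7.0):
    d, t = gk_window(m)
    print(f"M{m}: remove events within {d:6.1f} km and {t:7.1f} days")
```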


2020 ◽  
Vol 91 (3) ◽  
pp. 1567-1578 ◽  
Author(s):  
Kevin R. Milner ◽  
Edward H. Field ◽  
William H. Savran ◽  
Morgan T. Page ◽  
Thomas H. Jordan

Abstract The first Uniform California Earthquake Rupture Forecast, Version 3–epidemic-type aftershock sequence (UCERF3-ETAS) aftershock simulations were running on a high-performance computing cluster within 33 min of the 4 July 2019 M 6.4 Searles Valley earthquake. UCERF3-ETAS, an extension of the third Uniform California Earthquake Rupture Forecast (UCERF3), is the first comprehensive, fault-based, epidemic-type aftershock sequence (ETAS) model. It produces ensembles of synthetic aftershock sequences both on and off explicitly modeled UCERF3 faults to answer a key question repeatedly asked during the Ridgecrest sequence: What are the chances that the earthquake that just occurred will turn out to be the foreshock of an even bigger event? As the sequence unfolded—including one such larger event, the 5 July 2019 M 7.1 Ridgecrest earthquake almost 34 hr later—we updated the model with observed aftershocks, finite-rupture estimates, sequence-specific parameters, and alternative UCERF3-ETAS variants. Although configuring and running UCERF3-ETAS at the time of the earthquake was not fully automated, considerable effort had been focused in 2018 on improving model documentation and ease of use with a public GitHub repository, command line tools, and flexible configuration files. These efforts allowed us to quickly respond and efficiently configure new simulations as the sequence evolved. Here, we discuss lessons learned during the Ridgecrest sequence, including sensitivities of fault triggering probabilities to poorly constrained finite-rupture estimates and model assumptions, as well as implications for UCERF3-ETAS operationalization.


2020 ◽  
Vol 110 (2) ◽  
pp. 874-885
Author(s):  
David Marsan ◽  
Yen Joe Tan

ABSTRACT We define a seismicity model based on (1) the epidemic-type aftershock sequence model that accounts for earthquake clustering, and (2) a closed slip budget at long timescale. This is achieved by not permitting an earthquake to have a seismic moment greater than the current seismic moment deficit. This causes the Gutenberg–Richter law to be modulated by a smooth upper cutoff, the location of which can be predicted from the model parameters. We investigate the various regimes of this model, which notably include a regime in which the activity does not die off even with a vanishingly small spontaneous (i.e., background) earthquake rate, and one that bears strong statistical similarities with repeating earthquake time series. Finally, this model relates the earthquake rate to the geodetic moment rate and, therefore, allows us to make sense of this relationship in terms of fundamental empirical laws (the Gutenberg–Richter law, the productivity law, and the Omori law) and physical parameters (seismic coupling and tectonic loading rate).
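
A toy sketch of the closed slip-budget constraint is given below: moment deficit accumulates at a tectonic loading rate, and no sampled earthquake may carry more moment than the current deficit, which imposes an upper cutoff on the Gutenberg–Richter law. All rates and parameters are assumed illustrative values, not the paper's.

```python
import math
import random

# Toy sketch of a closed slip budget: the moment deficit grows at the loading
# rate, each event's moment is capped by the current deficit, and the cap
# effectively truncates the GR law at large magnitudes. Values are assumed.

def moment_nm(mw):
    """Hanks-Kanamori: log10(M0 [N m]) = 1.5 Mw + 9.1."""
    return 10.0 ** (1.5 * mw + 9.1)

def mw_from_moment(m0):
    """Invert the Hanks-Kanamori relation."""
    return (math.log10(m0) - 9.1) / 1.5

b, m_min = 1.0, 4.0
beta = b * math.log(10.0)
loading_rate = 5e16            # N m of tectonic moment per year (assumed)
events_per_year = 10           # spontaneous + triggered events, M >= 4 (assumed)

random.seed(0)
deficit, largest = 0.0, 0.0
for year in range(10000):
    deficit += loading_rate
    for _ in range(events_per_year):
        if deficit < moment_nm(m_min):
            continue                                 # budget too small for any event
        mw = m_min - math.log(1.0 - random.random()) / beta   # plain GR draw
        mw = min(mw, mw_from_moment(deficit))                 # budget-imposed cutoff
        deficit -= moment_nm(mw)
        largest = max(largest, mw)
print(f"largest magnitude in 10,000 yr: M{largest:.2f}")   # capped by the budget
```

An uncapped GR draw over this many events would occasionally exceed M 8, whereas the total loaded moment here only supports events in the mid-M 7 range, which is the upper cutoff at work.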


Author(s):  
G Petrillo ◽  
E Lippiello

Summary The Epidemic Type Aftershock Sequence (ETAS) model provides a good description of the post-seismic spatio-temporal clustering of seismicity and is also able to capture some features of the increase of seismic activity caused by foreshocks. Recent results, however, have shown that the number of foreshocks observed in instrumental catalogs is significantly larger than the number predicted by the ETAS model. Here we show that it is possible to keep an epidemic description of post-seismic activity and, at the same time, to incorporate pre-seismic temporal clustering related to foreshocks. Also taking into account the short-term incompleteness of instrumental catalogs, we present a model that achieves a very good description of southern California seismicity on both the aftershock and the foreshock sides. Our results indicate that the existence of a preparatory phase anticipating mainshocks represents the most plausible explanation for the occurrence of foreshocks.
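
To illustrate the short-term incompleteness the model must account for, the sketch below evaluates a commonly used empirical rule for the time-dependent completeness magnitude after a mainshock (Helmstetter et al., 2006); the paper's exact treatment may differ.

```python
import math

# Sketch of short-term catalog incompleteness: immediately after a mainshock,
# small aftershocks are hidden in the coda, raising the effective completeness
# magnitude. The empirical rule below is one common choice, not necessarily
# the one used in the paper; the baseline Mc is an assumed network value.

def completeness_mag(mainshock_mag, t_days, mc_baseline=2.0):
    """Time-dependent completeness: Mc(t) = Mm - 4.5 - 0.75 log10(t in days)."""
    mc = mainshock_mag - 4.5 - 0.75 * math.log10(t_days)
    return max(mc, mc_baseline)     # never below the network's baseline Mc

for t in (0.001, 0.01, 0.1, 1.0, 10.0):   # days after an M 7 mainshock
    print(f"t = {t:6.3f} d: Mc = {completeness_mag(7.0, t):.2f}")
```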

