Occurrence of earthquake doublets in the light of the ETAS model

Author(s):  
Christian Grimm ◽  
Martin Käser ◽  
Helmut Küchenhoff

While Probabilistic Seismic Hazard Assessment is commonly based on declustered earthquake catalogues, ongoing seismicity in aftershock sequences can add significant hazard and increase the damage potential to already affected structures in risk assessment. In particular, so-called earthquake doublets (multiplets), i.e. a cluster mainshock followed or preceded by one or more events of similarly strong magnitude within pre-defined temporal and spatial limits, can cause loss multiplication effects for the insurance industry, which therefore has a pronounced interest in the frequency of earthquake doublets worldwide. A widely used method to analyse and simulate the triggering process of earthquake sequences is the Epidemic Type Aftershock Sequence (ETAS) model. We estimate ETAS model parameters for several regions and produce synthetic catalogues, which are then analysed with particular regard to the occurrence of earthquake doublets and compared to the observed history. We also show that different subduction-type seismic regions around the world exhibit differing relative frequencies of earthquake doublets. Regression models are used to study whether certain mainshock and local geophysical properties, such as magnitude, dip and rake angle, depth, distance to the subduction plate interface and velocity of nearby converging subduction plates, have explanatory power for the probability that a cluster contains an earthquake doublet.
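For context, the triggering process simulated in such studies is usually written in the standard Ogata-type space-time ETAS form shown below; the notation is only illustrative, and the specific spatial kernel and parameter values used in this study are not reproduced here.

```latex
\lambda(t, x, y \mid \mathcal{H}_t)
  = \mu(x, y)
  + \sum_{i:\, t_i < t}
    \frac{K\, e^{\alpha (m_i - m_0)}}{(t - t_i + c)^{p}}
    \, f(x - x_i,\, y - y_i;\, m_i)
```

Here \(\mu\) is the background rate, the sum runs over all previous events, and \(f\) is a normalized spatial kernel describing where aftershocks are triggered.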

2020 ◽  
Vol 91 (3) ◽  
pp. 1567-1578 ◽  
Author(s):  
Kevin R. Milner ◽  
Edward H. Field ◽  
William H. Savran ◽  
Morgan T. Page ◽  
Thomas H. Jordan

Abstract The first Uniform California Earthquake Rupture Forecast, Version 3–epidemic-type aftershock sequence (UCERF3-ETAS) aftershock simulations were running on a high-performance computing cluster within 33 min of the 4 July 2019 M 6.4 Searles Valley earthquake. UCERF3-ETAS, an extension of the third Uniform California Earthquake Rupture Forecast (UCERF3), is the first comprehensive, fault-based, epidemic-type aftershock sequence (ETAS) model. It produces ensembles of synthetic aftershock sequences both on and off explicitly modeled UCERF3 faults to answer a key question repeatedly asked during the Ridgecrest sequence: What are the chances that the earthquake that just occurred will turn out to be the foreshock of an even bigger event? As the sequence unfolded—including one such larger event, the 5 July 2019 M 7.1 Ridgecrest earthquake almost 34 hr later—we updated the model with observed aftershocks, finite-rupture estimates, sequence-specific parameters, and alternative UCERF3-ETAS variants. Although configuring and running UCERF3-ETAS at the time of the earthquake was not fully automated, considerable effort had been focused in 2018 on improving model documentation and ease of use with a public GitHub repository, command line tools, and flexible configuration files. These efforts allowed us to quickly respond and efficiently configure new simulations as the sequence evolved. Here, we discuss lessons learned during the Ridgecrest sequence, including sensitivities of fault triggering probabilities to poorly constrained finite-rupture estimates and model assumptions, as well as implications for UCERF3-ETAS operationalization.
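As an illustration of how such ensembles answer the foreshock question, the probability of a larger event can be estimated as the fraction of synthetic catalogues that contain at least one event exceeding the triggering magnitude within the forecast window. The function below is a hypothetical sketch with an assumed data layout, not part of the UCERF3-ETAS code base:

```python
import numpy as np

def prob_larger_event(simulated_catalogs, mainshock_mag, horizon_days):
    """Fraction of synthetic ETAS catalogs that contain an event larger
    than the mainshock within `horizon_days` of its origin time.

    Each catalog is a sequence of (time_days, magnitude) pairs measured
    relative to the mainshock origin time (hypothetical layout)."""
    hits = 0
    for cat in simulated_catalogs:
        cat = np.asarray(cat, dtype=float).reshape(-1, 2)
        in_window = cat[(cat[:, 0] > 0) & (cat[:, 0] <= horizon_days)]
        if in_window.size and in_window[:, 1].max() > mainshock_mag:
            hits += 1
    return hits / len(simulated_catalogs)

# Example: 3 toy catalogs, one of which contains an M 7.1 within 7 days
catalogs = [
    [(0.5, 4.2), (1.4, 7.1)],
    [(0.1, 3.0), (2.0, 5.5)],
    [(0.3, 4.8)],
]
print(prob_larger_event(catalogs, mainshock_mag=6.4, horizon_days=7.0))  # -> 0.333...
```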


Entropy ◽  
2019 ◽  
Vol 21 (2) ◽  
pp. 173 ◽  
Author(s):  
Eugenio Lippiello ◽  
Cataldo Godano ◽  
Lucilla de Arcangelis

An increase of seismic activity is often observed before large earthquakes. The events responsible for this increase are usually named foreshocks, and their occurrence probably represents the most reliable precursory pattern. Many statistical features of foreshocks can be interpreted in terms of the standard mainshock-to-aftershock triggering process and are recovered by the Epidemic Type Aftershock Sequence (ETAS) model. Here we present a statistical study of instrumental seismic catalogs from four different geographic regions. We focus on some common features of foreshocks in the four catalogs which cannot be reproduced by the ETAS model. In particular, we find in instrumental catalogs a significantly larger number of foreshocks than predicted by the ETAS model. We show that this foreshock excess cannot be attributed to catalog incompleteness. We therefore propose a generalized formulation of the ETAS model, the ETAFS model, which explicitly includes foreshock occurrence. Statistical features of aftershocks and foreshocks in the ETAFS model are in very good agreement with instrumental results.
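A simple way to quantify such a foreshock excess is to count, for each mainshock, the events occurring within fixed time and distance windows before it, and to compare that count between the instrumental catalog and ETAS-simulated catalogs. The window sizes below are illustrative assumptions, not the selection rules used in the paper:

```python
import numpy as np

def count_foreshocks(times, mags, lats, lons, main_mag=6.0,
                     window_days=30.0, radius_km=50.0):
    """Count events preceding each M >= main_mag earthquake within a
    space-time window. Inputs are parallel arrays sorted by time.
    Window sizes are illustrative, not the paper's definitions."""
    t, m = np.asarray(times, float), np.asarray(mags, float)
    la, lo = np.asarray(lats, float), np.asarray(lons, float)
    deg2km = 111.0  # rough degrees-to-km conversion
    total = 0
    for i in np.where(m >= main_mag)[0]:
        dt = t[i] - t[:i]                                     # days before mainshock
        dx = (la[:i] - la[i]) * deg2km
        dy = (lo[:i] - lo[i]) * deg2km * np.cos(np.radians(la[i]))
        dist = np.hypot(dx, dy)
        total += int(np.sum((dt > 0) & (dt <= window_days) & (dist <= radius_km)))
    return total
```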


2011 ◽  
Vol 11 (3) ◽  
pp. 697-706 ◽  
Author(s):  
C. S. Jiang ◽  
Z. L. Wu

Abstract. The Pattern Informatics (PI) algorithm uses earthquake catalogues to estimate increases in the probability of strong earthquakes. The main measure in the algorithm is the number of earthquakes above a threshold magnitude. Since aftershocks make up a significant proportion of the total number of earthquakes, whether de-clustering affects the performance of the forecast is one of the concerns in applying the algorithm. This question is of special interest after a great earthquake, when aftershocks dominate regional seismic activity. To investigate it, PI forecasts are systematically analyzed for the Sichuan-Yunnan region of southwest China, where several earthquakes larger than MS 7.0 have occurred, including the 2008 Wenchuan earthquake. In the analysis, the epidemic-type aftershock sequence (ETAS) model was used for de-clustering. The PI algorithm was revised to account for de-clustering by replacing the number of earthquakes with the sum of the ETAS-assessed probabilities of each event being a "background event" or a "clustering event". Case studies indicate that when an intense aftershock sequence is included in the "sliding time window", the hotspot picture may vary, and the variation lasts for about one year. PI forecasts appear to be affected by aftershock sequences included in the "anomaly identifying window", and the PI forecast using "background events" performs better.
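The revision amounts to replacing a hard event count with a probability-weighted count. A minimal sketch of that substitution, assuming the per-event ETAS background probabilities have already been obtained (e.g. from a prior stochastic declustering step, which is not computed here):

```python
import numpy as np

def weighted_event_count(mags, background_probs, mag_threshold,
                         use_background=True):
    """Replace the raw count of events above `mag_threshold` with the sum
    of per-event ETAS probabilities of being a background (or clustered)
    event. `background_probs` is assumed to come from an earlier
    stochastic declustering step."""
    mags = np.asarray(mags, dtype=float)
    probs = np.asarray(background_probs, dtype=float)
    sel = mags >= mag_threshold
    weights = probs[sel] if use_background else (1.0 - probs[sel])
    return weights.sum()

# Raw count would be 3; background-weighted count is 0.9 + 0.2 + 0.7 = 1.8
print(weighted_event_count([4.1, 4.5, 5.0], [0.9, 0.2, 0.7], mag_threshold=4.0))
```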


2021 ◽  
Author(s):  
Christian Grimm ◽  
Sebastian Hainzl ◽  
Martin Käser ◽  
Helmut Küchenhoff

Abstract Strong earthquakes cause aftershock sequences that are clustered in time according to a power decay law, and in space along their extended rupture, shaping a typically elongated pattern of aftershock locations. A widely used approach to model seismic clustering is the Epidemic Type Aftershock Sequence (ETAS) model, which suffers from three major biases: First, the conventional ETAS approach assumes isotropic spatial triggering, which conflicts with observations and geophysical arguments for strong earthquakes. Second, the spatial kernel has unlimited extent, allowing smaller events to exert disproportionate trigger potential over an unrealistically large area. Third, the ETAS model assumes complete event records and neglects the inevitable short-term incompleteness of aftershock records caused by overlapping coda waves. These three effects can substantially bias the parameter estimation and, in particular, lead to underestimated cluster sizes. In this article, we combine the approach of Grimm (2021), which introduced a generalized anisotropic and locally restricted spatial kernel, with the ETAS-Incomplete (ETASI) time model of Hainzl (2021), to define an ETASI space-time model with a flexible spatial kernel that resolves the abovementioned shortcomings. We apply different model versions to a triad of forecasting experiments for the 2019 Ridgecrest sequence and evaluate the prediction quality with respect to cluster size, largest aftershock magnitude and spatial distribution. The new model offers the potential for more realistic simulations of ongoing aftershock activity, e.g. allowing better predictions of the probability and location of a strong, damaging aftershock, which might be beneficial for short-term risk assessment and disaster response.
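One common way to obtain such elongated aftershock patterns in simulation is to distribute trigger sources along a finite rupture line rather than around a single epicentre. The toy sketch below illustrates only that general idea; it is a conceptual stand-in and does not reproduce the generalized kernel of Grimm (2021):

```python
import numpy as np

rng = np.random.default_rng(42)

def sample_aftershock_locations(n, epicentre, strike_deg, rupture_length_km,
                                sigma_km=2.0):
    """Toy anisotropic triggering: offspring locations are drawn around
    points spread along a finite rupture line instead of around a single
    epicentre (x east, y north, in km). Conceptual illustration only."""
    strike = np.radians(strike_deg)
    # positions along the rupture, centred on the epicentre
    along = rng.uniform(-0.5, 0.5, n) * rupture_length_km
    centres = np.column_stack([
        epicentre[0] + along * np.sin(strike),
        epicentre[1] + along * np.cos(strike),
    ])
    # isotropic scatter around each point on the rupture
    return centres + rng.normal(scale=sigma_km, size=(n, 2))

locs = sample_aftershock_locations(1000, epicentre=(0.0, 0.0),
                                   strike_deg=320.0, rupture_length_km=50.0)
print(locs.std(axis=0))  # cloud is elongated along strike
```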


Author(s):  
Gordon J. Ross

ABSTRACT The epidemic-type aftershock sequence (ETAS) model is widely used in seismic forecasting. However, most studies of ETAS use point estimates for the model parameters, which ignores the inherent uncertainty that arises from estimating these from historical earthquake catalogs, resulting in misleadingly optimistic forecasts. In contrast, Bayesian statistics allows parameter uncertainty to be explicitly represented and fed into the forecast distribution. Despite its growing popularity in seismology, the application of Bayesian statistics to the ETAS model has been limited by the complex nature of the resulting posterior distribution, which makes it infeasible to apply to catalogs containing more than a few hundred earthquakes. To combat this, we develop a new framework for estimating the ETAS model in a fully Bayesian manner, which can be efficiently scaled up to large catalogs containing thousands of earthquakes. We also provide easy-to-use software that implements our method.
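The paper's own sampler and software are not reproduced here; the sketch below only illustrates the general idea of propagating posterior parameter samples into a forecast quantity, using a generic random-walk Metropolis step on a user-supplied ETAS log-posterior (both `log_post` and `expected_rate` are assumed to exist elsewhere):

```python
import numpy as np

rng = np.random.default_rng(0)

def metropolis(log_post, theta0, n_samples=5000, step=0.05):
    """Generic random-walk Metropolis sampler (not the paper's scheme).
    `log_post` evaluates the unnormalized log-posterior of the ETAS
    parameters; its implementation is assumed to exist elsewhere."""
    theta = np.asarray(theta0, dtype=float)
    lp = log_post(theta)
    samples = []
    for _ in range(n_samples):
        prop = theta + rng.normal(scale=step, size=theta.shape)
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:
            theta, lp = prop, lp_prop
        samples.append(theta.copy())
    return np.array(samples)

def forecast_distribution(samples, expected_rate):
    """Propagate parameter uncertainty: evaluate a forecast quantity
    (e.g. expected event count next week) for every posterior sample
    instead of a single point estimate."""
    return np.array([expected_rate(s) for s in samples])
```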


2019 ◽  
Vol 220 (2) ◽  
pp. 856-875
Author(s):  
Ourania Mangira ◽  
Rodolfo Console ◽  
Eleftheria Papadimitriou ◽  
Maura Murru ◽  
Vasilios Karakostas

SUMMARY Earthquake clustering in the area of the Central Ionian Islands (Greece) is statistically modelled by means of the Epidemic Type Aftershock Sequence (ETAS) branching model, the most popular of the short-term earthquake clustering models. It is based on the assumption that an earthquake is not related to any single other event in particular, but rather to all previous events as well as to the background seismicity. The close temporal proximity of the strong (M ≥ 6.0) events in the study area offers the opportunity to retrospectively test the validity of the ETAS model on the 2014 Kefalonia doublet (Mw 6.1 and Mw 6.0) and the 2015 Lefkada aftershock sequences. The application of a physics-based earthquake simulator to the local fault system produced a simulated catalogue whose time, space and magnitude behaviour is in line with the observed seismicity. This catalogue is then used to detect short-term interactions between both strong and smaller events and to compare the two cases. The results show that the suggested clustering model provides reliable forecasts of the aftershock activity. Combining the ETAS model and the simulator code, however, needs to be examined more deeply, since the preliminary results show some discrepancy between the estimated model parameters.


2019 ◽  
Vol 109 (6) ◽  
pp. 2145-2158 ◽  
Author(s):  
Andrea L. Llenos ◽  
Andrew J. Michael

Abstract Earthquake swarms, typically modeled as time‐varying changes in background seismicity driven by external processes such as fluid flow or aseismic creep, present challenges for operational earthquake forecasting. Although the time decay of aftershock sequences can be estimated with the modified Omori law, it is difficult to forecast the temporal behavior of seismicity rates during a swarm. To explore these issues, we apply the epidemic‐type aftershock sequence (ETAS) model (Ogata, 1988) to the 2015 San Ramon, California, swarm, which lasted several weeks and included almost 100 earthquakes with 2 ≤ M ≤ 3.6. We develop three‐day forecasts during the swarm based on an ETAS model fit to all prior seismicity in the region, as well as an ETAS model fit only to previous swarms in the region, which is better at capturing the higher background rate during the swarm. We also explore forecasts in which the background rate is updated periodically during the swarm using data over different lookback windows, and find that these models generally perform better than models in which the background rate is fixed. Finally, we construct ensemble forecasts by combining the different models weighted according to their performance. The ensemble forecasts outperform all of the individual models and allow us to avoid making arbitrary choices at the outset of a swarm as to which single model will perform best.
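The exact scoring and weighting scheme is not reproduced here; a common (and here merely illustrative) choice is to weight each model by its recent likelihood performance, as in the sketch below:

```python
import numpy as np

def ensemble_forecast(model_rates, past_log_likelihoods):
    """Combine per-model forecast rates into a weighted ensemble.
    Weights are a softmax of each model's recent log-likelihood score,
    an illustrative choice rather than the paper's exact weighting."""
    scores = np.asarray(past_log_likelihoods, dtype=float)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    return weights @ np.asarray(model_rates, dtype=float), weights

# Three models forecasting expected event counts for the next 3 days
rates = [4.2, 7.8, 5.1]
loglik = [-12.0, -9.5, -10.1]        # performance over the previous window
combined, w = ensemble_forecast(rates, loglik)
print(combined, w)
```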


2019 ◽  
Vol 71 (1) ◽  
Author(s):  
Takao Kumazawa ◽  
Yosihiko Ogata ◽  
Hiroshi Tsuruoka

Abstract We applied the epidemic type aftershock sequence (ETAS) model, the two-stage ETAS model and the non-stationary ETAS model to investigate the detailed features of the series of earthquake occurrences before and after the M6.7 Hokkaido Eastern Iburi earthquake of 6 September 2018, based on earthquake data from October 1997 onwards. First, after the 2003 M8.0 Tokachi-Oki earthquake, seismic activity in the Eastern Iburi region was reduced relative to the ETAS model. During this period, the seismicity migrated towards shallower depths, and a swarm cluster, including an M5.1 earthquake, finally occurred in the deepest part of that depth range. This swarm activity was well described by the non-stationary ETAS model until the M6.7 main shock. The aftershocks of the M6.7 earthquake obeyed the ETAS model until the M5.8 largest aftershock, except for a period of several days when small, swarm-like activity was found at the southern end of the aftershock region. However, when we focus on the medium and larger aftershocks, we observe quiescence relative to the ETAS model from 8.6 days after the main shock until the M5.8 largest aftershock. For micro-earthquakes, we further studied the separated aftershock sequences in the naturally divided aftershock volumes. We found that the temporal changes in the background rate and the triggering coefficient (aftershock productivity) in the respective sub-volumes contrasted with each other. In particular, relative quiescence was seen in the northern deep zones that include the M5.8 largest aftershock. Furthermore, the b-values of the whole aftershock activity showed an increasing trend with respect to the logarithm of elapsed time during the entire aftershock period, which is ultimately explained by the spatially different characteristics of the aftershocks.
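The b-value trend mentioned above is typically computed with the maximum-likelihood estimator of Aki (1965) with Utsu's correction for binned magnitudes; the sketch below applies that standard estimator in windows spaced in log elapsed time (window layout and event-count threshold are arbitrary illustrative choices):

```python
import numpy as np

def b_value(mags, mc, dm=0.1):
    """Maximum-likelihood b-value (Aki 1965) with Utsu's correction for
    magnitudes binned at width `dm`; `mc` is the completeness magnitude."""
    m = np.asarray(mags, dtype=float)
    m = m[m >= mc]
    return np.log10(np.e) / (m.mean() - (mc - dm / 2.0))

def b_value_vs_log_time(times, mags, mc, n_windows=10):
    """b-value in consecutive windows equally spaced in log(elapsed time
    since the mainshock). Window layout is illustrative, not the paper's."""
    t = np.asarray(times, dtype=float)
    m = np.asarray(mags, dtype=float)
    edges = np.logspace(np.log10(t.min()), np.log10(t.max()), n_windows + 1)
    out = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        sel = (t >= lo) & (t < hi)
        if sel.sum() >= 30:                    # require enough events
            out.append((np.sqrt(lo * hi), b_value(m[sel], mc)))
    return out
```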


2021 ◽  
Vol 73 (1) ◽  
Author(s):  
Keitaro Ohno ◽  
Yusaku Ohta ◽  
Satoshi Kawamoto ◽  
Satoshi Abe ◽  
Ryota Hino ◽  
...  

Abstract Rapid estimation of the coseismic fault model for medium-to-large-sized earthquakes is key for disaster response. To estimate the coseismic fault model for large earthquakes, the Geospatial Information Authority of Japan and Tohoku University have jointly developed a real-time GEONET analysis system for rapid deformation monitoring (REGARD). REGARD can estimate a single rectangular fault model and the slip distribution along the assumed plate interface. The single rectangular fault model is useful as a first-order approximation of a medium-to-large earthquake. However, it is difficult to obtain accurate estimates of its parameters because of the strong effect of the initial values. To solve this problem, this study proposes a new method to estimate the coseismic fault model and its uncertainties in real time, based on a Bayesian inversion approach using the Markov chain Monte Carlo (MCMC) method. The MCMC approach is computationally expensive, and hyperparameters must usually be defined in advance by trial and error. The sampling efficiency was improved using a parallel tempering method, and an automatic definition method for the hyperparameters was developed for real-time use. The calculation time was within 30 s for 1 × 10⁶ samples on a typical single Linux server, which enables real-time analysis similar to REGARD. The reliability of the developed method was evaluated using data from recent earthquakes (the 2016 Kumamoto and 2019 Yamagata-Oki earthquakes). Exhaustive simulations of earthquakes in the Sea of Japan were also conducted. The results showed an advantage over the maximum likelihood approach with a priori information, which suffers from initial-value dependence in nonlinear problems. For data with a small signal-to-noise ratio, the results suggest the possibility of using several conjugate fault models. There is a trade-off between the fault area and slip amount, especially for offshore earthquakes, which means that quantifying the uncertainty enables us to evaluate the reliability of the fault model estimation results in real time.
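A generic parallel-tempering Metropolis scheme of the kind referred to above is sketched below. It is a conceptual illustration, not the REGARD implementation: `log_post` is an assumed function returning the log-posterior of a fault-parameter vector, and the temperature ladder and step sizes are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)

def parallel_tempering(log_post, theta0, temps=(1.0, 2.0, 4.0, 8.0),
                       n_iter=2000, step=0.1):
    """Generic parallel-tempering Metropolis sampler (conceptual sketch,
    not the REGARD code). `log_post` is an assumed log-posterior over a
    fault-parameter vector; chain k targets log_post / temps[k]."""
    temps = np.asarray(temps, dtype=float)
    chains = [np.array(theta0, dtype=float) for _ in temps]
    lps = [log_post(c) for c in chains]
    cold_samples = []
    for _ in range(n_iter):
        # Within-chain random-walk Metropolis updates, tempered by 1/T
        for k, T in enumerate(temps):
            prop = chains[k] + rng.normal(scale=step * np.sqrt(T),
                                          size=chains[k].shape)
            lp_prop = log_post(prop)
            if np.log(rng.uniform()) < (lp_prop - lps[k]) / T:
                chains[k], lps[k] = prop, lp_prop
        # Swap move between a random pair of adjacent temperatures
        k = rng.integers(len(temps) - 1)
        log_accept = (lps[k] - lps[k + 1]) * (1.0 / temps[k + 1] - 1.0 / temps[k])
        if np.log(rng.uniform()) < log_accept:
            chains[k], chains[k + 1] = chains[k + 1], chains[k]
            lps[k], lps[k + 1] = lps[k + 1], lps[k]
        cold_samples.append(chains[0].copy())   # keep only the T=1 chain
    return np.array(cold_samples)
```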

