A New Smoothed Seismicity Approach to Include Aftershocks and Foreshocks in Spatial Earthquake Forecasting: Application to the Global Mw ≥ 5.5 Seismicity

2021 ◽  
Vol 11 (22) ◽  
pp. 10899
Author(s):  
Matteo Taroni ◽  
Aybige Akinci

Seismicity-based earthquake forecasting models have been extensively studied and developed over the past twenty years. These models rely mainly on seismicity catalogs as their data source and provide quantifiable forecasts in time, space, and magnitude. In this study, we present a technique to better forecast the locations of future earthquakes based on spatially smoothed seismicity. The main objective of the improvement is to use foreshocks and aftershocks together with their mainshocks. Time-independent earthquake forecast models are often developed from declustered catalogs, in which the smaller-magnitude events associated with each mainshock are removed. Declustered catalogs are required in probabilistic seismic hazard analysis (PSHA) to satisfy the Poisson assumption that events are independent in time and space. However, as many recent studies have highlighted, removing such events from seismic catalogs may lead to underestimating seismicity rates and, consequently, the final seismic hazard in terms of ground shaking. Our study also demonstrates that using the complete catalog can improve the spatial forecast of future earthquakes. To do so, we adopted two different smoothed seismicity methods: (1) the fixed smoothing method, which uses spatially uniform smoothing parameters, and (2) the adaptive smoothing method, which assigns an individual smoothing distance to each earthquake. The smoothed seismicity models are constructed using the global earthquake catalog of Mw ≥ 5.5 events. We compare the smoothed seismicity models by calculating and evaluating their joint log-likelihoods. The forecasts based on the complete catalog show a significant information gain over the corresponding declustered-catalog forecasts for both the fixed and adaptive smoothing methods. Our findings indicate that using complete catalogs is a notable means of increasing the spatial skill of seismicity forecasts.
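
To make the distinction between the two smoothing schemes concrete, the following minimal sketch (Python/NumPy; function names, bandwidths, and the synthetic catalog are illustrative assumptions, not the authors' code) builds a smoothed rate density using either a single fixed bandwidth or a per-event bandwidth taken from the distance to the n-th nearest neighbor.

```python
import numpy as np

def gaussian_kernel(r, d):
    """2-D isotropic Gaussian kernel with bandwidth d, evaluated at distance r."""
    return np.exp(-0.5 * (r / d) ** 2) / (2.0 * np.pi * d ** 2)

def smoothed_rate(grid_xy, event_xy, d_fixed=20.0, n_neighbor=None):
    """Spatial rate density on grid nodes from event epicenters (coordinates in km).

    If n_neighbor is None, the same bandwidth d_fixed is used for every event
    (fixed smoothing); otherwise each event gets the distance to its n-th
    nearest neighbor (adaptive smoothing, in the spirit of Helmstetter et al., 2007).
    """
    # pairwise event-to-event distances, used only for the adaptive bandwidths
    d_events = np.linalg.norm(event_xy[:, None, :] - event_xy[None, :, :], axis=-1)
    if n_neighbor is None:
        bandwidths = np.full(len(event_xy), d_fixed)
    else:
        # column 0 of the sorted distances is the event itself (distance zero)
        bandwidths = np.sort(d_events, axis=1)[:, n_neighbor]
    # distance from every grid node to every event, then sum the kernels
    r = np.linalg.norm(grid_xy[:, None, :] - event_xy[None, :, :], axis=-1)
    return gaussian_kernel(r, bandwidths[None, :]).sum(axis=1)

# toy example: 200 synthetic epicenters and a regular grid, both in km
rng = np.random.default_rng(0)
events = rng.normal(0.0, 30.0, size=(200, 2))
grid = np.stack(np.meshgrid(np.linspace(-100, 100, 41),
                            np.linspace(-100, 100, 41)), axis=-1).reshape(-1, 2)
rate_fixed = smoothed_rate(grid, events, d_fixed=20.0)
rate_adaptive = smoothed_rate(grid, events, n_neighbor=5)
```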

2022 ◽  
Author(s):  
Kirsty Bayliss ◽  
Mark Naylor ◽  
Farnaz Kamranzad ◽  
Ian Main

Abstract. Probabilistic earthquake forecasts estimate the likelihood of future earthquakes within a specified time-space-magnitude window and are important because they inform planning of hazard mitigation activities on different timescales. The spatial component of such forecasts, expressed as seismicity models, generally relies upon some combination of past event locations and underlying factors that might affect spatial intensity, such as strain rate, fault location and slip rate, or past seismicity. For the first time, we extend previously reported spatial seismicity models, generated using the open-source inlabru package, to time-independent earthquake forecasts, using California as a case study. The inlabru approach allows the rapid evaluation of point process models that integrate different spatial datasets. We explore how well various candidate forecasts perform compared with observed activity over three contiguous five-year time periods, using the same training window for the seismicity data. In each case we compare models constructed from both full and declustered earthquake catalogues. In doing this, we compare the use of synthetic catalogue forecasts with the more widely used grid-based approach of previous forecast testing experiments. The simulated-catalogue approach uses the full model posteriors to create Bayesian earthquake forecasts. We show that simulated-catalogue-based forecasts perform better than the grid-based equivalents because of (a) their ability to capture more uncertainty in the model components and (b) the associated relaxation of the Poisson assumption in testing. We demonstrate that the inlabru models perform well overall over various time periods, and hence that independent data such as fault slip rates can improve forecasting power on the time scales examined. Together, these findings suggest that a significant improvement in earthquake forecasting is possible, although this has yet to be tested and proven in true prospective mode.
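
The testing difference between the two forecast formats can be pictured schematically: a grid-based forecast is scored with a joint Poisson log-likelihood over cells, whereas a simulated-catalogue forecast can be scored against the empirical distribution of counts from its own synthetic catalogues, relaxing the Poisson assumption. The sketch below (Python/SciPy; all rates and counts are invented) is a minimal illustration of that idea, not the CSEP testing code used in the study.

```python
import numpy as np
from scipy.stats import poisson

def grid_log_likelihood(expected_counts, observed_counts):
    """Joint Poisson log-likelihood of per-cell observed counts given a gridded forecast."""
    return poisson.logpmf(observed_counts, expected_counts).sum()

def catalogue_number_score(simulated_counts, observed_total):
    """Empirical quantile of the observed event count within the distribution of
    counts from simulated catalogues; no analytic Poisson assumption is needed."""
    simulated_counts = np.asarray(simulated_counts)
    return (simulated_counts <= observed_total).mean()

# toy illustration with made-up numbers
expected = np.array([0.2, 1.5, 0.05, 0.8])   # forecast rate per cell
observed = np.array([0, 2, 0, 1])            # observed counts per cell
print(grid_log_likelihood(expected, observed))

sim_counts = np.random.default_rng(1).poisson(3.0, size=10000)  # counts from simulated catalogues
print(catalogue_number_score(sim_counts, observed_total=5))
```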


2020 ◽  
Vol 36 (1_suppl) ◽  
pp. 69-90 ◽  
Author(s):  
Teraphan Ornthammarath ◽  
Pennung Warnitchai ◽  
Chung-Han Chan ◽  
Yu Wang ◽  
Xuhua Shi ◽  
...  

We present an evaluation of the 2018 Northern Southeast Asia Seismic Hazard Model (NSAHM18), based on a combination of smoothed seismicity, subduction zone, and fault models. The smoothed seismicity is used to model the observed distributed seismicity from largely unknown sources in the study area. In addition, because the instrumental earthquake catalog is short, slip rates and characteristic earthquake magnitudes are incorporated through the fault model. To achieve this objective, the compiled earthquake catalogs and updated active fault databases in this region were reexamined with consistent use of these input parameters. To take epistemic uncertainty into account, a logic tree analysis was implemented, incorporating basic quantities such as ground-motion models (GMMs) for three different tectonic regions (shallow active, subduction interface, and subduction intraslab), maximum magnitude, and earthquake magnitude-frequency relationships. The seismic hazard results are presented as peak ground acceleration maps for 475- and 2475-year return periods.
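
As a schematic illustration of how a logic tree combines epistemic branches into a hazard result, the following sketch (Python; the branch weights and hazard curves are invented, and the actual NSAHM18 tree is far richer) averages ground-motion-model branches into a mean hazard curve and reads off the PGA at the 475-year return period.

```python
import numpy as np

# hypothetical logic-tree branches: (weight, annual exceedance probabilities at pga_levels)
pga_levels = np.array([0.05, 0.1, 0.2, 0.4, 0.8])          # g
branches = [
    (0.5, np.array([0.10, 0.04, 0.010, 0.002, 0.0002])),   # GMM A (illustrative)
    (0.3, np.array([0.12, 0.05, 0.015, 0.003, 0.0004])),   # GMM B (illustrative)
    (0.2, np.array([0.08, 0.03, 0.008, 0.001, 0.0001])),   # GMM C (illustrative)
]

# weighted mean hazard curve across the epistemic branches
weights = np.array([w for w, _ in branches])
curves = np.vstack([c for _, c in branches])
mean_curve = weights @ curves

# PGA with ~10% probability of exceedance in 50 years (annual rate ~ 1/475)
target = 1.0 / 475.0
pga_475 = np.interp(target, mean_curve[::-1], pga_levels[::-1])
print(f"475-yr PGA ~ {pga_475:.3f} g")
```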


2020 ◽  
Author(s):  
Pablo Iturrieta ◽  
Danijel Schorlemmer ◽  
Fabrice Cotton ◽  
José Bayona ◽  
Karina Loviknes

In earthquake forecasting, smoothed-seismicity models (SSM) are based on the assumption that previous earthquakes serve as a guideline for future events. Different kernels are used to spatially extrapolate each moment tensor from a seismic catalog into a moment-rate density field. Nevertheless, governing mechanical principles remain absent from the model formulation, even though crustal stress is responsible for moment release, mainly on pre-existing faults. Furthermore, a recently developed SSM by Hiemer et al., 2013 (SEIFA) incorporates active-fault characterization and deformation rates stochastically, so that a geological estimate of moment release can also be taken into account. Motivated by this innovative approach, we address the question: how representative is the stochastic temporal/spatial averaging of SEIFA of the long-term crustal deformation and stress? In this context, physics-based modeling provides insights into the energy, stress, and strain-rate fields within the crust due to the discontinuities found therein. In this work, we aim to understand the temporal window required for SEIFA to mechanically satisfy its underlying assumption of stationarity. We build various SEIFA models from different spatio-temporal subsets of a catalog and confront them with a physics-based model of the long-term seismic energy/moment rate. We then develop a method based on the moment-balance principle and information theory to compare the spatial similarity between these two types of models. The models are built from two spatially conforming layers of information: a complete seismic catalog and a computerized 3-D geometry of mapped faults along with their long-term slip rates. SEIFA uses both datasets to produce a moment-rate density field, from which a forecast can later be derived. A simple physics-based model, the steady-state Boundary Element Method (BEM), is used as a proof of concept. It uses the fault 3-D geometry and slip rates to calculate the long-term interseismic energy rate and the elastic stress and strain tensors accumulated both along the faults and within the crust. The SHARE European Earthquake Catalog and the European Database of Seismogenic Faults are used as a case study, restricted to crustal faults and different spatio-temporal subsets of the Italy region in the 1000-2006 time window. The moment-balance principle is analyzed in terms of its spatial distribution by calculating the spatial mutual information (SMI) between both models as a similarity measure. Finally, by using the SMI as a minimization function, we determine the optimal catalog time window for which the moment rate predicted by the SSM is closest to the geomechanical prediction. We emphasize that, regardless of the usefulness of the stationarity assumption in seismicity forecasting, we provide a simple method that places a physical bound on data-driven seismicity models. This framework may be used in the future to combine seismicity data and geophysical modeling for earthquake forecasting.
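
A minimal sketch of how a spatial-mutual-information similarity measure between two co-registered moment-rate fields could be estimated from a joint histogram is given below (Python/NumPy; the binning choice, function name, and toy fields are illustrative assumptions, not the authors' implementation).

```python
import numpy as np

def spatial_mutual_information(field_a, field_b, bins=16):
    """Mutual information (in nats) between two co-registered spatial fields,
    e.g. an SSM moment-rate density and a BEM-derived moment rate, estimated
    from a joint histogram of the per-cell values."""
    a = np.asarray(field_a).ravel()
    b = np.asarray(field_b).ravel()
    joint, _, _ = np.histogram2d(a, b, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0                                   # avoid log(0)
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

# toy check: a field compared with a noisy copy of itself vs. an unrelated field
rng = np.random.default_rng(2)
base = rng.lognormal(size=(50, 50))
print(spatial_mutual_information(base, base + 0.1 * rng.normal(size=base.shape)))
print(spatial_mutual_information(base, rng.lognormal(size=(50, 50))))
```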


2020 ◽  
Author(s):  
Matteo Taroni ◽  
Aybige Akinci

In this study we present five- and ten-year time-independent forecasts of M ≥ 5.0 earthquakes in Italy using only seismicity data, without any tectonic, geologic, or geodetic information. Spatially varying earthquake occurrence rates are calculated using an adaptive smoothing kernel (Helmstetter et al., 2007) that defines a unique smoothing distance for each earthquake epicenter from the distance to its n-th nearest neighbor, optimized through the Collaboratory for the Study of Earthquake Predictability (CSEP) testing-type likelihood methodology (Werner et al., 2007). We modify the adaptive smoothing method to include all earthquakes in the catalog (foreshocks, aftershocks, and events below the completeness magnitude) by multiplying each smoothing kernel by a proper scaling factor that varies as a function of the completeness magnitude and the number of events in each seismic cluster. Our smoothing philosophy relies on the usefulness of all earthquakes, including those with smaller magnitudes, in forecasting future seismicity.

The smoothed-seismicity Italian model, which provides the forecast seismicity rates as an expected number of M ≥ 5.0 events per year in each 0.1° × 0.1° grid cell, is constructed using the complete instrumental catalog, spanning from 1960 to 2019, with a completeness magnitude that decreases with time (from M4.0 to M1.8). Finally, we compare our model with the real observations and with the Italian CSEP experiment models, to check their relative performances, using the official CSEP tests (Taroni et al., 2018). In the present study, the probabilities of occurrence of future large earthquakes in the next 5 and 10 years are calculated under the assumption that the earthquake process has no memory, i.e., the occurrence of a future earthquake is independent of the occurrence of previous earthquakes from the same source (time-independent model).
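
The completeness correction can be pictured with a small Gutenberg-Richter extrapolation sketch (Python; the b-value, magnitudes, and counts are invented, and this is only one plausible form of a scaling factor like the one described above, not necessarily the exact factor used in the study): each event recorded above a time-varying completeness magnitude contributes an equivalent weight to the rate of M ≥ 5.0 events.

```python
def gr_weight(m_completeness, m_target=5.0, b_value=1.0):
    """Gutenberg-Richter scaling: an event counted above the (time-varying)
    completeness magnitude contributes this weight to the rate of events with
    magnitude >= m_target. Illustrative only."""
    return 10.0 ** (-b_value * (m_target - m_completeness))

# e.g. 1000 events from a period complete down to M1.8 carry far less weight each
# than 100 events from a period complete only down to M4.0
print(1000 * gr_weight(1.8), 100 * gr_weight(4.0))
```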


2020 ◽  
Vol 224 (2) ◽  
pp. 1174-1187
Author(s):  
Matteo Taroni ◽  
Aybige Akinci

SUMMARY The classical procedure of probabilistic seismic hazard analysis (PSHA) requires a Poissonian distribution of earthquakes. Seismic catalogues follow a Poisson distribution only after the application of a declustering algorithm that leaves a single earthquake for each seismic sequence (usually the strongest, i.e. the main shock). Removing earthquakes from the seismic catalogues leads to an underestimation of the annual rates of events and, consequently, to a lower seismic hazard, as indicated by several studies. In this study, we aim to investigate the performance of two declustering methods on the Italian instrumental catalogue and the impact of declustering on the estimation of the b-value and on the seismic hazard analysis. To this end, the spatial variation in the seismicity rate was first estimated from the declustered catalogues using the adaptive smoothed seismicity approach, considering small earthquakes (Mw ≥ 3.0). We then corrected the seismicity rates using a new approach that allows all events in the complete seismic catalogue to be counted by simply changing the magnitude-frequency distribution. The impact of declustering on seismic hazard analysis is illustrated using PSHA maps for Italy in terms of peak ground acceleration and spectral acceleration at 2 s, with 10 per cent and 2 per cent probability of exceedance in 50 yr. We observed that the hazard calculated from the declustered catalogues was always lower than the hazard computed using the complete catalogue. These results are in agreement with previous results obtained in different parts of the world.
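
As a reminder of the quantities involved, the sketch below (Python/NumPy; the synthetic catalogue and the target magnitude are invented) estimates a maximum-likelihood b-value (Aki, 1965) and extrapolates an annual rate of larger events from a complete catalogue via the Gutenberg-Richter relation; it is a generic illustration, not the specific rate-correction approach of the paper.

```python
import numpy as np

def aki_b_value(magnitudes, m_complete, dm=0.0):
    """Maximum-likelihood b-value (Aki, 1965); set dm to the magnitude binning
    width (e.g. 0.1) to apply the usual half-bin correction for binned catalogues."""
    m = np.asarray(magnitudes)
    m = m[m >= m_complete]
    return np.log10(np.e) / (m.mean() - (m_complete - dm / 2.0))

# synthetic Gutenberg-Richter catalogue with b = 1.0 above Mw 3.0 (continuous magnitudes)
rng = np.random.default_rng(3)
m_c, b_true = 3.0, 1.0
mags = m_c + rng.exponential(scale=1.0 / (b_true * np.log(10.0)), size=5000)
b_hat = aki_b_value(mags, m_c)
print(f"estimated b-value: {b_hat:.2f}")

# annual rate of Mw >= 5.0 extrapolated from a hypothetical 60-year complete catalogue
n_years = 60.0
rate_m5 = mags.size / n_years * 10.0 ** (-b_hat * (5.0 - m_c))
print(f"annual rate of Mw >= 5.0: {rate_m5:.2f}")
```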


2021 ◽  
Vol 64 (2) ◽  
Author(s):  
Francesco Visini ◽  
Bruno Pace ◽  
Carlo Meletti ◽  
Warner Marzocchi ◽  
Aybige Akinci ◽  
...  

In recent years, new approaches for developing earthquake rupture forecasts (ERFs) have been proposed for use as input to probabilistic seismic hazard assessment (PSHA). Zone-based approaches with seismicity rates derived from earthquake catalogs are commonly used in many countries as the standard for national seismic hazard models. In Italy, a single zone-based ERF is currently the basis for the official seismic hazard model. In this contribution, we present eleven new ERFs, including five zone-based, two smoothed-seismicity-based, two fault-based, and two geodetic-based models, used for a new PSH model in Italy. The ERFs were tested against observed seismicity and were subjected to an elicitation procedure by a panel of PSHA experts to verify the scientific robustness and consistency of the forecasts with respect to the observations. The tests and the elicitation were used to weight the ERFs. The results show a good agreement of the new inputs with the seismicity observed in the last few centuries. The entire approach was a first attempt to build a community-based set of ERFs for an Italian PSHA model. The project involved a large number of seismic hazard practitioners, with their knowledge and experience, and the development of different models to capture and explore a large range of epistemic uncertainties in building ERFs; it represents an important step forward for the new national seismic hazard model.
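
Purely as an illustration of how test results and elicitation scores might be turned into ERF weights (all numbers are invented, and the weighting scheme actually adopted for the model may differ), a minimal sketch follows.

```python
import numpy as np

# hypothetical per-ERF scores: joint log-likelihoods from testing and expert elicitation scores
loglik_scores = np.array([-1520.0, -1512.0, -1530.0, -1518.0])   # made up
elicitation_scores = np.array([0.7, 0.9, 0.5, 0.8])              # made up

# normalize each set of scores into weights
test_weights = np.exp(loglik_scores - loglik_scores.max())
test_weights /= test_weights.sum()
expert_weights = elicitation_scores / elicitation_scores.sum()

# combine the two (a product rule is only one of several possible choices)
combined = test_weights * expert_weights
combined /= combined.sum()
print(combined.round(3))
```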


2019 ◽  
Vol 109 (5) ◽  
pp. 2036-2049 ◽  
Author(s):  
José Antonio Bayona Viveros ◽  
Sebastian von Specht ◽  
Anne Strader ◽  
Sebastian Hainzl ◽  
Fabrice Cotton ◽  
...  

Abstract The Seismic Hazard Inferred from Tectonics based on the Global Strain Rate Map (SHIFT_GSRM) earthquake forecast was designed to provide high‐resolution estimates of global shallow seismicity to be used in seismic hazard assessment. This model combines geodetic strain rates with global earthquake parameters to characterize long‐term rates of seismic moment and earthquake activity. Although SHIFT_GSRM properly computes seismicity rates in seismically active continental regions, it underestimates earthquake rates in subduction zones by an average factor of approximately 3. We present a complementary method to SHIFT_GSRM to more accurately forecast earthquake rates in 37 subduction segments, based on the conservation of moment principle and the use of regional interface seismicity parameters, such as subduction dip angles, corner magnitudes, and coupled seismogenic thicknesses. In seven progressive steps, we find that SHIFT_GSRM earthquake‐rate underpredictions are mainly due to the utilization of a global probability function of seismic moment release that poorly captures the great variability among subduction megathrust interfaces. Retrospective test results show that the forecast is consistent with the observations during the 1 January 1977 to 31 December 2014 period. Moreover, successful pseudoprospective evaluations for the 1 January 2015 to 31 December 2018 period demonstrate the power of the regionalized earthquake model to properly estimate subduction‐zone seismicity.
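
A schematic version of the moment-balance idea, assuming a tapered Gutenberg-Richter distribution of seismic moment and invented segment parameters (shear modulus, coupling, convergence rate, dip angle, seismogenic thickness, corner magnitude), is sketched below in Python/SciPy; it illustrates the principle only and is not the SHIFT_GSRM formulation itself.

```python
import numpy as np
from scipy.special import gamma, gammaincc

def moment_from_mw(mw):
    """Seismic moment in N*m from moment magnitude (Hanks & Kanamori)."""
    return 10.0 ** (1.5 * mw + 9.1)

def mean_moment_tapered_gr(m_threshold, m_corner, beta=0.65):
    """Mean moment per event for a tapered Gutenberg-Richter (Pareto) distribution
    with index beta (< 1), threshold moment m_threshold and corner moment m_corner."""
    x = m_threshold / m_corner
    upper_inc_gamma = gammaincc(1.0 - beta, x) * gamma(1.0 - beta)
    return m_threshold + m_threshold ** beta * m_corner ** (1.0 - beta) * np.exp(x) * upper_inc_gamma

# schematic subduction segment (all parameter values invented for illustration)
mu = 3.0e10             # shear modulus, Pa
coupling = 0.5          # coupled fraction of plate convergence
v = 0.06                # convergence rate, m/yr
length = 500e3          # segment length, m
thickness = 40e3        # coupled seismogenic thickness, m
dip = np.radians(20.0)  # interface dip angle
width = thickness / np.sin(dip)

# conservation of moment: tectonic moment rate balanced by the earthquake rate
tectonic_moment_rate = mu * coupling * v * length * width          # N*m per year
rate_m_ge_55 = tectonic_moment_rate / mean_moment_tapered_gr(
    moment_from_mw(5.5), moment_from_mw(8.5))
print(f"expected annual rate of Mw >= 5.5: {rate_m_ge_55:.2f}")
```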


1992 ◽  
Vol 82 (1) ◽  
pp. 104-119
Author(s):  
Michéle Lamarre ◽  
Brent Townshend ◽  
Haresh C. Shah

Abstract This paper describes a methodology for assessing the uncertainty in seismic hazard estimates at particular sites. A variant of the bootstrap statistical method is used to combine the uncertainty due to earthquake catalog incompleteness, earthquake magnitude, and the recurrence and attenuation models used. The uncertainty measure is provided in the form of a confidence interval. The method is applied to various sites in California and compared with previous studies to confirm its validity.
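
A minimal bootstrap sketch (Python/NumPy; the synthetic 60-year catalogue and the choice of resampling unit are assumptions for illustration, not the specific variant used in the paper) shows how a confidence interval on an annual event rate can be obtained by resampling.

```python
import numpy as np

def bootstrap_rate_ci(annual_counts, n_boot=2000, ci=0.90, seed=0):
    """Bootstrap confidence interval for the mean annual event rate at a site,
    resampling whole catalogue years with replacement."""
    rng = np.random.default_rng(seed)
    counts = np.asarray(annual_counts)
    boot_means = rng.choice(counts, size=(n_boot, counts.size), replace=True).mean(axis=1)
    lo, hi = np.percentile(boot_means, [100 * (1 - ci) / 2, 100 * (1 + ci) / 2])
    return counts.mean(), (lo, hi)

# toy 60-year catalogue of annual M >= 5 counts near a hypothetical site
rng = np.random.default_rng(4)
yearly = rng.poisson(0.3, size=60)
mean_rate, (lo, hi) = bootstrap_rate_ci(yearly)
print(f"rate = {mean_rate:.2f}/yr, 90% CI [{lo:.2f}, {hi:.2f}]")
```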

