Smoothed Seismicity
Recently Published Documents

TOTAL DOCUMENTS: 31 (FIVE YEARS: 14)
H-INDEX: 8 (FIVE YEARS: 1)

2021 · Vol 11 (22) · pp. 10899
Author(s): Matteo Taroni, Aybige Akinci

Seismicity-based earthquake forecasting models have been studied and developed extensively over the past twenty years. These models rely mainly on seismicity catalogs as their data source and provide quantifiable forecasts in time, space, and magnitude. In this study, we present a technique to better forecast the locations of future earthquakes based on spatially smoothed seismicity. The main improvement is the use of foreshocks and aftershocks together with their mainshocks. Time-independent earthquake forecast models are often developed from declustered catalogs, in which events smaller than their mainshocks are removed. Declustered catalogs are required in probabilistic seismic hazard analysis (PSHA) to satisfy the Poisson assumption that events are independent in time and space. However, as many recent studies have highlighted, removing such events from seismic catalogs may lead to underestimating seismicity rates and, consequently, the final seismic hazard in terms of ground shaking. Our study also demonstrates that using the complete catalog may improve the spatial forecasting of future earthquakes. To do so, we adopted two different smoothed seismicity methods: (1) the fixed smoothing method, which uses spatially uniform smoothing parameters, and (2) the adaptive smoothing method, which assigns an individual smoothing distance to each earthquake. The smoothed seismicity models are constructed using the global earthquake catalog with Mw ≥ 5.5 events. We compare the smoothed seismicity models by calculating and evaluating their joint log-likelihoods. Our resulting forecasts show a significant information gain for both the fixed and adaptive smoothing models. Our findings indicate that complete catalogs are a notable asset for increasing the spatial skill of seismicity forecasts.
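As a companion to the abstract above, the sketch below illustrates the kind of CSEP-style evaluation it describes: the joint Poisson log-likelihood of a gridded forecast against observed per-cell counts, and the information gain per earthquake of one forecast over another. It is a minimal illustration in Python; the function names, gridding, and normalization are assumptions, not the authors' code.

```python
import numpy as np
from scipy.stats import poisson

def joint_log_likelihood(forecast_rates, observed_counts):
    """Joint Poisson log-likelihood of the observed per-cell counts, given a
    forecast of expected per-cell event numbers (CSEP-style evaluation)."""
    return poisson.logpmf(observed_counts, np.asarray(forecast_rates)).sum()

def information_gain_per_event(forecast_a, forecast_b, observed_counts):
    """Average information gain per earthquake of forecast A over forecast B."""
    n_events = np.sum(observed_counts)
    return (joint_log_likelihood(forecast_a, observed_counts)
            - joint_log_likelihood(forecast_b, observed_counts)) / n_events
```

A positive information gain per event means that, on average, forecast A places higher probability than forecast B on the cells where earthquakes actually occurred.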


2021 · Vol 64 (2)
Author(s): Francesco Visini, Bruno Pace, Carlo Meletti, Warner Marzocchi, Aybige Akinci, ...

In recent years, new approaches for developing earthquake rupture forecasts (ERFs) have been proposed as input for probabilistic seismic hazard assessment (PSHA). Zone-based approaches with seismicity rates derived from earthquake catalogs are commonly used in many countries as the standard for national seismic hazard models. In Italy, a single zone-based ERF is currently the basis for the official seismic hazard model. In this contribution, we present eleven new ERFs, including five zone-based, two smoothed-seismicity-based, two fault-based, and two geodetic-based models, used for a new PSHA model in Italy. The ERFs were tested against observed seismicity and subjected to an elicitation procedure by a panel of PSHA experts to verify the scientific robustness and the consistency of the forecasts with the observations. The tests and the elicitation were used to weight the ERFs. The results show that the new inputs agree well with the seismicity observed over the last few centuries. The entire approach was a first attempt to build a community-based set of ERFs for an Italian PSHA model. The project involved a large number of seismic hazard practitioners, with their knowledge and experience, and the development of different models to capture and explore a large range of epistemic uncertainties in building ERFs; it represents an important step forward for the new national seismic hazard model.


2020 · Vol 63 (6)
Author(s): Maura Murru, Giuseppe Falcone, Matteo Taroni, Rodolfo Console

We develop an ensemble earthquake rate model that provides spatially variable, time-independent (Poisson) long-term annual occurrence rates of seismic events throughout Italy, in magnitude bins of 0.1 units from Mw 4.5 upward and in spatial cells of 0.1° × 0.1°. We weighted the seismic activity rates of smoothed-seismicity and fault-based inputs and merged them into a single ensemble earthquake rupture forecast model. Both inputs adopt a tapered Gutenberg-Richter relation with a single b-value and a single corner magnitude estimated from the earthquake catalogs. The spatially smoothed seismicity was obtained using the classical kernel smoothing method, with magnitude-dependent completeness periods, applied to the historical (CPTI15) and instrumental seismic catalogs. For each seismogenic source provided by the Database of Individual Seismogenic Sources (DISS), we computed the annual rate of events above Mw 4.5, assuming that the seismic moments of the earthquakes generated by each fault are distributed according to the tapered Gutenberg-Richter relation with the same parameters as the smoothed seismicity models. Comparing the annual rates from the catalogs with those from the seismogenic sources, we found good agreement in the Central Apennines, whereas the seismogenic-source rates are higher than the catalog rates in the northeast and south of Italy. We also tested our model against the strong Italian earthquakes (Mw 5.5+) to check whether the total number (N-test) and the spatial distribution (S-test) of these events were compatible with our model, obtaining good results, i.e., high p-values in the tests. The final model will be a branch of the new Italian seismic hazard map.
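The following minimal sketch shows how annual rates per 0.1-magnitude bin can be derived from a tapered Gutenberg-Richter relation of the kind used above. The parameter values, function names, and the Hanks-Kanamori moment conversion are illustrative assumptions, not the calibrated values of the paper.

```python
import numpy as np

def moment(mw):
    """Seismic moment in N*m from moment magnitude (Hanks & Kanamori)."""
    return 10.0 ** (1.5 * mw + 9.05)

def tapered_gr_ccdf(mw, mw_thresh, b_value, corner_mw):
    """Fraction of events with magnitude >= mw under a tapered Gutenberg-Richter
    law, expressed in seismic moment with beta = (2/3) * b."""
    beta = 2.0 / 3.0 * b_value
    m, mt, mc = moment(mw), moment(mw_thresh), moment(corner_mw)
    return (mt / m) ** beta * np.exp((mt - m) / mc)

# Illustrative parameters (not the values calibrated in the paper):
total_rate = 5.0                                  # annual rate of Mw >= 4.5 events
mw_bins = np.arange(4.5, 8.0, 0.1)
ccdf = tapered_gr_ccdf(mw_bins, 4.5, b_value=1.0, corner_mw=7.5)
bin_rates = total_rate * (ccdf[:-1] - ccdf[1:])   # annual rate per 0.1-unit bin
```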


2020 · Vol 224 (2) · pp. 1174-1187
Author(s): Matteo Taroni, Aybige Akinci

The classical procedure of probabilistic seismic hazard analysis (PSHA) requires a Poissonian distribution of earthquakes. Seismic catalogues follow a Poisson distribution only after the application of a declustering algorithm that leaves a single earthquake for each seismic sequence (usually the strongest, i.e. the mainshock). Removing earthquakes from the seismic catalogues leads to underestimation of the annual event rates and, consequently, to lower seismic hazard, as indicated by several studies. In this study, we aim to investigate the performance of two declustering methods on the Italian instrumental catalogue and the impact of declustering on the estimation of the b-value and on the seismic hazard analysis. To this end, the spatial variation in the seismicity rate was first estimated from the declustered catalogues using the adaptive smoothed seismicity approach, considering small earthquakes (Mw ≥ 3.0). We then corrected the seismicity rates using a new approach that counts all events in the complete seismic catalogue by simply changing the magnitude-frequency distribution. The impact of declustering on seismic hazard analysis is illustrated using PSHA maps for Italy in terms of peak ground acceleration and spectral acceleration at 2 s, with 10 per cent and 2 per cent probability of exceedance in 50 yr. We observe that the hazard calculated from the declustered catalogues is always lower than the hazard computed using the complete catalogue. These results agree with previous results obtained in different parts of the world.
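To make the effect of declustering on hazard inputs concrete, the sketch below shows a standard maximum-likelihood b-value estimate (Aki, 1965) and a simple Gutenberg-Richter extrapolation of annual rates above a reference magnitude; applying such a routine to the full and declustered catalogs typically yields lower rates for the declustered one. This is an illustrative outline, not the correction method developed in the paper.

```python
import numpy as np

def b_value_mle(mags, m_complete, dm=0.1):
    """Aki (1965) maximum-likelihood b-value for binned magnitudes >= m_complete."""
    m = np.asarray(mags)
    m = m[m >= m_complete]
    return np.log10(np.e) / (m.mean() - (m_complete - dm / 2.0))

def annual_rate(mags, years, m_ref, m_complete, b):
    """Annual rate of events with magnitude >= m_ref, extrapolated from the
    count above the completeness magnitude via the Gutenberg-Richter relation."""
    n_complete = np.sum(np.asarray(mags) >= m_complete)
    return n_complete / years * 10.0 ** (-b * (m_ref - m_complete))

# Comparing the rate from a declustered catalog with the rate from the full
# catalog over the same window illustrates how declustering lowers the input:
# rate_full = annual_rate(full_mags, 60, 4.5, 3.0, b_full)
# rate_decl = annual_rate(decl_mags, 60, 4.5, 3.0, b_decl)   # typically smaller
```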


2020 · Vol 36 (1_suppl) · pp. 69-90
Author(s): Teraphan Ornthammarath, Pennung Warnitchai, Chung-Han Chan, Yu Wang, Xuhua Shi, ...

We present an evaluation of the 2018 Northern Southeast Asia Seismic Hazard Model (NSAHM18), which is based on a combination of smoothed seismicity, subduction zone, and fault models. The smoothed seismicity is used to model the observed distributed seismicity from largely unknown sources in the study area. In addition, because the instrumental earthquake catalog is short, slip rates and characteristic earthquake magnitudes are incorporated through the fault model. To achieve this, the compiled earthquake catalogs and updated active fault databases for this region were reexamined, with consistent use of these input parameters. To account for epistemic uncertainty, a logic tree analysis was implemented incorporating basic quantities such as ground-motion models (GMMs) for three different tectonic regions (shallow active, subduction interface, and subduction intraslab), maximum magnitude, and magnitude-frequency relationships. The seismic hazard results are presented as peak ground acceleration maps for 475- and 2475-year return periods.
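For readers less familiar with return periods, the standard Poisson relation connects them with the probabilities of exceedance used elsewhere in these abstracts: a 475-year return period corresponds to roughly 10%, and a 2475-year return period to roughly 2%, probability of exceedance in 50 years. A minimal sketch:

```python
import numpy as np

def prob_exceedance(return_period_yr, exposure_yr=50.0):
    """Poisson probability of at least one exceedance during the exposure time."""
    return 1.0 - np.exp(-exposure_yr / return_period_yr)

print(prob_exceedance(475))    # ~0.10  -> "10% in 50 years"
print(prob_exceedance(2475))   # ~0.02  -> "2% in 50 years"
```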


2020
Author(s): Pablo Iturrieta, Danijel Schorlemmer, Fabrice Cotton, José Bayona, Karina Loviknes

In earthquake forecasting, smoothed-seismicity models (SSMs) are based on the assumption that previous earthquakes serve as a guide to future events. Different kernels are used to spatially extrapolate each moment tensor from a seismic catalog into a moment-rate density field. Nevertheless, governing mechanical principles remain absent from the conception of these models, even though crustal stress is responsible for moment release, mainly on pre-existing faults. A recently developed SSM by Hiemer et al., 2013 (SEIFA) incorporates active-fault characterization and deformation rates stochastically, so that a geological estimate of moment release can also be taken into account. Motivated by this approach, we address the question: how representative is the stochastic temporal/spatial averaging of SEIFA of the long-term crustal deformation and stress? In this context, physics-based modeling provides insight into the energy, stress, and strain-rate fields within the crust produced by the discontinuities found therein. In this work, we aim to understand the temporal window that SEIFA requires to mechanically satisfy its underlying assumption of stationarity. We build various SEIFA models from different spatio-temporal subsets of a catalog and compare them with a physics-based model of the long-term seismic energy/moment rate. We then develop a method based on the moment-balance principle and information theory to quantify the spatial similarity between these two types of models. The models are built from two spatially conforming layers of information: a complete seismic catalog and a computerized 3-D geometry of mapped faults along with their long-term slip rates. SEIFA uses both datasets to produce a moment-rate density field, from which a forecast can later be derived. A simple physics-based model, the steady-state Boundary Element Method (BEM), is used as a proof of concept: it uses the 3-D fault geometries and slip rates to calculate the long-term interseismic energy rate and the elastic stress and strain tensors accumulated both along the faults and within the crust. The SHARE European Earthquake Catalog and the European Database of Seismogenic Faults are used as a case study, restricted to crustal faults and to different spatio-temporal subsets of the Italy region in the 1000-2006 time window. The moment-balance principle is analyzed in terms of its spatial distribution by calculating the spatial mutual information (SMI) between the two models as a similarity measure. Finally, by using the SMI as a minimization function, we determine the optimal catalog time window for which the moment rate predicted by the SSM is closest to the geomechanical prediction. We emphasize that, regardless of the usefulness of the stationarity assumption in seismicity forecasting, we provide a simple method that places a physical bound on data-driven seismicity models. This framework may be used in the future to combine seismicity data and geophysical modeling for earthquake forecasting.
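The spatial mutual information mentioned above can be estimated, in its simplest form, from a joint histogram of two co-registered moment-rate fields. The sketch below is a generic estimator with illustrative choices (log-rate values, a fixed number of bins); the authors' exact preprocessing is not specified in the abstract.

```python
import numpy as np

def spatial_mutual_information(field_a, field_b, n_bins=20):
    """Mutual information (in nats) between two co-registered rate fields,
    estimated from a joint histogram of their log cell values."""
    a = np.log10(field_a.ravel() + 1e-20)     # log-rates; epsilon avoids log(0)
    b = np.log10(field_b.ravel() + 1e-20)
    joint, _, _ = np.histogram2d(a, b, bins=n_bins)
    pxy = joint / joint.sum()                 # joint probability of bin pairs
    px = pxy.sum(axis=1, keepdims=True)       # marginal of field A
    py = pxy.sum(axis=0, keepdims=True)       # marginal of field B
    nz = pxy > 0
    return np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz]))
```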


2020
Author(s): Matteo Taroni, Aybige Akinci

In this study we present five- and ten-year time-independent forecasts of M ≥ 5.0 earthquakes in Italy using only seismicity data, without any tectonic, geologic, or geodetic information. Spatially varying earthquake occurrence rates are calculated using an adaptive smoothing kernel (Helmstetter et al., 2007) that defines a unique smoothing distance for each earthquake epicenter from the distance to its n-th nearest neighbor, optimized through the Collaboratory for the Study of Earthquake Predictability (CSEP) likelihood testing methodology (Werner et al., 2007). We modify this adaptive smoothing method to include all earthquakes in the catalog (foreshocks, aftershocks, and events below the completeness magnitude) by multiplying each smoothing kernel by a scaling factor that varies as a function of the completeness magnitude and the number of events in each seismic cluster. Our smoothing philosophy relies on the usefulness of all earthquakes, including those of smaller magnitude, in forecasting future seismicity.
The Italian smoothed seismicity model, which provides the forecast seismicity rates as an expected number of M ≥ 5.0 events per year in each 0.1° × 0.1° grid cell, is constructed using the complete instrumental catalog spanning 1960 to 2019, with a completeness magnitude that decreases with time (from M 4.0 to 1.8). Finally, we compare our model with the observations and with the Italian CSEP experiment models, to check their relative performance, using the official CSEP tests (Taroni et al., 2018). In the present study, the probabilities of occurrence of future large earthquakes in the next 5 and 10 years are calculated under the assumption that earthquake processes have no memory, i.e., the occurrence of a future earthquake is independent of the occurrence of previous earthquakes from the same source (time-independent model).
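The adaptive kernel of Helmstetter et al. (2007) referenced above assigns each epicenter a smoothing distance equal to the distance to its n-th nearest neighboring epicenter. The sketch below outlines that idea with planar distances and a Gaussian kernel for simplicity; the optional `weights` argument is only a placeholder for the completeness/cluster scaling factor described in the abstract, whose exact form is not given here.

```python
import numpy as np

def adaptive_bandwidths(lon, lat, n_neighbor=5):
    """Smoothing distance of each epicenter = distance to its n-th nearest
    neighbor (planar degrees, for illustration only)."""
    pts = np.column_stack([lon, lat])
    d = np.sqrt(((pts[:, None, :] - pts[None, :, :]) ** 2).sum(-1))
    d.sort(axis=1)
    return d[:, n_neighbor]          # column 0 is the zero self-distance

def adaptive_smoothed_density(lon, lat, grid_lon, grid_lat,
                              n_neighbor=5, weights=None):
    """Sum of per-event Gaussian kernels with individual bandwidths; `weights`
    stands in for the completeness/cluster scaling factor described above."""
    h = adaptive_bandwidths(lon, lat, n_neighbor)
    w = np.ones(len(lon)) if weights is None else np.asarray(weights)
    dens = np.zeros(len(grid_lon))
    for x, y, hi, wi in zip(lon, lat, h, w):
        d2 = (grid_lon - x) ** 2 + (grid_lat - y) ** 2
        kern = np.exp(-d2 / (2.0 * hi ** 2))
        dens += wi * kern / kern.sum()
    return dens / dens.sum()         # normalized spatial density on the grid
```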


2020
Author(s): Stefan Weginger, María del Puy Papí Isaba, Yan Jia, Wolfgang Lenhardt

After 25 years, a new seismic hazard map for Austria has been created. The improvements in the probabilistic seismic hazard assessment (PSHA) are based on expanded and updated catalog data with improved depths, source mechanisms, and moment magnitudes. Locally adapted ground-motion prediction equations (GMPEs) were calculated by applying a least-squares adjustment to the local measurements; a neural-network approach was also tested successfully. The final selection was carried out using statistical measures such as the log-likelihood and the Euclidean distance ranking. Established calculation methods, such as Bayesian penalized maximum likelihood and a modified Gutenberg-Richter relation, were used. Uncertainties were taken into account by using the covariance matrix according to Stromeyer (2015). The PSHA approach combines a model of seismic zones (area sources), composed of zones and superzones, a zone-free model (smoothed seismicity), and a model with geological fault zones. A logic tree was used to merge the models, the maximum magnitudes (estimated with the EPRI approach), and the GMPEs. The calculations were carried out with the OpenQuake software framework. The results were compared with the current standard and with the results of neighboring countries; furthermore, the uniform hazard spectra were compared with the new Eurocode draft.
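Merging logic-tree branches, as described above, typically amounts to combining the branch hazard curves with their branch weights into a mean hazard curve. The sketch below is a generic weighted combination with made-up numbers, not the OpenQuake implementation used by the authors.

```python
import numpy as np

def mean_hazard_curve(branch_curves, branch_weights):
    """Weighted mean annual probability of exceedance over logic-tree branches.

    branch_curves : shape (n_branches, n_intensity_levels)
    branch_weights: shape (n_branches,), summing to 1.
    """
    w = np.asarray(branch_weights, dtype=float)
    return (w[:, None] * np.asarray(branch_curves)).sum(axis=0)

# Illustrative example with two branches (weights and curves are made up):
curves = [[1e-2, 3e-3, 5e-4], [2e-2, 4e-3, 8e-4]]
print(mean_hazard_curve(curves, [0.6, 0.4]))
```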

