Exploring probabilistic seismic risk assessment accounting for seismicity clustering and damage accumulation: Part I. Hazard analysis

2020 ◽  
pp. 875529302095733
Author(s):  
Athanasios N Papadopoulos ◽  
Paolo Bazzurro ◽  
Warner Marzocchi

Probabilistic seismic hazard analysis (PSHA), a tool to assess the probability that ground motion of a given intensity or larger is experienced at a given site within a given time span, has historically formed the basis of both building design codes in earthquake-prone regions and seismic risk models. PSHA traditionally considers only mainshock events and typically employs a homogeneous Poisson process to model their occurrence. Nevertheless, recent disasters, such as the 2010–2011 Christchurch sequence or the 2016 Central Italy earthquakes, have highlighted the potential pitfalls of neglecting the occurrence of foreshocks, aftershocks, and other triggered events, and have pinpointed the need to revisit current practice. Herein, we employ the epidemic-type aftershock sequence (ETAS) model to describe seismicity in Central Italy, investigate the model’s capability to reproduce salient features of observed seismicity, and compare ETAS-derived one-year hazard estimates with those obtained with a standard mainshock-only Poisson-based hazard model. A companion paper uses the hazard models derived herein to compare and contrast loss estimates for the residential exposure of Umbria in Central Italy.
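
As a quick illustration of the Poisson assumption discussed above (a textbook relation, not taken from the paper): if λ(im) is the mean annual rate at which a ground-motion intensity measure exceeds a level im at the site, the homogeneous Poisson model gives the probability of at least one exceedance in t years as

    P(\text{at least one exceedance in } t \text{ years}) = 1 - e^{-\lambda(im)\, t}

so, for example, the conventional design target of a 10% exceedance probability in 50 years corresponds to λ(im) ≈ 0.0021 per year, i.e. the familiar 475-year return period.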

2020 ◽  
pp. 875529302093881
Author(s):  
Athanasios N Papadopoulos ◽  
Paolo Bazzurro

In current practice, most earthquake risk models adopt a “declustered” view of seismicity, that is, they disregard foreshocks, aftershocks, and triggered earthquakes and model seismicity as a series of independent mainshock events whose occurrence (typically) conforms to a Poisson process. This practice is certainly disputable but has been justified by the false notion that earthquakes of smaller magnitude than the mainshock cannot cause damage beyond that already inflicted by the mainshock itself. A companion paper makes use of the epidemic-type aftershock sequence (ETAS) model fitted to Central Italy seismicity data to describe the full earthquake occurrence process, including “dependent” earthquakes. Herein, loss estimates for the region of Umbria in Central Italy are derived using stochastic event catalogs generated by means of the ETAS model and damage-dependent fragility functions to track damage accumulation. The results are then compared with estimates obtained with a conventional Poisson-based model. The potential gains of utilizing a model capable of capturing the spatiotemporal clustering features of seismicity are illustrated, along with a discussion of the various details and challenges of such considerations.
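
To make the damage-accumulation idea concrete, here is a minimal sketch, not the authors’ implementation, of how damage-state-dependent fragility curves can track a building through a clustered sequence; all medians, dispersions, and intensity values are illustrative assumptions:

    # A minimal sketch, not the authors' implementation, of damage accumulation
    # over an event sequence using damage-state-dependent lognormal fragility
    # curves. All medians, dispersions, and intensities are illustrative.
    import numpy as np
    from scipy.stats import lognorm

    rng = np.random.default_rng(0)
    BETA = 0.6  # assumed lognormal dispersion of every fragility curve

    # Assumed medians (PGA in g) of the curves for reaching DS1..DS3, indexed by
    # the current damage state; already-damaged buildings are more fragile.
    MEDIANS = {
        0: [0.30, 0.60, 1.00],
        1: [None, 0.50, 0.90],
        2: [None, None, 0.75],
        3: [None, None, None],  # DS3 treated as absorbing (e.g. collapse)
    }

    def sample_state(state, pga):
        """Sample the post-event damage state given pre-event state and PGA."""
        p = [lognorm.cdf(pga, s=BETA, scale=m) if m else 0.0
             for m in MEDIANS[state]]
        u = rng.random()  # one uniform draw keeps the nested curves consistent
        for ds, p_ds in enumerate(p, start=1):
            if u < p_ds:
                state = max(state, ds)  # deepest exceeded state governs
        return state

    # One building shaken by a clustered sequence of ground motions (PGA in g):
    state = 0
    for pga in [0.25, 0.40, 0.15, 0.55]:
        state = sample_state(state, pga)
    print("final damage state:", state)

Because the curves shift to lower medians after each damage increment, a sequence of moderate shocks can push the building into a state that no single event would have reached.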


2015 ◽  
Vol 57 (6) ◽  
Author(s):  
Maura Murru ◽  
Jiancang Zhuang ◽  
Rodolfo Console ◽  
Giuseppe Falcone

In this paper, we compare the forecasting performance of several statistical models used to describe the earthquake occurrence process, applied to short-term earthquake probabilities during the 2009 L’Aquila earthquake sequence in central Italy. These models include the Proximity to Past Earthquakes (PPE) model and two versions of the Epidemic Type Aftershock Sequence (ETAS) model. We used the information gains corresponding to the Poisson and binomial scores to evaluate the performance of these models. Both ETAS models work better than the PPE model. However, in comparing the two types of ETAS models, the one with the same fixed exponent coefficient (α = 2.3) for both the productivity function and the scaling factor in the spatial response function (ETAS I) performs better in forecasting the active aftershock sequence than the model with different exponent coefficients (ETAS II) when the Poisson score is adopted. ETAS II performs better when a lower magnitude threshold of 2.0 and the binomial score are used. The reason is that the catalog does not contain an event of magnitude similar to the L’Aquila mainshock (Mw 6.3) in the training period (April 16, 2005 to March 15, 2009), so the α-value is underestimated and the forecast seismicity is consequently underestimated when the productivity function is extrapolated to high magnitudes. We also investigate the effect of including small events in forecasting larger events. These results suggest that the training catalog used for estimating the model parameters should include earthquakes of magnitudes similar to the mainshock when forecasting seismicity during an aftershock sequence.
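
As a back-of-envelope illustration (ours, not the paper’s computation) of how an underestimated α suppresses extrapolated productivity, using the standard ETAS productivity law κ(m) = A·exp(α(m − m₀)) with placeholder A and m₀:

    # Back-of-envelope illustration (not the paper's computation) of why an
    # underestimated alpha suppresses forecast aftershock productivity when
    # extrapolated to the Mw 6.3 mainshock. A and m0 are placeholders.
    import math

    A, m0, m_main = 1.0, 2.0, 6.3
    for alpha in (2.3, 1.7):  # fixed value vs a hypothetical underestimate
        kappa = A * math.exp(alpha * (m_main - m0))
        print(f"alpha={alpha}: expected direct aftershocks ~ {kappa:,.0f}")
    # The ratio exp((2.3 - 1.7) * 4.3) ~ 13 shows the size of the rate deficit.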


2016 ◽  
Vol 59 ◽  
Author(s):  
L. Peruzza ◽  
R. Gee ◽  
B. Pace ◽  
G. Roberts ◽  
O. Scotti ◽  
...  

We perform aftershock probabilistic seismic hazard analysis (APSHA) of the ongoing aftershock sequence following the August 24th, 2016 Amatrice, Central Italy earthquake. APSHA is a time-dependent PSHA calculation in which earthquake occurrence rates decrease after the mainshock following an Omori-type decay. In this paper we propose a fault source model based on preliminary evidence of the complex fault geometry associated with the mainshock. We then explore the possibility that the aftershock seismicity is distributed either uniformly or non-uniformly across the fault source. The hazard results are computed for short-to-intermediate exposure periods (1-3 months, 1 year), compared to the background hazard, and intended to be useful for post-earthquake safety evaluation.
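
For readers unfamiliar with the Omori-type decay invoked above, here is a minimal sketch (placeholder parameters, not the values estimated for the Amatrice sequence) of how expected aftershock counts over the exposure periods follow from the modified Omori law n(t) = K/(t + c)^p:

    # Minimal sketch of the Omori-type decay underlying APSHA: the modified
    # Omori (Omori-Utsu) law n(t) = K / (t + c)**p, integrated over an exposure
    # window. K, c, p are placeholders, not the paper's estimates.
    import math

    def omori_count(t1, t2, K=100.0, c=0.05, p=1.1):
        """Expected aftershocks between t1 and t2 days after the mainshock."""
        if p == 1.0:
            return K * (math.log(t2 + c) - math.log(t1 + c))
        return K * ((t2 + c) ** (1 - p) - (t1 + c) ** (1 - p)) / (1 - p)

    for label, (t1, t2) in {"first month": (0, 30), "months 2-3": (30, 90),
                            "first year": (0, 365)}.items():
        print(f"{label}: {omori_count(t1, t2):.0f} expected events")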


2021 ◽  
Author(s):  
Christian Grimm ◽  
Martin Käser ◽  
Sebastian Hainzl ◽  
Marco Pagani ◽  
Helmut Küchenhoff

Earthquake sequences add significant hazard beyond the solely declustered perspective of common probabilistic seismic hazard analysis (PSHA). Particularly strong drivers of both social and economic losses are so-called earthquake doublets (more generally multiplets), i.e. sequences of two (or more) comparatively large events in spatial and temporal proximity. Not differentiating between foreshocks and aftershocks, we hypothesize three main drivers of doublet occurrence: (1) the number of direct aftershocks triggered by an earthquake; (2) the underlying, independent background seismicity in the same time-space window; and (3) the magnitude size distribution of triggered events (in contrast to independent events). We tested synthetic catalogs simulated by a common, isotropic epidemic type aftershock sequence (ETAS) model for both Japan and Southern California. Our findings show that the standard ETAS approach dramatically underestimates doublet frequencies compared to observations in historical catalogs. In addition, the simulated catalogs partially smooth out the pronounced temporal and spatial peaks of event clustering seen in observations. Focusing on the impact on direct aftershock productivity, we propose two modifications of the ETAS spatial kernel in order to improve doublet rate predictions: (a) a restriction of the spatial function to a maximum distance of 2.5 estimated rupture lengths; (b) an anisotropic function with contour lines constructed by a box with two semicircular ends around the estimated rupture line. The restriction of the spatial extent shifts triggering potential from weaker to stronger events and consequently improves doublet rate predictions for larger events. However, this improvement comes at the cost of a weaker overall model fit according to AIC. The anisotropic models improve the overall model fit but have minor impact on doublet occurrence rate predictions.
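
A minimal sketch of modification (a), under assumptions: an isotropic power-law spatial kernel zeroed beyond 2.5 rupture lengths, with a Wells-and-Coppersmith-style magnitude-to-rupture-length scaling; d, q, and gamma are placeholder ETAS parameters, not the fitted values.

    # Sketch of the proposed spatial-kernel restriction; all parameter values
    # and the exact kernel form are assumptions, not the paper's estimates.
    import math

    def rupture_length_km(m):
        return 10 ** (-2.44 + 0.59 * m)  # assumed subsurface-rupture scaling

    def spatial_density(r, m, d=0.005, q=1.8, gamma=0.6):
        """Unnormalized isotropic power-law kernel, cut at 2.5 rupture lengths."""
        if r > 2.5 * rupture_length_km(m):
            return 0.0                         # the proposed restriction
        sigma = d * math.exp(gamma * (m - 4.0))  # assumed magnitude-scaled width
        return (r ** 2 + sigma) ** (-q)

    print(spatial_density(5.0, 6.5), spatial_density(80.0, 6.5))  # inside vs cut

The cutoff reallocates triggering probability that the unrestricted power law would otherwise spread thinly over large distances, which is what shifts productivity toward the near field of strong events.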


2019 ◽  
Vol 109 (6) ◽  
pp. 2356-2366 ◽  
Author(s):  
Ganyu Teng ◽  
Jack W. Baker

Abstract This study evaluates the suitability of several declustering methods for induced seismicity and their impacts on hazard analysis of the Oklahoma–Kansas region. We considered the methods proposed by Gardner and Knopoff (1974), Reasenberg (1985), and Zaliapin and Ben‐Zion (2013), as well as the stochastic declustering method (Zhuang et al., 2002) based on the epidemic‐type aftershock sequence (ETAS) model (Ogata, 1988, 1998). The results show that the choice of declustering method has a significant impact on the declustered catalog and the resulting hazard analysis of the region. The Gardner and Knopoff method, which is currently implemented in the U.S. Geological Survey one‐year seismic‐hazard forecast for the central and eastern United States, has unexpected features when used for this induced-seismicity catalog. It removes 80% of earthquakes and fails to reflect the changes in background rates that have occurred in the past few years. This results in a slight increase in the hazard level from 2016 to 2017, despite a decrease in seismic activity in 2017. The Gardner and Knopoff method also frequently identifies aftershocks with much stronger shaking intensities than their associated mainshocks. These features are mostly due to the window method implemented in the Gardner and Knopoff approach. Compared with the Gardner and Knopoff method, the other three methods are able to capture the changing hazard level in the region. However, the ETAS model potentially overestimates the foreshock effect and generates negligible probabilities of large earthquakes being mainshocks. The Reasenberg and Zaliapin and Ben‐Zion methods perform similarly in catalog declustering and hazard analysis. Compared with the ETAS method, these two methods are easier to implement and faster at generating the declustered catalog. The results from this study suggest that both the Reasenberg and the Zaliapin and Ben‐Zion declustering methods are suitable for declustering and hazard analysis for induced seismicity in the Oklahoma–Kansas region.
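
For context, a simplified sketch of the window method underlying Gardner and Knopoff (1974) declustering, using the commonly quoted fitted window sizes; this is an illustration, not the USGS implementation:

    # Simplified window-method declustering in the spirit of Gardner and
    # Knopoff (1974); window formulas are the commonly quoted fits and the
    # linking logic is one of several variants.
    import math

    def gk_windows(m):
        """Space (km) and time (days) windows for a magnitude-m mainshock."""
        dist_km = 10 ** (0.1238 * m + 0.983)
        days = 10 ** (0.032 * m + 2.7389) if m >= 6.5 \
            else 10 ** (0.5409 * m - 0.547)
        return dist_km, days

    def decluster(catalog):
        """catalog: list of (time_days, x_km, y_km, mag). Returns mainshocks."""
        events = sorted(catalog, key=lambda e: -e[3])  # largest magnitude first
        kept, removed = [], set()
        for i, (t, x, y, m) in enumerate(events):
            if i in removed:
                continue
            kept.append(events[i])
            dist_km, days = gk_windows(m)
            for j, (t2, x2, y2, m2) in enumerate(events):
                if j == i or j in removed:
                    continue
                if abs(t2 - t) <= days and math.hypot(x2 - x, y2 - y) <= dist_km:
                    removed.add(j)  # flagged as fore/aftershock of larger event
        return kept

Because linking depends only on magnitude, time, and epicentral distance, a removed “aftershock” can easily produce stronger shaking at a nearby site than its larger but more distant “mainshock”, one route to the feature noted above.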


Author(s):  
Christian Grimm ◽  
Martin Käser ◽  
Sebastian Hainzl ◽  
Marco Pagani ◽  
Helmut Küchenhoff

ABSTRACT Earthquake sequences add a substantial hazard beyond the solely declustered perspective of common probabilistic seismic hazard analysis. Particularly strong drivers of both social and economic losses are so-called earthquake doublets (more generally multiplets), that is, sequences of two (or more) comparatively large events in spatial and temporal proximity. Without differentiating between foreshocks and aftershocks, we hypothesize three main influencing factors of doublet occurrence: (1) the number of direct and secondary aftershocks triggered by an earthquake; (2) the occurrence of independent clusters and seismic background events in the same time–space window; and (3) the magnitude size distribution of triggered events (in contrast to independent events). We tested synthetic catalogs simulated by a standard epidemic-type aftershock sequence (ETAS) model for both Japan and southern California. Our findings show that the common ETAS approach significantly underestimates doublet frequencies compared with observations in historical catalogs. In addition, the simulated catalogs show smoother spatiotemporal clustering than their observed counterparts. Focusing on the impact on direct aftershock productivity and total cluster sizes, we propose two modifications of the ETAS spatial kernel to improve doublet rate predictions: (a) a restriction of the spatial function to a maximum distance of 2.5 estimated rupture lengths and (b) an anisotropic function with contour lines constructed by a box with two semicircular ends around the estimated rupture segment. These modifications shift the triggering potential from weaker to stronger events and consequently improve doublet rate predictions for larger events, although they still underestimate historic doublet occurrence rates. In addition, the restricted spatial functions better fulfill the empirical Båth’s law for the maximum aftershock magnitude. The tested clustering properties of strong events are not sufficiently captured by the global, catalog-scale measures typically used for model selection, such as log-likelihood values, which would favor the conventional, unrestricted models.
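
The Båth’s-law check mentioned above is compact to state: empirically, the largest aftershock is on average about 1.2 magnitude units below its mainshock. A toy sketch of the comparison (the cluster data here are placeholders, not the studied catalogs):

    # Toy Bath's-law check: mean gap between each mainshock magnitude and its
    # largest aftershock; the empirical law puts this near 1.2. Data are fake.
    import statistics

    clusters = [  # (mainshock magnitude, magnitudes of its aftershocks)
        (7.1, [5.8, 6.0, 5.2]),
        (6.6, [5.5, 4.9]),
        (6.9, [5.6, 5.9, 5.1, 4.8]),
    ]
    dm = [m_main - max(aft) for m_main, aft in clusters if aft]
    print(f"mean mainshock-to-largest-aftershock gap: {statistics.mean(dm):.2f}")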


Author(s):  
Hernan Tesler-Mabe

As recently as one year ago, the European Union was seemingly on a direct path toward its avowed goal of "ever closer union." In numerous publications, EU authorities asserted that they had the confidence of European peoples desirous only of further integration. In the wake of the failed referenda for a European Constitution, however, enthusiasts of European Union can no longer be certain that their enterprise will succeed. The European Union, once strong and united, seems now an entity teetering on the edge of collapse. The reasons for such a dramatic shift are, of course, wide-ranging. Yet I would suggest that a great part of the general European disillusionment with European Union has come about as a result of the actions of the Europeanists themselves. Over the last decades, European officials have exhibited a frightfully high incidence of revisionism in their literature. This practice, I argue, has caused many Europeans to question the integrity of the project of European Union. For my presentation, I intend to undertake a close study of a selection of documents published by the European Communities. In this endeavour, I will compare and contrast the messages imparted in different editions of these works and consider the semiotic significance of the textual and non-textual language appearing therein. In this manner, I hope to achieve two aims. First, I mean to add a corrective element to a literature that, guided by a teleological interpretation of integration, endows integration with a "logic" to be found only in hindsight. Second, I intend to examine the many meanings that the EU has had over its history and assess how closely policy has adhered to the ideological goals of prominent Europeanists. In sum, I hope to shed light on the fundamental disconnect between advocates of Europe and the "man on the street" and help establish a dialogue which may contribute to resolving the current impasse within the European Union. Full text available: https://doi.org/10.22215/rera.v2i4.178


2010 ◽  
Vol 27 (1-2) ◽  
Author(s):  
M. CAPUTO ◽  
V. I. KEILIS-BOROK ◽  
T. I. KRONROD ◽  
G. M. MOLCHAN ◽  
G. F. PANZA ◽  
...  

The estimation of seismic risk is made for three types of objects in central Italy, considering three kinds of models: 1) λ(M, g): the intensity of the Poisson flow of earthquakes, M being the magnitude and g the hypocentre; 2) I(g₀, g, M): giving the distribution of intensity on the surface for a single earthquake (g, M), g being the epicentre; 3) x(g₀, I): giving the effect x of shakings of intensity I, g₀ being the position of the object. For actual decision-making, additional computations may be necessary in order to estimate how our results are influenced by the errors in these models. However, practical decisions can be made on the basis of these data, because experience shows that the results are normally exaggerated.
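
Read together, and in a schematic notation of ours rather than necessarily the authors’, the three components chain into a single risk integral for the expected effect at an object located at g₀:

    E[x(g_0)] = \iint \lambda(M, g)\; x\big(g_0,\, I(g_0, g, M)\big)\; dM\; dg

i.e. the earthquake-rate model (1) is propagated through the intensity-attenuation model (2) and the vulnerability model (3), integrated over all magnitudes and source locations.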


Agronomy ◽  
2021 ◽  
Vol 11 (8) ◽  
pp. 1643
Author(s):  
Davide Neri ◽  
Serena Polverigiani ◽  
Matteo Zucchini ◽  
Veronica Giorgi ◽  
Fabio Marchionni ◽  
...  

A living mulch system can provide beneficial biodiversified phytocoenoses and spatial competition against weeds; however, it may also compete for water with the main cultivated crop under Mediterranean climate conditions. Strawberries employed as living mulch in a rain-fed hill vineyard of central Italy were evaluated for two years through a participative approach involving the farmer. A local wild strawberry was propagated by stolons to obtain small plantlets easily uprooted after the summer and then transplanted to a one-year-old vineyard. Densities of two and four strawberry plants per grapevine were compared with no living mulch in a randomized complete block design. A horizontal blade weeder was used once a year in all treatments. The results showed that strawberries as living mulch, combined with the blade weeder, avoided the need for further soil tillage and assured full soil cover during winter at both planting densities. The strawberry living mulch did not alter grapevine transpiration during a summer water-stress event. Moreover, the yield per vine and the grape quality were comparable with those of the soil without living mulch. The strawberry mulch grew substantially in the area surrounding the vines. Furthermore, the living mulch guaranteed constant soil cover, reducing the risk of soil erosion while increasing the vineyard’s biological diversity, which may imply higher resilience.


Author(s):  
Edward H. Field ◽  
Kevin R. Milner ◽  
Nicolas Luco

ABSTRACT We use the Third Uniform California Earthquake Rupture Forecast (UCERF3) epidemic-type aftershock sequence (ETAS) model (UCERF3-ETAS) to evaluate the effects of declustering and Poisson assumptions on seismic hazard estimates. Although declustering is necessary to infer the long-term spatial distribution of earthquake rates, the question is whether it is also necessary to honor the Poisson assumption in classic probabilistic seismic hazard assessment. We use 500,000 yr, M ≥ 2.5 synthetic catalogs to address this question, for which UCERF3-ETAS exhibits realistic spatiotemporal clustering effects (e.g., aftershocks). We find that Gardner and Knopoff (1974) declustering, used in the U.S. Geological Survey seismic hazard models, lowers 2%-in-50 yr and risk-targeted ground-motion hazard metrics by about 4% on average (compared with the full time-dependent [TD] model), with the reduction being 5% for 40%-in-50 yr ground motions. Keeping all earthquakes and treating them as a Poisson process increases these same hazard metrics by about 3%–12% on average, due to the removal of relatively quiet time periods in the full TD model. In the interest of model simplification, bias minimization, and consideration of the probabilities of multiple exceedances, we agree with others (Marzocchi and Taroni, 2014) that we are better off keeping aftershocks and treating them as a Poisson process rather than removing them from hazard consideration via declustering. Honoring the true time dependence, however, will likely be important for other hazard and risk metrics, and this study further exemplifies how this can now be evaluated more extensively.
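
As a sketch of the kind of comparison described above (not the UCERF3-ETAS code; a placeholder overdispersed count model stands in for simulated site shaking exceedances), one can contrast the empirical 50-year exceedance probability of a clustered catalog with its full-catalog Poisson approximation:

    # Minimal sketch, assuming a negative-binomial (clustered) stand-in for the
    # annual number of site ground-motion exceedances; not the UCERF3-ETAS code.
    import numpy as np

    rng = np.random.default_rng(1)
    years, mean_rate, n_shape = 500_000, 0.01, 0.05
    # Overdispersed annual exceedance counts (variance > mean mimics clustering).
    annual = rng.negative_binomial(n=n_shape, p=n_shape / (n_shape + mean_rate),
                                   size=years)

    windows = annual.reshape(-1, 50).sum(axis=1)  # non-overlapping 50-yr windows
    p_empirical = (windows > 0).mean()            # honors temporal clustering
    p_poisson = 1 - np.exp(-annual.mean() * 50)   # full-catalog Poisson estimate
    print(f"empirical: {p_empirical:.3f}   Poisson: {p_poisson:.3f}")

Because clustering concentrates exceedances into fewer windows, the Poisson estimate comes out higher, consistent in direction with the 3%–12% increase reported above.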

