Improving earthquake doublet rate predictions in ETAS by using modified spatial trigger distributions

Author(s):  
Christian Grimm ◽  
Martin Käser ◽  
Sebastian Hainzl ◽  
Marco Pagani ◽  
Helmut Küchenhoff


Abstract Earthquake sequences add a substantial hazard beyond the solely declustered perspective of common probabilistic seismic hazard analysis. A particularly strong driver of both social and economic losses are so-called earthquake doublets (more generally, multiplets), that is, sequences of two (or more) comparatively large events in spatial and temporal proximity. Without differentiating between foreshocks and aftershocks, we hypothesize three main influencing factors of doublet occurrence: (1) the number of direct and secondary aftershocks triggered by an earthquake; (2) the occurrence of independent clusters and seismic background events in the same time–space window; and (3) the magnitude size distribution of triggered events (in contrast to independent events). We tested synthetic catalogs simulated by a standard epidemic-type aftershock sequence (ETAS) model for both Japan and southern California. Our findings show that the common ETAS approach significantly underestimates doublet frequencies compared with observations in historical catalogs. In addition, the simulated catalogs show smoother spatiotemporal clustering than their observed counterparts. Focusing on the impact on direct aftershock productivity and total cluster sizes, we propose two modifications of the ETAS spatial kernel to improve doublet rate predictions: (a) a restriction of the spatial function to a maximum distance of 2.5 estimated rupture lengths and (b) an anisotropic function with contour lines constructed by a box with two semicircular ends around the estimated rupture segment. These modifications shift triggering potential from weaker to stronger events and consequently improve doublet rate predictions for larger events, although they still underestimate historical doublet occurrence rates. Moreover, the results for the restricted spatial functions better fulfill the empirical Båth’s law for the maximum aftershock magnitude. The tested clustering properties of strong events are not sufficiently captured by typically used global catalog-scale measures, such as log-likelihood values, which would favor the conventional, unrestricted models.
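The restricted isotropic kernel described above can be sketched in a few lines. This is a minimal illustration only: it assumes a common power-law ETAS spatial density and a Wells–Coppersmith-type rupture-length scaling relation, and all parameter values here are generic placeholders, not the fitted values from the study.

```python
import math

def rupture_length_km(mag):
    """Estimated rupture length from a Wells & Coppersmith-type scaling
    relation (coefficients are illustrative assumptions)."""
    return 10 ** (-2.44 + 0.59 * mag)

def spatial_kernel(r_km, mag, d=0.005, q=1.5, gamma=0.5, max_rupture_lengths=2.5):
    """Isotropic power-law ETAS spatial density at epicentral distance r_km
    from a magnitude-mag trigger, truncated at max_rupture_lengths times the
    estimated rupture length (the restriction discussed in the abstract).
    Returns 0 beyond the cutoff; renormalization over the truncated support
    is omitted for brevity."""
    if r_km > max_rupture_lengths * rupture_length_km(mag):
        return 0.0
    sigma = d * 10 ** (gamma * mag)   # magnitude-dependent spatial scale
    return (q - 1) / (math.pi * sigma) * (1 + r_km ** 2 / sigma) ** (-q)
```

Because the truncated density concentrates all triggering weight near the rupture, larger events (with longer ruptures) retain relatively more far-field triggering capacity than small ones, which is the mechanism shifting triggering potential toward stronger events.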


2017 ◽  
Vol 43 (4) ◽  
pp. 1994
Author(s):  
A.C. Astiopoulos ◽  
E. Papadimitriou ◽  
V. Karakostas ◽  
D. Gospodinov ◽  
G. Drakatos

The statistical properties of aftershock occurrence are among the main issues in investigating the earthquake generation process. Seismicity rate changes during a seismic sequence, detected by the application of statistical models, have proved to be precursors of strong events occurring during the seismic excitation. Application of these models provides a tool for assessing the imminent seismic hazard, oftentimes by estimating the expected occurrence rate and comparing the predicted rate with the observed one. The aim of this study is to examine the temporal distribution, and especially the occurrence rate variations, of aftershocks for two seismic sequences, the first near Skyros island in 2001 and the second near Lefkada island in 2003, in order to detect and determine rate changes in connection with the evolution of the seismic activity. Analysis is performed through space–time stochastic models developed on the basis of both aftershock clustering studies and specific assumptions. The models applied are the Modified Omori Formula (MOF), the Epidemic Type Aftershock Sequence (ETAS) model and the Restricted Epidemic Type Aftershock Sequence (RETAS) model. The modelling of seismicity rate changes during the evolution of the particular seismic sequences is then attempted in association with, and as evidence of, static stress changes.
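As a reference for the first of these models, the modified Omori formula gives the aftershock rate as a decaying power law of elapsed time. A minimal sketch follows; the parameter values are illustrative defaults, not values fitted to the Skyros or Lefkada sequences.

```python
def omori_rate(t, K=100.0, c=0.05, p=1.1):
    """Modified Omori formula: aftershock rate n(t) = K / (t + c)**p,
    with t in days after the mainshock (illustrative parameters)."""
    return K / (t + c) ** p

def expected_count(t1, t2, n=10000, **kw):
    """Expected number of aftershocks in the interval [t1, t2] days,
    by midpoint-rule integration of the rate."""
    dt = (t2 - t1) / n
    return sum(omori_rate(t1 + (i + 0.5) * dt, **kw) * dt for i in range(n))
```

Comparing the predicted count from such a rate model with the observed count in a moving window is one simple way rate changes of the kind studied here can be detected.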


2020 ◽  
pp. 875529302095733
Author(s):  
Athanasios N Papadopoulos ◽  
Paolo Bazzurro ◽  
Warner Marzocchi

Probabilistic seismic hazard analysis (PSHA), as a tool to assess the probability that ground motion of a given intensity or larger is experienced at a given site within a given time span, has historically formed the basis of both building design codes in earthquake-prone regions and seismic risk models. PSHA traditionally refers solely to mainshock events and typically employs a homogeneous Poisson process to model their occurrence. Nevertheless, recent disasters, such as the 2010–2011 Christchurch sequence or the 2016 Central Italy earthquakes, to name a few, have highlighted the potential pitfalls of neglecting the occurrence of foreshocks, aftershocks, and other triggered events, and pinpointed the need to revisit the current practice. Herein, we employ the epidemic-type aftershock sequence (ETAS) model to describe seismicity in Central Italy, investigate the model’s capability to reproduce salient features of observed seismicity, and compare ETAS-derived one-year hazard estimates with ones obtained with a standard mainshock-only Poisson-based hazard model. A companion paper uses the hazard models derived herein to compare and contrast loss estimates for the residential exposure of Umbria in Central Italy.
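The homogeneous Poisson occurrence model mentioned above reduces hazard statements to a simple closed form. A quick sketch, using the classic 10%-in-50-years design-code target purely for illustration:

```python
import math

def poisson_exceedance(rate_per_year, t_years):
    """Probability of at least one exceedance in t_years under a
    homogeneous Poisson occurrence model: P = 1 - exp(-rate * t)."""
    return 1.0 - math.exp(-rate_per_year * t_years)

# A 10% chance of exceedance in 50 years corresponds to a mean
# return period of roughly 475 years:
rate = -math.log(1.0 - 0.10) / 50.0   # annual exceedance rate
return_period = 1.0 / rate            # ~475 years
```

The ETAS-based alternative replaces the constant `rate` with a time-varying conditional intensity, which is why its one-year hazard estimates can deviate substantially from the Poisson baseline after a strong event.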


Author(s):  
Leila Mizrahi ◽  
Shyam Nandan ◽  
Stefan Wiemer

Abstract Declustering aims to divide earthquake catalogs into independent events (mainshocks) and dependent (clustered) events, and is an integral component of many seismicity studies, including seismic hazard assessment. We assess the effect of declustering on the frequency–magnitude distribution of mainshocks. In particular, we examine the dependence of the b-value of declustered catalogs on the choice of declustering approach and algorithm-specific parameters. Using the catalog of earthquakes in California since 1980, we show that declustering can decrease the b-value by up to 30% relative to the undeclustered catalog, and that the extent of the reduction is highly dependent on the declustering method and parameters applied. We then reproduce a similar effect by declustering synthetic earthquake catalogs with known b-value, generated using an epidemic-type aftershock sequence model. Our analysis suggests that the observed decrease in b-value must, at least partially, arise from the application of the declustering algorithm to the catalog, rather than from differences in the nature of mainshocks versus fore- or aftershocks. We conclude that declustering should be considered a potential source of bias in seismicity and hazard studies.
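The b-value referred to above is typically estimated by maximum likelihood. A minimal sketch of the standard Aki (1965) estimator with Utsu’s correction for binned magnitudes (the function name and defaults are mine):

```python
import math

def b_value_aki(mags, m_c, dm=0.1):
    """Maximum-likelihood b-value (Aki 1965) for magnitudes at or above
    the completeness magnitude m_c. dm is the magnitude binning width
    (Utsu's correction); pass dm=0 for continuous magnitudes."""
    m = [x for x in mags if x >= m_c]
    mean_m = sum(m) / len(m)
    return math.log10(math.e) / (mean_m - (m_c - dm / 2.0))
```

Because declustering preferentially removes smaller (clustered) events, it raises the mean magnitude of the remaining catalog, which directly lowers the b-value this estimator returns; that is the bias mechanism the study quantifies.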


2019 ◽  
Vol 109 (6) ◽  
pp. 2356-2366 ◽  
Author(s):  
Ganyu Teng ◽  
Jack W. Baker

Abstract This study evaluates the suitability of several declustering methods for induced seismicity and their impact on hazard analysis of the Oklahoma–Kansas region. We considered the methods proposed by Gardner and Knopoff (1974), Reasenberg (1985), and Zaliapin and Ben-Zion (2013), as well as the stochastic declustering method (Zhuang et al., 2002) based on the epidemic-type aftershock sequence (ETAS) model (Ogata, 1988, 1998). The results show that the choice of declustering method has a significant impact on the declustered catalog and the resulting hazard analysis of the Oklahoma–Kansas region. The Gardner and Knopoff method, which is currently implemented in the U.S. Geological Survey one-year seismic-hazard forecast for the central and eastern United States, has unexpected features when used for this induced seismicity catalog. It removes 80% of earthquakes and fails to reflect the changes in background rates that have occurred in the past few years. This results in a slight increase in the hazard level from 2016 to 2017, despite a decrease in seismic activity in 2017. The Gardner and Knopoff method also frequently identifies aftershocks with much stronger shaking intensities than their associated mainshocks. These features are mostly due to the window method it implements. Compared with the Gardner and Knopoff method, the other three methods are able to capture the changing hazard level in the region. However, the ETAS model potentially overestimates the foreshock effect and assigns negligible probabilities to large earthquakes being mainshocks. The Reasenberg and the Zaliapin and Ben-Zion methods perform similarly in catalog declustering and hazard analysis; compared with the ETAS method, they are easier to implement and faster at generating the declustered catalog. The results from this study suggest that both the Reasenberg and the Zaliapin and Ben-Zion declustering methods are suitable for declustering and hazard analysis for induced seismicity in the Oklahoma–Kansas region.
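For reference, the Gardner–Knopoff window method flags as aftershocks all events falling within magnitude-dependent space–time windows of a larger event. A sketch using the fitted window forms commonly tabulated in declustering reviews; treat the coefficients as approximate rather than authoritative.

```python
def gk_windows(mag):
    """Space (km) and time (days) windows of the Gardner-Knopoff (1974)
    declustering method, in the fitted functional form commonly used in
    declustering software (coefficients approximate)."""
    d_km = 10 ** (0.1238 * mag + 0.983)
    if mag >= 6.5:
        t_days = 10 ** (0.032 * mag + 2.7389)
    else:
        t_days = 10 ** (0.5409 * mag - 0.547)
    return d_km, t_days
```

The rapid growth of these windows with magnitude explains the behavior reported above: a single large event can sweep most of a dense induced-seismicity catalog into its window, removing the bulk of events and masking background-rate changes.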


2012 ◽  
Vol 82 (3) ◽  
pp. 216-222 ◽  
Author(s):  
Venkatesh Iyengar ◽  
Ibrahim Elmadfa

The food safety security (FSS) concept is perceived as an early warning system for minimizing food safety (FS) breaches, and it functions in conjunction with existing FS measures. Essentially, the function of FS and FSS measures can be visualized in two parts: (i) the FS preventive measures as actions taken at the stem level, and (ii) the FSS interventions as actions taken at the root level, to enhance the impact of the implemented safety steps. In practice, along with FS, FSS also draws its support from (i) legislative directives and regulatory measures for enforcing verifiable, timely, and effective compliance; (ii) measurement systems in place for sustained quality assurance; and (iii) shared responsibility to ensure cohesion among all the stakeholders, namely policy makers, regulators, food producers, processors and distributors, and consumers. However, the functional framework of FSS differs from that of FS by way of: (i) retooling the vulnerable segments of the preventive features of existing FS measures; (ii) fine-tuning response systems to efficiently preempt FS breaches; (iii) building a long-term nutrient and toxicant surveillance network based on validated measurement systems functioning in real time; (iv) focusing on crisp, clear, and correct communication that resonates among all the stakeholders; and (v) developing interdisciplinary human resources to meet ever-increasing FS challenges. Important determinants of FSS include: (i) strengthening international dialogue for refining regulatory reforms and addressing emerging risks; (ii) developing innovative and strategic action points for intervention (in addition to Hazard Analysis and Critical Control Points (HACCP) procedures); and (iii) introducing additional science-based tools such as metrology-based measurement systems.


2000 ◽  
Vol 151 (12) ◽  
pp. 502-507
Author(s):  
Christian Küchli

Are there any common patterns in the transition from traditional, more or less sustainable forest management to exploitative use, patterns which can regularly be observed both in central Europe and in the countries of the South (e.g. India or Indonesia)? Attempts were made with a time-space model to typify those force fields in which traditional sustainable forest management is undermined and then transformed into a modern type of sustainable forest management. Although it is unlikely that the history of the North will become the future of the South, the glimpse into the northern past offers a useful starting point for understanding the current situation in the South, which in turn could stimulate the debate on development. For instance, the patterns behind the conflicts over forest use in the Himalayas are very similar to those behind the conflicts in the Alps. In the same way, the impact of socio-economic changes on the environment – key word ‹globalisation› – is often much the same. Recognizing comparable patterns can be very valuable because it can stimulate the search for political, legal and technical solutions adapted to a specific situation. For the global community, recognizing the way political-economic alliances work at the head of the ‹globalisation wave› can only mean continuing to seek a common language and understanding at the negotiation tables. On the lee side of the destructive breaker, it is necessary to conserve and care for what has survived. As was the case in Switzerland, these forest islands could one day become the germination points for the genesis of a cultural landscape in which close-to-nature managed forests constitute an essential element.


2020 ◽  
Vol 1 (11) ◽  
pp. 133-140
Author(s):  
E. V. DMITRIEVA

The article considers topical issues of economic support for the development of a regional system protecting the population against various risks. It investigates how the scale of crisis situations affects economic activity in the constituent entities of the Russian Federation, where such situations become a serious barrier to the sustainable development of the country's regions. The growing importance of the risks of economic losses from accidents and disasters at potentially dangerous facilities, resulting from the combined influence of natural, man-made and fire factors, is established. An analysis was carried out and proposals were developed for implementing the key tasks of the state in protecting the population and territories of the country from disasters in order to ensure the stability of the economy. The organizational structure and the division of tasks and functions between officials, crisis management structures and responding units were analyzed, taking into account the reduction of current financial costs. On the basis of a study of the peculiarities of the country's regions, recommendations were formulated for the anti-crisis management bodies on reducing economic damage by preventing crisis situations and ensuring fire safety. It is proposed to deploy in practice a comprehensive automated security system based on modern developments, improving the quality and efficiency of anti-crisis management processes in order to increase economic efficiency. Initial data were compiled for reducing potential threats of a natural, man-made, fire and other nature in the regions using financial and economic mechanisms, and a set of priority measures is proposed to further improve and strengthen the economic support of the anti-crisis management system. The materials of the article can be used in planning the main directions of the development of the regional population security system and in implementing socio-economic development programs.


Author(s):  
Cicilia S. B. Kambey ◽  
Iona Campbell ◽  
Elizabeth J. Cottier-Cook ◽  
Adibi R. M. Nor ◽  
Azhar Kassim ◽  
...  

Abstract The application of biosecurity in seaweed aquaculture plays an important role in reducing the impact of disease and pest outbreaks. The continuous occurrence of seaweed pests, including macroalgal epiphytes, epi-endophytic filamentous algae and biofilms, on Kappaphycus farms may also potentially induce further incidences of the ice-ice syndrome. In this study, on-farm biosecurity management measures were tested on the commercially grown seaweeds Kappaphycus malesianus and Kappaphycus alvarezii during peak ice-ice season at Gallam-Gallam Village, Sabah, Malaysia. The investigation focused on preventative control measures, including the early detection of the ice-ice syndrome and pests through propagule health checks, regular cleaning of the crop thallus and associated long-line ropes, and monitoring of the environment. Farm procedures and practices were also assessed in terms of their biosecurity ‘risk’ using the hazard analysis and critical control point (HACCP) approach. Observations were replicated in two different farm management systems: one adopted routine biosecurity measures and the other had none. The results showed that ice-ice syndrome and pest outbreaks decreased significantly, by 60–75% for K. malesianus and 29–71% for K. alvarezii, at the farm which adopted the routine biosecurity measures compared with the no-biosecurity treatment. The biosecurity measures also significantly improved growth rate and seaweed quality. Infection by the epi-endophyte Melanothamnus sp. contributed to the ice-ice syndrome in K. malesianus, whilst epiphyte coverage was correlated with ice-ice incidence in K. alvarezii. This study provides the first evidence of biosecurity management measures significantly decreasing the incidence of the ice-ice syndrome and pests on a commercial seaweed farm.


Author(s):  
Sheree A Pagsuyoin ◽  
Joost R Santos

Water is a critical natural resource that sustains the productivity of many economic sectors, whether directly or indirectly. Climate change, alongside rapid growth and development, is a threat to water sustainability and regional productivity. In this paper, we develop an extension to the economic input-output model to assess the impact of water supply disruptions on regional economies. The model utilizes the inoperability variable, which measures the extent to which an infrastructure system or economic sector is unable to deliver its intended output. While the inoperability concept has been utilized in previous applications, this paper offers extensions that capture the time-varying nature of inoperability as the sectors recover from a disruptive event, such as drought. The model extension is capable of inserting inoperability adjustments within the drought timeline to capture time-varying likelihoods and severities, as well as the dependencies of various economic sectors on water. The model was applied to case studies of severe drought in two regions: (1) the state of Massachusetts (MA) and (2) the US National Capital Region (NCR). These regions were selected to contrast drought resilience between a mixed urban–rural region (MA) and a highly urban region (NCR). They also have comparable overall gross domestic products despite significant differences in the distribution and share of the economic sectors comprising each region. The results of the case studies indicate that in both regions the utility and real estate sectors suffer the largest economic losses; nonetheless, the results also identify region-specific sectors that incur significant losses. For the NCR, three sectors in the top 10 ranking of highest economic losses are government-related, whereas for MA, four sectors in the top 10 are manufacturing sectors. Furthermore, the accommodation sector also appears among the highest-loss sectors in the NCR case, intuitively because of the region's high concentration of museums and famous landmarks. In contrast, the Wholesale Trade sector was among the sectors with the highest economic losses in the MA case study because of the state's large geographic size, conducive to warehouses used as nodes for large-scale supply chain networks. Future modeling extensions could include analysis of water demand and supply management strategies that can enhance regional resilience against droughts, and other regional case studies could analyze categories of drought severity beyond those featured in this paper.
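The core of the inoperability extension to the input-output model can be illustrated in a few lines. This is a minimal static sketch with a hypothetical three-sector economy; the matrix and perturbation values are invented for illustration and are not the MA or NCR data.

```python
import numpy as np

# Normalized interdependency matrix A*: entry (i, j) is how strongly
# sector j's inoperability propagates to sector i (hypothetical values).
A_star = np.array([[0.10, 0.20, 0.05],
                   [0.15, 0.05, 0.10],
                   [0.05, 0.10, 0.15]])

# Demand-side perturbation c* from the disruption, e.g. a drought
# curtailing demand in water-dependent sectors (hypothetical values).
c_star = np.array([0.20, 0.00, 0.05])

# Equilibrium inoperability q = (I - A*)^-1 c*: the fraction of its
# intended output each sector fails to deliver.
q = np.linalg.solve(np.eye(3) - A_star, c_star)
```

The time-varying behavior described in the abstract enters when the equilibrium solve is replaced by a recovery dynamic, e.g. an update of the form q(t+1) = q(t) + K[A* q(t) + c*(t) − q(t)] with a sectoral resilience matrix K, so that c*(t) can follow the drought timeline.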

