Delayed Seismicity Rate Changes Controlled by Static Stress Transfer

2017 ◽ Vol 122 (10) ◽ pp. 7951-7965
Author(s): Kayla A. Kroll, Keith B. Richards-Dinger, James H. Dieterich, Elizabeth S. Cochran

2017 ◽ Vol 43 (4) ◽ pp. 2093
Author(s): K. M. Leptokaropoulos, E. E. Papadimitriou, B. Orlecka-Sikora, V. G. Karakostas

The northern Aegean region has suffered several strong earthquakes since the beginning of the 20th century, causing extensive damage and loss of life. Several studies have contributed to seismic hazard assessment in the area, among them studies of the Coulomb stress changes produced by the coseismic slip of major earthquakes together with the constant tectonic loading on the major regional faults. The aim of the present study is to evaluate whether seismicity rate changes between 1964 and 2008 are associated with changes in the stress field. For this purpose the strongest events (Mw > 5.8) of this period were considered, and their contribution to the evolution of the stress field was investigated by calculations performed just before and just after their occurrence. This influence was then examined in connection with the occurrence rate of smaller events (Mw > 3.8) over the corresponding time intervals. After defining the probability density function (PDF) of the distribution of the smaller events, a rate-and-state model was used to relate the static stress changes to the seismicity rate and to compare the observed with the expected seismicity rate for each time period.
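As a minimal illustration of the rate-and-state approach mentioned above, the sketch below evaluates Dieterich's (1994) expression for the seismicity rate following a static Coulomb stress step. The parameter values (Aσ, background stressing rate, reference rate) are illustrative assumptions, not those adopted in the study.

```python
import numpy as np

def dieterich_rate(t, dcff, a_sigma=0.05, tau_dot=0.01, r_background=1.0):
    """Expected seismicity rate R(t) after a static Coulomb stress step,
    following Dieterich's (1994) rate-and-state formulation.

    t            : time since the stress step (years)
    dcff         : Coulomb failure stress change (MPa)
    a_sigma      : constitutive parameter A * normal stress (MPa) -- illustrative
    tau_dot      : background stressing rate (MPa/yr)             -- illustrative
    r_background : reference (background) seismicity rate
    """
    t_a = a_sigma / tau_dot                      # aftershock relaxation time
    gamma = (np.exp(-dcff / a_sigma) - 1.0) * np.exp(-t / t_a)
    return r_background / (1.0 + gamma)

# Example: rate 0.1 yr after a +0.1 MPa stress increase
print(dieterich_rate(t=0.1, dcff=0.1))
```

A positive stress step raises the rate immediately (by exp(ΔCFF/Aσ) at t = 0), and the perturbation decays back to the background rate over the aftershock duration t_a = Aσ/τ̇.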


2021 ◽ Vol 73 (1)
Author(s): Kodai Nakagomi, Toshiko Terakawa, Satoshi Matsumoto, Shinichiro Horikawa

An amendment to this paper has been published and can be accessed via the original article.


1982 ◽ Vol 72 (1) ◽ pp. 93-111
Author(s): R. E. Habermann

Abstract. Changes in the rate of occurrence of smaller events have been recognized in the rupture zones of upcoming large earthquakes in several postearthquake studies and one preearthquake study. A data set in which a constant portion of the events in any magnitude band is consistently reported through time is crucial for the recognition of seismicity rate changes which are real (related to some process change in the earth). Such a data set is termed a homogeneous data set. The consistency of reporting of earthquakes in the NOAA Hypocenter Data File (HDF) since 1963 is evaluated by examining the cumulative number of events reported as a function of time for the entire world in eight magnitude bands. It is assumed that the rate of occurrence of events in the entire world is roughly constant on the time scale examined here because of the great size of the worldwide earthquake production system. The rate of reporting of events with magnitudes above mb = 4.5 has been constant or increasing since 1963. Significant decreases in the number of events reported per month in the magnitude bands below mb = 4.4 occurred during 1968 and 1976. These decreases are interpreted as indications of decreases in detection of events for two reasons. First, they occur at times of constant rates of occurrence and reporting of larger events. Second, the decrease during the late 1960s has also been recognized in the teleseismic data reported by the International Seismological Centre (ISC). This suggests that the decrease in the number of small events reported was related to facets of the earthquake reporting system which the ISC and NOAA share. The most obvious candidate is the detection system. During 1968, detection decreased in the United States, Central and South America, and portions of the South Pacific. This decrease is probably due to the closure of the VELA arrays, BMO, TFO, CPO, UBO, and WMO. During 1976, detection decreased in most of the seismically active regions of the western hemisphere, as well as in the region between Kamchatka and Guam. The cause of this detection decrease is unclear. These detection decreases seriously limit the amount of homogeneous background period available for the study of teleseismic seismicity rate changes. If events below the minimum magnitude of homogeneity are eliminated from the teleseismic data sets, the resulting small numbers of events render many regions unsuitable for study. Many authors have reported seismicity rate decreases as possible precursors to great earthquakes. Few of these authors have considered detection decreases as possible explanations for their results. This analysis indicates that such considerations cannot be avoided in studies of teleseismic data.
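The reporting-consistency check described above amounts to counting reported events per month in fixed magnitude bands and looking for kinks in the cumulative curves. A minimal sketch follows; the catalog columns and band edges are hypothetical, and real HDF or ISC files require their own parsers.

```python
import numpy as np
import pandas as pd

# Hypothetical catalog with columns "time" (datetime) and "mb" (body-wave magnitude)
catalog = pd.DataFrame({
    "time": pd.to_datetime(["1964-01-10", "1964-02-03", "1968-07-21", "1976-05-30"]),
    "mb":   [4.1, 4.6, 3.9, 4.3],
})

# Illustrative 0.3-unit magnitude bands (the study itself uses eight bands)
bands = [(3.8, 4.1), (4.1, 4.4), (4.4, 4.7), (4.7, 5.0)]

indexed = catalog.set_index("time")
for lo, hi in bands:
    in_band = indexed[(indexed["mb"] >= lo) & (indexed["mb"] < hi)]
    counts = in_band.resample("MS").size()    # events reported per month
    cumulative = counts.cumsum()              # kinks mark reporting-rate changes
    print(f"mb {lo:.1f}-{hi:.1f}:", cumulative.to_dict())
```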


2020 ◽ Vol 110 (2) ◽ pp. 863-873
Author(s): Margarita Segou, Tom Parsons

Abstract. Coseismic stress changes have been the primary physical principle used to explain aftershocks and triggered earthquakes. However, this method does not adequately forecast earthquake rates and diverse rupture populations when subjected to formal testing. We show that earthquake forecasts can be impaired by assumptions made in physics-based models, such as the existence of hypothetical optimal faults and regional-scale invariability of the stress field. We compare calculations made under these assumptions with different realizations of a new conceptual triggering model that features a complete assay of all possible ruptures. In this concept, there always exists a set of theoretical planes with positive failure stress conditions under a combination of background stress and coseismic static stress change. In the Earth, all of these theoretical planes may not exist, and if they do, they may not be ready to fail. Thus, the actual aftershock plane may not correspond to the plane with the maximum stress change value. This is consistent with observations that mainshocks commonly activate faults with exotic orientations and rakes. Our testing ground is the M 7.2, 2010 El Mayor–Cucapah earthquake sequence, which activated multiple diverse fault populations across the United States–Mexico border in California and Baja California. We carry out a retrospective test involving 748 M≥3.0 triggered earthquakes that occurred during a 3 yr period after the mainshock. We find that a probabilistic expression of possible aftershock planes constrained by premainshock rupture patterns is strongly favored (89% of aftershocks consistent with static stress triggering) versus an optimal-fault implementation (35% consistent). Results show that coseismic stress change magnitudes do not necessarily control earthquake triggering; instead, we find that the sum of the background stress and the coseismic stress change promotes diverse ruptures. Our model can thus explain earthquake triggering in regions where optimal-plane mapping shows coseismic stress reduction.
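The quantity underlying both the optimal-plane and the all-possible-planes calculations is the Coulomb failure stress change, ΔCFS = Δτ + μ′Δσn, resolved on a receiver geometry. The sketch below scans a grid of candidate strike/dip/rake combinations for a made-up stress-change tensor; it illustrates the resolved quantity only and is not a reimplementation of the authors' probabilistic model.

```python
import numpy as np

def fault_vectors(strike, dip, rake):
    """Unit fault-normal and slip vectors (x = north, y = east, z = down),
    following the Aki & Richards conventions; angles in degrees."""
    phi, delta, lam = np.radians([strike, dip, rake])
    n = np.array([-np.sin(delta) * np.sin(phi),
                   np.sin(delta) * np.cos(phi),
                  -np.cos(delta)])
    s = np.array([np.cos(lam) * np.cos(phi) + np.cos(delta) * np.sin(lam) * np.sin(phi),
                  np.cos(lam) * np.sin(phi) - np.cos(delta) * np.sin(lam) * np.cos(phi),
                  -np.sin(lam) * np.sin(delta)])
    return n, s

def delta_cfs(stress_change, strike, dip, rake, mu_eff=0.4):
    """Coulomb failure stress change on one receiver geometry.
    stress_change: 3x3 tensor (tension positive), same axes as above."""
    n, s = fault_vectors(strike, dip, rake)
    traction = stress_change @ n
    d_sigma_n = n @ traction          # normal stress change (unclamping > 0)
    d_tau = s @ traction              # shear stress change in the rake direction
    return d_tau + mu_eff * d_sigma_n

# Illustrative coseismic stress-change tensor (MPa); the values are made up.
d_sigma = np.array([[ 0.05, -0.02,  0.00],
                    [-0.02, -0.03,  0.01],
                    [ 0.00,  0.01, -0.02]])

# Scan many candidate planes instead of a single "optimal" one
candidates = [(strike, dip, rake)
              for strike in range(0, 360, 30)
              for dip in (30, 60, 90)
              for rake in (-90, 0, 90, 180)]
values = [delta_cfs(d_sigma, *c) for c in candidates]
print(f"{np.mean(np.array(values) > 0):.0%} of candidate planes are positively stressed")
```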


2014 ◽ Vol 19 (1) ◽ pp. 273-273
Author(s): Athanassios Ganas, Zafeiria Roumelioti, Vassilios Karastathis, Konstantinos Chousianitis, Alexandra Moshou, ...

2009 ◽ Vol 9 (3) ◽ pp. 905-912
Author(s): G. Chouliaras

Abstract. The earthquake catalog of the National Observatory of Athens (NOA), covering the period since the beginning of the development of the Greek National Seismological Network in 1964, is compiled and analyzed in this study. The b-value and the spatial and temporal variability of the magnitude of completeness of the catalog are determined, together with the times of significant seismicity rate changes. It is well known that man-made inhomogeneities and artifacts exist in earthquake catalogs produced by evolving seismological networks; accordingly, the chronology of network expansions, instrumental upgrades, and changes in practice and procedures at NOA is reported. The earthquake catalog of NOA is the most detailed data set available for the Greek area, and the results of this study may be employed to select trustworthy parts of the data for earthquake prediction research.
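A common first pass at the catalog diagnostics mentioned above is to estimate the magnitude of completeness from the most populated magnitude bin and the b-value by maximum likelihood. The sketch below is a simplified version of that workflow on synthetic data; the study's own estimators and software may differ.

```python
import numpy as np

def completeness_and_b(mags, bin_width=0.1):
    """Quick catalog diagnostics: magnitude of completeness (Mc) via the
    maximum-curvature method (most populated magnitude bin) and b-value via
    the Aki (1965) maximum-likelihood estimator with Utsu's binning correction.
    A simplified sketch; the study's own procedures may differ."""
    mags = np.round(np.asarray(mags) / bin_width) * bin_width   # bin the magnitudes
    values, counts = np.unique(mags, return_counts=True)
    mc = values[np.argmax(counts)]                 # most populated bin ~ Mc
    complete = mags[mags >= mc - bin_width / 4]    # tolerate float round-off
    b = np.log10(np.e) / (complete.mean() - (mc - bin_width / 2.0))
    return mc, b

# Synthetic Gutenberg-Richter magnitudes with b = 1; the 2.45 offset makes the
# lowest 0.1-wide bin fully populated, so we expect Mc ~ 2.5 and b close to 1.
rng = np.random.default_rng(0)
synthetic = 2.45 + rng.exponential(scale=1.0 / np.log(10), size=5000)
print(completeness_and_b(synthetic))
```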


2012 ◽ Vol 188 (3) ◽ pp. 1322-1338
Author(s): K. M. Leptokaropoulos, E. E. Papadimitriou, B. Orlecka-Sikora, V. G. Karakostas

2013 ◽ Vol 13 (2) ◽ pp. 231-237
Author(s): J. Takekawa, H. Mikada, T. Goto

Abstract. Recent research has indicated coupling between volcanic eruptions and earthquakes. Some studies have calculated the static stress transfer in the subsurface induced by earthquake occurrences, but most of these analyses ignored spatial heterogeneity in the subsurface or accounted only for rigidity layering in the crust. Geophysical investigations, however, suggest smaller-scale heterogeneity on the order of hundreds of meters. Such heterogeneity is difficult to include in analysis models because the actual distribution of the fluctuations is in many cases poorly constrained, so the effect of neglecting it when evaluating the earthquake triggering of volcanic eruptions is also not well understood. In the present study, we investigate the influence of the homogeneity assumption on evaluating earthquake triggering of volcanic eruptions using finite element simulations. The crust is treated as a stochastic medium with different heterogeneity parameters (correlation length and magnitude of velocity perturbation), and exponential and von Karman functions are adopted as spatial auto-correlation functions (ACFs). In all of our simulations, neglecting the smaller-scale heterogeneity leads to underestimation of the failure pressure around the chamber wall, which relates to dyke initiation. The magnitude of the velocity perturbation has a larger effect on the tensile failure at the chamber wall than the choice of ACF or the correlation length. The maximum effect on the failure pressure in our simulations is about a factor of two relative to the homogeneous case. This indicates that estimates of earthquake triggering due to static stress transfer should take into account heterogeneity at the scale of hundreds of meters.
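As a small illustration of the stochastic media described above, the sketch below generates a 2-D velocity-perturbation field with a von Karman autocorrelation (the exponential ACF being the ν = 0.5 case) by spectral filtering of white noise. Grid size, correlation length, and perturbation level are illustrative assumptions; the finite element stress calculations of the study are not reproduced here.

```python
import numpy as np

def von_karman_medium(n=256, dx=50.0, corr_len=300.0, sigma=0.05, nu=0.5, seed=0):
    """Generate a 2-D random velocity-perturbation field with a von Karman
    autocorrelation (nu = 0.5 reduces to the exponential ACF) by filtering
    white noise in the wavenumber domain. The field is rescaled afterwards,
    so only the spectral shape (1 + k^2 a^2)^-(nu + 1) matters here.
    n        : grid points per side
    dx       : grid spacing (m)
    corr_len : correlation length a (m)
    sigma    : target RMS fractional perturbation
    """
    rng = np.random.default_rng(seed)
    kx = np.fft.fftfreq(n, d=dx) * 2.0 * np.pi
    ky = np.fft.fftfreq(n, d=dx) * 2.0 * np.pi
    k2 = kx[:, None] ** 2 + ky[None, :] ** 2
    amplitude = (1.0 + k2 * corr_len ** 2) ** (-(nu + 1.0) / 2.0)  # sqrt of PSD shape
    noise = rng.standard_normal((n, n))
    field = np.fft.ifft2(np.fft.fft2(noise) * amplitude).real
    return sigma * field / field.std()            # impose the target RMS perturbation

# Example: exponential-ACF medium, 300 m correlation length, 5 % RMS perturbation
dv = von_karman_medium()
print(dv.shape, dv.std())
```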

