The longer it has been since the last earthquake, the longer the expected time till the next?

1989 ◽  
Vol 79 (5) ◽  
pp. 1439-1456
Author(s):  
Paul M. Davis ◽  
David D. Jackson ◽  
Yan Y. Kagan

Abstract We adopt a lognormal distribution for earthquake interval times, and we use a locally determined rather than a generic coefficient of variation, to estimate the probability of occurrence of characteristic earthquakes. We extend previous methods in two ways. First, we account for the aseismic period since the last event (the “seismic drought”) in updating the parameter estimates. Second, in calculating the earthquake probability we allow for uncertainties in the mean recurrence time and its variance by averaging over their likelihood. Both extensions can strongly influence the calculated earthquake probabilities, especially for long droughts in regions with few documented earthquakes. As time passes, the recurrence time and variance estimates increase if no additional events occur, leading eventually to an affirmative answer to the question in the title. The earthquake risk estimate begins to drop when the drought exceeds the estimated recurrence time. For the Parkfield area of California, the probability of a magnitude 6 event in the next 5 years is about 34 per cent, much lower than previous estimates. Furthermore, the estimated 5-year probability will decrease with every uneventful year after 1988. For the Coachella Valley segment of the San Andreas Fault, the uncertainties are large, and we estimate the probability of a large event in the next 30 years to be 9 per cent, again much smaller than previous estimates. On the Mojave (Pallett Creek) segment the catalog includes 10 events, and the present drought is just approaching the recurrence interval, so the estimated risk is revised very little by our methods.
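
A minimal sketch (not the authors' code) of the conditional-probability calculation described above: with lognormally distributed recurrence intervals, the probability of an event in the next dt years, given a drought of t years, is [F(t+dt) - F(t)] / [1 - F(t)]. The parameter values below are illustrative, not the paper's Parkfield estimates.

```python
from scipy.stats import lognorm
import numpy as np

def conditional_probability(t_elapsed, dt, mean_recurrence, cov):
    """P(event in [t, t+dt] | no event up to t) for lognormal intervals.

    mean_recurrence : mean recurrence interval (years)
    cov             : coefficient of variation of the intervals
    """
    # Convert the mean and COV of the intervals to log-space parameters.
    sigma_log = np.sqrt(np.log(1.0 + cov**2))
    mu_log = np.log(mean_recurrence) - 0.5 * sigma_log**2
    dist = lognorm(s=sigma_log, scale=np.exp(mu_log))
    survive = dist.sf(t_elapsed)              # P(no event up to t_elapsed)
    if survive == 0.0:
        return 1.0
    return (dist.cdf(t_elapsed + dt) - dist.cdf(t_elapsed)) / survive

# Illustrative numbers only: a 22-year drought, 5-year window, 25-year mean.
print(conditional_probability(t_elapsed=22, dt=5, mean_recurrence=25, cov=0.4))
```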

2020 ◽  
Author(s):  
Max Wyss

The hypothesis that extrapolation of the Gutenberg-Richter (GR) relationship allows estimates of the probability of large earthquakes is incorrect. For nearly 200 faults for which the recurrence time, Tr (the inverse of the probability of occurrence), is known from trenching and geodetically measured deformation rates, it has been shown that Tr based on seismicity is overestimated, typically by one order of magnitude or more. The reason for this is that there are not enough earthquakes along major faults; in some cases there are too few earthquakes for the fault even to be mapped from seismicity. Examples include the following rupture segments of great faults: the 1717 Alpine Fault, the 1857 San Andreas, the 1906 San Andreas, and the 2002 Denali earthquakes, for which geological Tr are 100 to 300 years while seismicity-based Tr are 10,000 to 100,000 years. In addition, the hypothesis leads to impossible results when one considers the dependence of the b-value on stress. It has been shown that thrust, strike-slip, and normal faults have low, intermediate, and high b-values, respectively. This implies that, regardless of local slip rates, the probability of large earthquakes predicted by the hypothesis is high, intermediate, and low in thrust, strike-slip, and normal faulting, respectively. Measurements of recurrence probability show a different dependence: earthquake probability depends on slip rate. Finally, the hypothesis predicts different probabilities for large earthquakes depending on the magnitude scale used. For the 1906 rupture segment, the difference in probability of an M8 earthquake is approximately a factor of 50, using the two available catalogs. Various countries measure earthquake magnitude on their own scale, intended to agree with the ML scale of California or the MS scale of the USGS. However, it is not trivial to match a scale that is valid for a different region with different attenuation of seismic waves. As a result, some regional M-scales differ from the global MS scale, which yields different Tr for the same Mmax in the same region, depending on whether the global or the local magnitude scale is used. Based on the aforementioned facts, the hypothesis that probabilities of large earthquakes can be estimated by extrapolating the GR relationship has to be abandoned.
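
For concreteness, a hedged illustration of the extrapolation this abstract argues against: the GR relation log10 N(>=M) = a - b*M, fitted to the small earthquakes of a catalog, is extended to a large target magnitude and inverted for a recurrence time. The a, b, and catalog-length values below are invented to reproduce the order of magnitude quoted above; they are not taken from the paper.

```python
def gr_recurrence_time(a, b, m_target, catalog_years):
    """Recurrence time (years) of events with magnitude >= m_target,
    extrapolated from Gutenberg-Richter parameters a and b fitted over
    a catalog spanning catalog_years."""
    n_expected = 10.0 ** (a - b * m_target)   # expected count in the catalog span
    return catalog_years / n_expected

# A sparse major-fault segment: few recorded earthquakes, so the extrapolated
# recurrence time for an M 8 rupture reaches ~100,000 years, far above the
# few-hundred-year estimates from trenching and geodesy.
print(gr_recurrence_time(a=5.0, b=1.0, m_target=8.0, catalog_years=100.0))
```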


1991 ◽  
Vol 81 (3) ◽  
pp. 862-881 ◽  
Author(s):  
J. C. Savage

Abstract The Working Group on California Earthquake Probabilities has assigned probabilities for rupture in the interval from 1988 to 2018 to various segments of the San Andreas fault on the basis of the lognormal distribution of recurrence times of characteristic earthquakes postulated by Nishenko and Buland (1987). I question the validity of those probabilities on the basis of three separate arguments: (1) The distributions of recurrence times of the four best-observed characteristic-earthquake sequences are each only marginally consistent with the Nishenko-Buland lognormal distribution. (2) The range of possible 30-year conditional probabilities for many of the fault segments is so great, owing to uncertainty in the average recurrence time for that segment, that the assigned probability is virtually meaningless. (3) The 1988 forecasts not subject to the foregoing objection are those in which there is a low probability of an earthquake in the near future (e.g., only a 5 per cent chance of rupture of the North Coast segment before the year 2049 and of the Carrizo segment before the year 2018). The same reasoning would assign only a 5 per cent chance of rupture before mid-1993 to the southern Santa Cruz Mountains segment, the segment that failed in October 1989. Finally, the forecast of the next Parkfield earthquake (95 per cent probability before 1993.0) by Bakun and Lindh (1985) depends upon an ad hoc explanation of the out-of-sequence 1934 earthquake. A less-contrived forecast would have assigned a conditional probability of about 60 ± 20 per cent to the 1985.0 to 1993.0 interval and 30 ± 15 per cent to the 1990.0 to 1993.0 interval.
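
A small self-contained sketch of argument (2): holding everything else fixed, sweeping the mean recurrence time over a plausible range spreads the 30-year conditional probability across a nearly meaningless range. Lognormal intervals are assumed and every number below is illustrative.

```python
import numpy as np
from scipy.stats import lognorm

def cond_prob_30yr(mean_T, cov, elapsed):
    """30-year conditional rupture probability for lognormal intervals."""
    s = np.sqrt(np.log(1.0 + cov**2))
    dist = lognorm(s=s, scale=mean_T * np.exp(-0.5 * s**2))
    return (dist.cdf(elapsed + 30) - dist.cdf(elapsed)) / dist.sf(elapsed)

elapsed = 130.0                               # years since the last rupture
for mean_T in (150.0, 250.0, 400.0):          # plausible mean recurrence times
    print(mean_T, round(cond_prob_30yr(mean_T, cov=0.3, elapsed=elapsed), 3))
```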


2010 ◽  
Vol 13 (2) ◽  
pp. 117-133
Author(s):  
Michio Naoi ◽  
Kazuto Sumita
The relationships between seismic risk and rental and owner-occupied housing prices across Japan are examined. The empirical results from hedonic regressions with earthquake risk indices suggest that: (1) earthquake occurrence probability has a significantly negative effect on monthly housing rent, (2) for owner-occupied housing, the effect of earthquake probability appears to depend on the characteristics of the individual housing unit (e.g., age of dwelling), (3) the estimated risk premium is much larger for older buildings, and (4) the share of quake-resistant dwellings in the neighborhood is significantly and positively related to the housing price of the individual unit. These results suggest that anti-seismic policies targeting specific groups of dwellings, such as rental houses and older buildings, help to mitigate welfare loss due to earthquakes.
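
A hedged sketch of the kind of hedonic regression the abstract describes: log monthly rent regressed on an earthquake-probability index, dwelling characteristics, and a probability-by-age interaction (findings 2 and 3). The data are synthetic and the variable names are ours, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
quake_prob = rng.uniform(0.0, 0.3, n)     # local 30-year occurrence probability
age = rng.uniform(0, 40, n)               # dwelling age in years
floor_area = rng.uniform(20, 80, n)       # floor area in square metres

# Synthetic "true" hedonic surface: rent falls with risk, more so for old stock.
log_rent = (10.0 + 0.01 * floor_area - 0.005 * age
            - 0.5 * quake_prob - 0.03 * quake_prob * age
            + rng.normal(0, 0.05, n))

# Ordinary least squares fit of the hedonic equation.
X = np.column_stack([np.ones(n), floor_area, age, quake_prob, quake_prob * age])
beta, *_ = np.linalg.lstsq(X, log_rent, rcond=None)
for name, b in zip(["const", "area", "age", "prob", "prob_x_age"], beta):
    print(f"{name:>10s}: {b: .4f}")
```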


2003 ◽  
Vol 60 (1) ◽  
pp. 97-103 ◽  
Author(s):  
Luciana Aparecida Carlini-Garcia ◽  
Roland Vencovsky ◽  
Alexandre Siqueira Guedes Coelho

Studying the genetic structure of natural populations is very important for the conservation and use of the genetic variability available in nature. This research addresses the analysis of genetic population structure using real and simulated molecular data. To obtain variance estimates of the pertinent parameters, the bootstrap resampling procedure was applied over different sampling units, namely: individuals within populations (I), populations (P), and individuals and populations simultaneously (I, P). The parameters considered were: the total fixation index (F or FIT), the fixation index within populations (f or FIS), and the divergence among populations or intrapopulation coancestry (theta or FST). The aim of this research was to verify whether the variance estimates of F, f, and theta found through resampling over individuals and populations simultaneously (I, P) correspond to the sum of the respective variance estimates obtained from separate resampling over individuals and over populations (I+P). This equivalence was verified in all cases, showing that the total variance estimate of F, f, and theta can be obtained by summing the variances estimated for each source of variation separately. The results also show that this facilitates the use of the bootstrap method on data with hierarchical structure and opens the possibility of obtaining the relative contribution of each source of variation to the total variation of the estimated parameters.
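
The resampling scheme can be sketched as follows: bootstrap over individuals within populations (I), over populations (P), and over both simultaneously (I, P), then compare Var(I, P) with Var(I) + Var(P). A simple among-population variance ratio stands in for the F-statistics here, and the data are synthetic; this illustrates the procedure, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(1)
n_pops, n_ind = 10, 30
data = [rng.normal(loc=rng.normal(0, 1), scale=1, size=n_ind) for _ in range(n_pops)]

def statistic(pops):
    """Placeholder estimator: share of variance among population means."""
    means = np.array([p.mean() for p in pops])
    total = np.concatenate(pops)
    return means.var(ddof=1) / total.var(ddof=1)

def bootstrap_variance(pops, over, n_boot=2000):
    """Bootstrap variance of the statistic, resampling the chosen units."""
    vals = []
    for _ in range(n_boot):
        chosen = ([pops[i] for i in rng.integers(0, len(pops), len(pops))]
                  if over in ("P", "IP") else pops)
        resampled = [p[rng.integers(0, len(p), len(p))] if over in ("I", "IP") else p
                     for p in chosen]
        vals.append(statistic(resampled))
    return np.var(vals, ddof=1)

v_i, v_p, v_ip = (bootstrap_variance(data, o) for o in ("I", "P", "IP"))
print(f"Var(I)={v_i:.4f}  Var(P)={v_p:.4f}  sum={v_i + v_p:.4f}  Var(I,P)={v_ip:.4f}")
```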


Geosphere ◽  
2020 ◽  
Vol 16 (2) ◽  
pp. 474-489 ◽  
Author(s):  
Roby Douilly ◽  
David D. Oglesby ◽  
Michele L. Cooke ◽  
Jennifer L. Hatch

Abstract Geologic data suggest that the Coachella Valley segment of the southern San Andreas fault (southern California, USA) is past its average recurrence time period. At its northern edge, this right-lateral fault segment branches into the Mission Creek and Banning strands of the San Andreas fault. Depending on how rupture propagates through this region, there is the possibility of a throughgoing rupture that could lead to the channeling of damaging seismic energy into the Los Angeles Basin. The fault structures and potential rupture scenarios on these two strands differ significantly, which highlights the need to determine which strand provides a more likely rupture path and the circumstances that control this rupture path. In this study, we examine the effect of different assumptions about fault geometry and initial stress pattern on the dynamic rupture process to test multiple rupture scenarios and thus investigate the most likely path(s) of a rupture that starts on the Coachella Valley segment. We consider three types of fault geometry based on the Southern California Earthquake Center Community Fault Model, and we create a three-dimensional finite-element mesh for each of them. These three meshes are then incorporated into the finite-element method code FaultMod to compute a physical model for the rupture dynamics. We use a slip-weakening friction law, and consider different assumptions of background stress, such as constant tractions and regional stress regimes with different orientations. Both the constant and regional stress distributions show that rupture from the Coachella Valley segment is more likely to branch to the Mission Creek than to the Banning fault strand. The fault connectivity at this branch system seems to have a significant impact on the likelihood of a throughgoing rupture, with potentially significant impacts for ground motion and seismic hazard both locally and in the greater Los Angeles metropolitan area.
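
The rupture simulations use a slip-weakening friction law; a minimal sketch of the standard linear form (friction dropping from a static to a dynamic level over a critical slip distance) is given below. The parameter values are generic placeholders, not those of the study or of FaultMod.

```python
import numpy as np

def slip_weakening_friction(slip, mu_s=0.6, mu_d=0.3, d_c=0.4):
    """Friction coefficient dropping linearly from static (mu_s) to
    dynamic (mu_d) over a critical slip distance d_c (metres)."""
    slip = np.asarray(slip, dtype=float)
    return mu_s - (mu_s - mu_d) * np.minimum(slip, d_c) / d_c

# Fault shear strength is the friction coefficient times the effective
# normal traction (here 50 MPa), evaluated at a few slip values.
normal_traction = 50e6
print(slip_weakening_friction([0.0, 0.2, 1.0]) * normal_traction)
```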


2002 ◽  
Vol 9 (5/6) ◽  
pp. 513-519 ◽  
Author(s):  
M. Vázquez-Prada ◽  
Á. González ◽  
J. B. Gómez ◽  
A. F. Pacheco

Abstract. In a spirit akin to the sandpile model of self-organized criticality, we present a simple statistical model of the cellular-automaton type which simulates the role of an asperity in the dynamics of a one-dimensional fault. The model produces an earthquake spectrum similar to the characteristic-earthquake behaviour of some seismic faults. The model, which has no free parameters, is amenable to an algebraic description as a Markov chain. This description illuminates some important results obtained by Monte Carlo simulations, such as the earthquake size-frequency relation and the recurrence time of the characteristic earthquake.
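
A hedged Monte Carlo sketch of a one-dimensional cellular-automaton fault with a single asperity, in the spirit of the model described above (the paper's exact update rules may differ): cells are loaded at random, and loading the asperity triggers a rupture spanning the contiguous loaded patch, so small events are frequent while the system-size "characteristic" event recurs on its own time scale.

```python
import numpy as np

rng = np.random.default_rng(2)
N = 20                              # number of cells along the 1-D fault
cells = np.zeros(N, dtype=bool)     # True = cell is loaded
sizes = []
n_steps = 200_000

for step in range(n_steps):
    i = rng.integers(N)             # load a randomly chosen cell
    if i == 0:                      # cell 0 is the asperity: loading it triggers slip
        # The rupture spans the contiguous loaded patch starting at the asperity.
        size = 1
        while size < N and cells[size]:
            size += 1
        cells[:size] = False
        sizes.append(size)
    else:
        cells[i] = True

sizes = np.array(sizes)
# Size-frequency statistics and recurrence of the system-size event.
for s in (1, N // 2, N):
    print(f"fraction of events with size {s:2d}: {np.mean(sizes == s):.4f}")
print("mean recurrence of the characteristic event (steps):",
      n_steps / max(np.sum(sizes == N), 1))
```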


2016 ◽  
Vol 47 (3) ◽  
pp. 1211
Author(s):  
P. Paradisopoulou ◽  
E. Papadimitriou ◽  
J. Mirek ◽  
V. Karakostas

Based on the fact that stress changes caused by the coseismic slip of strong events can be incorporated into quantitative earthquake probability estimates, the goal of this study is to estimate the probability of the next strong earthquake (M≥6.5) on a known fault segment within a future time interval (30 years). The probability depends on the calculation of ΔCFF and on the estimate of the occurrence rate of a characteristic earthquake, conditioned on the elapsed time since the previous event. The Coulomb stress changes caused by previous earthquakes are computed, and their influence is taken into account either through a permanent shift of the time elapsed since the previous earthquake or through a modification of the expected mean recurrence time. The occurrence rate is calculated taking into account both permanent and temporary perturbations. The estimated values correspond to probabilities along each fault segment at a discretization of 1 km, illustrating the probability distribution across the specific fault. In order to check whether the estimated probabilities vary with depth, all estimations were performed for each fault at depths of 8, 10, 12, and 15 km.
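
A minimal sketch (our own simplification, not the authors' procedure) of folding a Coulomb stress change ΔCFF into a renewal-model probability: the stress step is converted into an equivalent permanent shift of the elapsed time, ΔCFF divided by the tectonic stressing rate, and the 30-year conditional probability is recomputed. Lognormal recurrence intervals are assumed and every number is illustrative.

```python
import numpy as np
from scipy.stats import lognorm

def cond_prob(elapsed, window, mean_T, cov):
    """Conditional probability of rupture in the next `window` years."""
    s = np.sqrt(np.log(1.0 + cov**2))
    dist = lognorm(s=s, scale=mean_T * np.exp(-0.5 * s**2))
    return (dist.cdf(elapsed + window) - dist.cdf(elapsed)) / dist.sf(elapsed)

mean_T, cov = 300.0, 0.5           # mean recurrence (yr) and aperiodicity
elapsed = 180.0                    # years since the last characteristic event
dCFF = 0.2e6                       # Coulomb stress change from nearby events (Pa)
stressing_rate = 1.0e4             # tectonic stressing rate (Pa/yr)

clock_advance = dCFF / stressing_rate          # permanent shift of the clock (yr)
p_before = cond_prob(elapsed, 30, mean_T, cov)
p_after = cond_prob(elapsed + clock_advance, 30, mean_T, cov)
print(f"30-yr probability: {p_before:.3f} -> {p_after:.3f}")
```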

