synthetic catalogs
Recently Published Documents

TOTAL DOCUMENTS: 6 (five years: 1)
H-INDEX: 2 (five years: 0)

Author(s):  
Luis Ceferino ◽  
Percy Galvez ◽  
Jean-Paul Ampuero ◽  
Anne Kiremidjian ◽  
Gregory Deierlein ◽  
...  

ABSTRACT This article introduces a framework to supplement short historical catalogs with synthetic catalogs and determine large earthquakes’ recurrence. For this assessment, we developed a parameter estimation technique for a probabilistic earthquake occurrence model that captures time and space interactions between large mainshocks. The technique is based on a two-step Bayesian update that uses a synthetic catalog from physics-based simulations for initial parameter estimation and then the historical catalog for further calibration, fully characterizing parameter uncertainty. The article also provides a formulation to combine multiple synthetic catalogs according to their likelihood of representing empirical earthquake stress drops and Global Positioning System-inferred interseismic coupling. We applied this technique to analyze large-magnitude earthquakes’ recurrence along 650 km of the subduction fault’s interface located offshore Lima, Peru. We built nine 2000 yr long synthetic catalogs using quasi-dynamic earthquake cycle simulations based on the rate-and-state friction law to supplement the 450 yr long historical catalog. When the synthetic catalogs are combined with the historical catalog without propagating their uncertainty, we found average relative reductions larger than 90% in the recurrence parameters’ uncertainty. When we propagated the physics-based simulations’ uncertainty to the posterior, the reductions in uncertainty decreased to 60%–70%. In two Bayesian assessments, we then show that using synthetic catalogs results in higher parameter uncertainty reductions than using only the historical catalog (69% vs. 60% and 83% vs. 80%), demonstrating that synthetic catalogs can be effectively combined with historical data, especially in tectonic regions with short historical catalogs. Finally, we show the implications of these results for time-dependent seismic hazard.
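The two-step Bayesian update described above can be illustrated with a deliberately simplified sketch. The authors' model captures time and space interactions between mainshocks; the toy version below instead assumes a plain Poisson recurrence rate with a conjugate Gamma prior, so the "update with the synthetic catalog, then calibrate with the historical catalog" sequence reduces to two closed-form updates. All event counts and durations are hypothetical, not taken from the paper.

```python
import math

def gamma_update(alpha, beta, n_events, duration_yr):
    """Conjugate Gamma-Poisson update: posterior Gamma parameters after
    observing n_events over duration_yr, starting from Gamma(alpha, beta)."""
    return alpha + n_events, beta + duration_yr

# Diffuse prior on the annual rate of large mainshocks.
alpha0, beta0 = 0.5, 0.01

# Step 1: initial estimation from a 2000 yr synthetic catalog
# (hypothetical count of 40 large events).
a1, b1 = gamma_update(alpha0, beta0, n_events=40, duration_yr=2000.0)

# Step 2: further calibration with a 450 yr historical catalog
# (hypothetical count of 9 large events).
a2, b2 = gamma_update(a1, b1, n_events=9, duration_yr=450.0)

mean_rate = a2 / b2            # posterior mean rate (events/yr)
cv = 1.0 / math.sqrt(a2)       # coefficient of variation shrinks as data accrue
print(f"posterior mean rate: {mean_rate:.4f}/yr, CV: {cv:.3f}")
```

The shrinking coefficient of variation is the toy analogue of the uncertainty reductions reported in the abstract: the long synthetic catalog does most of the narrowing, and the historical catalog refines the result.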



2020 ◽  
Vol 110 (4) ◽  
pp. 1799-1817 ◽  
Author(s):  
William H. Savran ◽  
Maximilian J. Werner ◽  
Warner Marzocchi ◽  
David A. Rhoades ◽  
David D. Jackson ◽  
...  

ABSTRACT The 2019 Ridgecrest sequence provides the first opportunity to evaluate Uniform California Earthquake Rupture Forecast v.3 with epidemic-type aftershock sequences (UCERF3-ETAS) in a pseudoprospective sense. For comparison, we include a version of the model without explicit faults more closely mimicking traditional ETAS models (UCERF3-NoFaults). We evaluate the forecasts with new metrics developed within the Collaboratory for the Study of Earthquake Predictability (CSEP). The metrics consider synthetic catalogs simulated by the models rather than synoptic probability maps, thereby relaxing the Poisson assumption of previous CSEP tests. Our approach compares statistics from the synthetic catalogs directly against observations, providing a flexible approach that can account for dependencies and uncertainties encoded in the models. We find that, to the first order, both UCERF3-ETAS and UCERF3-NoFaults approximately capture the spatiotemporal evolution of the Ridgecrest sequence, adding to the growing body of evidence that ETAS models can be informative forecasting tools. However, we also find that both models mildly overpredict the seismicity rate, on average, aggregated over the evaluation period. More severe testing indicates the overpredictions occur too often for observations to be statistically indistinguishable from the model. Magnitude tests indicate that the models do not include enough variability in forecasted magnitude-number distributions to match the data. Spatial tests highlight discrepancies between the forecasts and observations, but the greatest differences between the two models appear when aftershocks occur on modeled UCERF3-ETAS faults. Therefore, any predictability associated with embedding earthquake triggering on the (modeled) fault network may only crystalize during the presumably rare sequences with aftershocks on these faults. Accounting for uncertainty in the model parameters could improve test results during future experiments.
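A minimal sketch of the catalog-based evaluation idea: rather than testing against a Poisson probability map, compare a statistic of the observed catalog directly against its empirical distribution over simulated catalogs. The function below implements a simple number test in that spirit (the exact CSEP test statistics differ); the simulated counts are hypothetical.

```python
def number_test(simulated_counts, observed_count):
    """Catalog-based number test: quantiles of the observed event count
    within the empirical distribution of simulated-catalog counts.
    Quantiles near 0 or 1 flag over- or under-prediction of the rate."""
    n = len(simulated_counts)
    delta1 = sum(c >= observed_count for c in simulated_counts) / n  # P(sim >= obs)
    delta2 = sum(c <= observed_count for c in simulated_counts) / n  # P(sim <= obs)
    return delta1, delta2

# Hypothetical event counts from 10 synthetic catalogs vs. one observation.
sim = [12, 15, 9, 14, 11, 13, 16, 10, 12, 14]
d1, d2 = number_test(sim, observed_count=10)
```

Because the distribution comes from the model's own simulated catalogs, any dependencies or non-Poissonian variability the model encodes are automatically reflected in the test.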



2020 ◽  
Author(s):  
Kayla Kroll ◽  
Gene Ichinose ◽  
Sean Ford ◽  
Arben Pitarka ◽  
William Walter ◽  
...  

Previous studies have shown that explosion sources produce fewer aftershocks and that they are generally smaller in magnitude compared to aftershocks of similarly sized earthquake sources (Jarpe et al., 1994, Ford and Walter, 2010). It has also been suggested that the explosion-induced aftershocks have smaller Gutenberg-Richter b-values (Ryall and Savage, 1969, Ford and Labak, 2016) and that their rates decay faster than a typical Omori-like sequence (Gross, 1996). Recent chemical explosion experiments at the Nevada National Security Site (NNSS) were observed to generate vigorous aftershock activity and allow for further comparison between earthquake- and explosion-triggered aftershocks. Of the four recent chemical explosion experiments conducted between July 2018 and June 2019, the two largest explosions (i.e. 10-ton and 50-ton) generated hundreds to thousands of aftershocks. Preliminary analysis indicates that these aftershock sequences have similar statistical characteristics to traditional tectonically driven aftershocks in the region.

The physical mechanisms that contribute to differences in aftershock behavior following earthquake and explosion sources are poorly understood. Possible mechanisms may be related to weak material properties in the shallow subsurface that do not give rise to stress concentrations large enough to support brittle failure. Additionally, minimal changes in the shear component of the stress tensor for explosion sources may also contribute to differences in aftershock distributions. Here, we compare aftershock statistics and productivity of the explosion-related aftershocks at the NNSS site to synthetic catalogs of aftershocks triggered by explosion sources. These synthetic catalogs are built by coupling strains that result from modeling the explosion source process with the SW4 wave propagation code with the 3D physics-based earthquake simulation code, RSQSim. We compare statistical properties of the aftershock sequence (e.g. productivity, maximum aftershock magnitude, Omori decay rate) and the spatiotemporal relationship between stress changes and event locations of the synthetic and observed aftershocks to understand the primary mechanisms that control them.

Prepared by LLNL under Contract DE-AC52-07NA27344.
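One of the statistics compared above, the Gutenberg-Richter b-value, is commonly estimated with the Aki (1965) maximum-likelihood formula. The sketch below applies it to a hypothetical list of aftershock magnitudes; the abstract does not specify the estimator the authors use, so this is offered only as a standard choice.

```python
import math

def b_value_aki(magnitudes, m_min, dm=0.1):
    """Aki (1965) maximum-likelihood b-value with the usual half-bin
    correction for magnitudes binned at width dm."""
    mags = [m for m in magnitudes if m >= m_min]
    mean_m = sum(mags) / len(mags)
    return math.log10(math.e) / (mean_m - (m_min - dm / 2.0))

# Hypothetical aftershock magnitudes above a completeness level of 0.5.
mags = [0.5, 0.6, 0.7, 0.5, 0.9, 1.2, 0.8, 0.6, 0.5, 1.5]
b = b_value_aki(mags, m_min=0.5)
```

Estimating b separately for explosion-triggered and tectonic sequences (with the same completeness magnitude) is one direct way to test the suggestion that explosion aftershocks have smaller b-values.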



2018 ◽  
Vol 108 (2) ◽  
pp. 729-741 ◽  
Author(s):  
Morgan T. Page ◽  
Nicholas J. van der Elst


2017 ◽  
Vol 50 (3) ◽  
pp. 1463
Author(s):  
D.A. Vamvakaris ◽  
C.B. Papazachos ◽  
Ch.A. Papaioannou ◽  
E.M. Scordilis ◽  
G.F. Karakaisis

In order to evaluate the seismic hazard for the broader Aegean area, a modified time-independent seismicity model is used. A Monte-Carlo procedure has been employed to create synthetic earthquake catalogs with specific characteristics regarding their time, space and magnitude distributions. Moreover, particular geometrical characteristics, such as subducting and oblique seismic zones, are also taken into account in the synthetic catalog generation. A significantly revised earthquake catalog, all available fault plane solutions and information on the seismotectonics of the broader Aegean area were considered in order to propose a new updated model of seismic zones for this area. Seismicity parameters for the new seismic zones were calculated and the corresponding synthetic earthquake catalogs were generated using the proposed procedure. The distribution of the expected values for ground motion parameters (e.g. PGA, PGV) was estimated using synthetic catalogs for several sites of interest, by performing computations directly on all earthquakes of each catalog. Computations were performed for a dense grid of sites, and seismic hazard estimates were determined both directly from the peak ground motion distribution and from a fitted Gumbel extreme-value distribution. Ground motion parameters were also calculated using classic seismic hazard assessment algorithms (EqRISK), in order to evaluate the compatibility of the proposed method with conventional approaches.
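The Monte-Carlo generation step described above can be sketched in its simplest time-independent form: Poisson occurrence times combined with magnitudes drawn from a doubly truncated Gutenberg-Richter law via inverse-CDF sampling. The rate, b-value, and magnitude bounds below are illustrative placeholders, not the paper's Aegean zone parameters, and the sketch omits the spatial and zone-geometry components.

```python
import math
import random

def sample_gr_magnitude(b, m_min, m_max, rng):
    """Inverse-CDF sample from a doubly truncated Gutenberg-Richter law:
    F(m) = (1 - exp(-beta (m - m_min))) / (1 - exp(-beta (m_max - m_min)))."""
    beta = b * math.log(10.0)
    c = 1.0 - math.exp(-beta * (m_max - m_min))
    u = rng.random()
    return m_min - math.log(1.0 - u * c) / beta

def synthetic_catalog(rate_per_yr, years, b, m_min, m_max, seed=0):
    """Minimal time-independent synthetic catalog: exponential inter-event
    times (Poisson process) paired with G-R magnitudes."""
    rng = random.Random(seed)
    t, events = 0.0, []
    while True:
        t += rng.expovariate(rate_per_yr)
        if t > years:
            break
        events.append((t, sample_gr_magnitude(b, m_min, m_max, rng)))
    return events

cat = synthetic_catalog(rate_per_yr=5.0, years=100.0, b=1.0, m_min=4.0, m_max=7.5)
```

Ground-motion parameters can then be computed directly for every event in each such catalog, which is the "computations directly on all earthquakes" approach the abstract describes.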




