synthetic catalog
Recently Published Documents


Total documents: 9 (last five years: 3)
H-index: 2 (last five years: 0)

Author(s):  
Luis Ceferino ◽  
Percy Galvez ◽  
Jean-Paul Ampuero ◽  
Anne Kiremidjian ◽  
Gregory Deierlein ◽  
...  

ABSTRACT This article introduces a framework to supplement short historical catalogs with synthetic catalogs and determine large earthquakes’ recurrence. For this assessment, we developed a parameter estimation technique for a probabilistic earthquake occurrence model that captures time and space interactions between large mainshocks. The technique is based on a two-step Bayesian update that uses a synthetic catalog from physics-based simulations for initial parameter estimation and then the historical catalog for further calibration, fully characterizing parameter uncertainty. The article also provides a formulation to combine multiple synthetic catalogs according to their likelihood of representing empirical earthquake stress drops and Global Positioning System-inferred interseismic coupling. We applied this technique to analyze large-magnitude earthquakes’ recurrence along 650 km of the subduction fault’s interface located offshore Lima, Peru. We built nine 2000 yr long synthetic catalogs using quasi-dynamic earthquake cycle simulations based on the rate-and-state friction law to supplement the 450 yr long historical catalog. When the synthetic catalogs are combined with the historical catalog without propagating their uncertainty, we found average relative reductions larger than 90% in the recurrence parameters’ uncertainty. When we propagated the physics-based simulations’ uncertainty to the posterior, the reductions in uncertainty decreased to 60%–70%. In two Bayesian assessments, we then show that using synthetic catalogs results in higher parameter uncertainty reductions than using only the historical catalog (69% vs. 60% and 83% vs. 80%), demonstrating that synthetic catalogs can be effectively combined with historical data, especially in tectonic regions with short historical catalogs. Finally, we show the implications of these results for time-dependent seismic hazard.
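The two-step Bayesian update described in this abstract can be illustrated with a simple conjugate model. A minimal sketch, assuming a Poisson occurrence rate for large events with a gamma prior, and hypothetical event counts chosen only to mirror the catalog lengths mentioned above (2000 yr synthetic, 450 yr historical) — not the paper's actual data or model:

```python
import numpy as np

def gamma_update(alpha, beta, n_events, t_years):
    """Conjugate gamma update for a Poisson occurrence rate (events/yr)."""
    return alpha + n_events, beta + t_years

# Step 1: initial parameter estimation from a synthetic catalog
# (hypothetical counts: 40 large events in a 2000 yr simulation)
alpha0, beta0 = 1.0, 0.0                  # vague starting prior
a1, b1 = gamma_update(alpha0, beta0, n_events=40, t_years=2000.0)

# Step 2: further calibration with the historical catalog
# (hypothetical counts: 8 large events in 450 yr)
a2, b2 = gamma_update(a1, b1, n_events=8, t_years=450.0)

mean = a2 / b2            # posterior mean rate (events/yr)
cv = 1.0 / np.sqrt(a2)    # relative uncertainty shrinks as catalogs accumulate
```

The point of the sketch is the mechanism, not the numbers: each catalog tightens the posterior, which is why the article reports large reductions in recurrence-parameter uncertainty when synthetic catalogs supplement the short historical record.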



2021 ◽  
Author(s):  
Mariana Belferman ◽  
Amotz Agnon ◽  
Regina Katsman ◽  
Zvi Ben-Avraham

Abstract. Seismicity triggered by water level changes in reservoirs and lakes is usually studied from well-documented contemporary records. Can such triggering be explored on a historical time scale, when the data on water level fluctuations in historic lakes and the earthquake catalogs suffer from severe uncertainties? These uncertainties stem from the different nature of the data gathered, the methods used, and their resolution. In this article, we considerably improve the correlation between the continuous record of historic water level reconstructions at the Dead Sea and discrete seismicity patterns in the area over the past two millennia. Constrained by the data from previous studies, we generate an ensemble of random water level curves and choose the curve that best correlates with the historical record of seismic stress release in the Dead Sea, as reflected in the destruction in Jerusalem. We then numerically simulate a synthetic earthquake catalog using this curve. The earthquakes of this synthetic catalog show impressive agreement with historic earthquake records from the field. We demonstrate for the first time that water level changes correlate well with the observed recurrence interval record of historic earthquakes.
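The ensemble-selection step described here — generate many candidate water-level curves, keep the one best correlated with the seismic record — can be sketched in a few lines. Everything below is hypothetical stand-in data (a synthetic "destruction" series and random-walk level curves), not the study's reconstructions:

```python
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(2000)                              # past two millennia

# Hypothetical stand-in for the discrete stress-release / destruction record
destruction = np.sin(years / 300.0) + 0.1 * rng.standard_normal(years.size)

def random_level_curve(rng, n):
    """One candidate water-level history, modeled as a random walk."""
    return np.cumsum(rng.standard_normal(n))

# Ensemble of candidate curves; keep the one best correlated with the record
ensemble = [random_level_curve(rng, years.size) for _ in range(500)]
corrs = [np.corrcoef(curve, destruction)[0, 1] for curve in ensemble]
best = ensemble[int(np.argmax(corrs))]
```

In the study itself the candidate curves are constrained by prior reconstructions rather than free random walks, and the selected curve then drives a numerical earthquake-cycle simulation.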



2020 ◽  
Vol 638 ◽  
pp. A94 ◽  
Author(s):  
A. Olejak ◽  
K. Belczynski ◽  
T. Bulik ◽  
M. Sobolewska

Aims. We present an open-access database that includes a synthetic catalog of black holes (BHs) in the Milky Way, divided into its disk, bulge, and halo components. Methods. To calculate the evolution of single and binary stars, we used the updated population synthesis code StarTrack. We applied a new model of the star formation history and chemical evolution of the Galactic disk, bulge, and halo that was synthesized from observational and theoretical data. This model can be easily employed for other studies of population evolution. Results. We find that the current Milky Way (disk + bulge + halo) contains about 1.2 × 10⁸ single BHs with an average mass of about 14 M⊙, and 9.3 × 10⁶ BHs in binary systems with an average mass of 19 M⊙. We present basic statistical properties of the BH population in the three Galactic components, such as the distributions of BH masses and velocities, and the numbers of BH binary systems in different evolutionary configurations. Conclusions. The metallicity of a stellar population has a significant effect on the final BH mass through the stellar winds. The most massive single BH in our simulation, of 113 M⊙, originates from a merger of a BH and a helium star in a low-metallicity stellar environment in the Galactic halo. We estimate that only ∼0.006% of the total Galactic halo mass (including dark matter) can be hidden in the form of stellar-origin BHs. These BHs cannot be detected by current observational surveys. We calculated the merger rates of current Galactic double compact objects (DCOs) for the two common-envelope models considered: ∼3–81 Myr⁻¹ for BH-BH, ∼1–9 Myr⁻¹ for BH-neutron star (NS), and ∼14–59 Myr⁻¹ for NS-NS systems. We show the evolution of the DCO merger rates from the formation of the Milky Way to the present with the new star formation model of the Galaxy.



2019 ◽  
Author(s):  
Xinquan Zheng ◽  
Yuan Liu ◽  
Yongjie Huang ◽  
Philippe Enkababian


2013 ◽  
Vol 20 (1) ◽  
pp. 143-162 ◽  
Author(s):  
S. J. Nanda ◽  
K. F. Tiampo ◽  
G. Panda ◽  
L. Mansinha ◽  
N. Cho ◽  
...  

Abstract. In this paper we propose a tri-stage cluster identification model that combines a simple single-iteration distance algorithm with an iterative K-means algorithm. In this study of earthquake seismicity, the model uses event location, time and magnitude information from earthquake catalog data to efficiently classify events as either background seismicity or mainshock and aftershock sequences. Tests on a synthetic seismicity catalog demonstrate the efficiency of the proposed model in terms of accuracy (94.81% for background events and 89.46% for aftershocks). The close agreement between the lambda and cumulative plots for the ideal synthetic catalog and those generated by the proposed model also supports the accuracy of the technique. The model design is flexible enough to allow proper selection of location and magnitude ranges, depending upon the nature of the mainshocks present in the catalog. The effectiveness of the proposed model is also evaluated by classifying events in three historic catalogs: California, Japan and Indonesia. As expected, for both the synthetic and historic catalog analyses, the density of events classified as background is almost uniform throughout the region, whereas the density of aftershock events is higher near the mainshocks.
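The clustering core of such a model — assign events to nearby mainshock locations, then refine iteratively with K-means — can be sketched on spatial coordinates alone. The data below are hypothetical (two synthetic aftershock clusters plus uniform background); the paper's full model additionally filters on time and magnitude:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical events: two aftershock clusters around mainshocks plus background
cluster_a = rng.normal([0.0, 0.0], 0.3, size=(50, 2))
cluster_b = rng.normal([5.0, 5.0], 0.3, size=(50, 2))
background = rng.uniform(-2.0, 7.0, size=(40, 2))
events = np.vstack([cluster_a, cluster_b, background])

def kmeans(points, centers, iters=20):
    """Iterative K-means refinement of initial cluster centers."""
    for _ in range(iters):
        # Distance from every event to every center, then nearest-center labels
        dists = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        centers = np.array([points[labels == k].mean(axis=0)
                            for k in range(len(centers))])
    return labels, centers

# Seed with assumed mainshock locations; the first pass is the single-iteration
# distance assignment, subsequent passes refine the centers
mainshocks = np.array([[0.5, 0.5], [4.5, 4.5]])
labels, centers = kmeans(events, mainshocks)
```

Events far from any refined center (here, the uniform background) would then be separated out as background seismicity by a distance or magnitude threshold in the later stages.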



2012 ◽  
Vol 28 (2) ◽  
pp. 553-571 ◽  
Author(s):  
Jim Cousins ◽  
Geoff Thomas ◽  
Dave Heron ◽  
Warwick Smith

Wellington, the capital of New Zealand, has both high seismic and high post-earthquake fire risk because it straddles the highly active Wellington Fault, has many closely spaced wooden buildings, and has a fragile water supply system. Repeated modeling of a Wellington Fault earthquake showed that the distribution of fire losses was much broader than that of the shaking losses, so that while fire losses were usually much smaller than the preceding shaking losses, they could occasionally be much greater than the shaking losses. Probabilistic modeling using a synthetic catalog of earthquakes gave estimates of post-earthquake fire losses in Wellington that were relatively minor for return periods up to 1,000 years, equal to the shaking losses at about a 1,400-year level, and that dominated the losses for 2,000-year and longer return periods.
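The probabilistic step — reading losses at a given return period off a long synthetic catalog — can be illustrated with an empirical exceedance calculation. The lognormal per-event losses below are hypothetical stand-ins (the heavier-tailed fire distribution mimics the broader fire-loss distribution described above), not the study's model:

```python
import numpy as np

rng = np.random.default_rng(2)

catalog_years = 100_000   # hypothetical length of the synthetic catalog
n_events = 500            # hypothetical damaging events in it
# Per-event losses: fire is broader (heavier-tailed) than shaking
shaking_losses = rng.lognormal(mean=3.0, sigma=0.5, size=n_events)
fire_losses = rng.lognormal(mean=1.5, sigma=1.8, size=n_events)

def loss_at_return_period(losses, catalog_years, rp):
    """Loss exceeded on average once every `rp` years in the catalog."""
    n_exceed = max(int(round(catalog_years / rp)), 1)
    return np.sort(losses)[-n_exceed]

shaking_curve = [loss_at_return_period(shaking_losses, catalog_years, rp)
                 for rp in (500, 1000, 2000)]
fire_curve = [loss_at_return_period(fire_losses, catalog_years, rp)
              for rp in (500, 1000, 2000)]
```

Because the fire distribution is broader, its loss curve climbs faster with return period, which is the mechanism behind fire losses overtaking shaking losses at the ~1,400-year level reported above.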


