A homogeneous earthquake catalogue for Turkey

2021 ◽  
Vol 21 (7) ◽  
pp. 2059-2073
Author(s):  
Onur Tan

Abstract. A new homogenized earthquake catalogue for Turkey is compiled for the period 1900–2018. The earthquake parameters are obtained from the Bulletin of the International Seismological Centre, which was fully updated in 2020. New conversion equations between moment magnitude and the other scales (md, ML, mb, Ms, and M) are determined using the general orthogonal regression method to build up a homogeneous catalogue, which is the essential database for seismic hazard studies. The 95 % confidence intervals are estimated using the bootstrap method with 1000 samples. The equivalent moment magnitudes (Mw*) for the entire catalogue are calculated using the magnitude relations to homogenize the catalogue. The magnitude of completeness is 2.7 Mw*. The final catalogue is not declustered or truncated at a threshold magnitude, in order to be a widely usable catalogue. It contains not only Mw* but also the average and median of the observed magnitudes for each event. In contrast to the limited earthquake parameters in previous catalogues for Turkey, 45 parameters for ∼378 000 events are presented in this study.
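
As a rough illustration of the fitting procedure described in the abstract, the sketch below fits an orthogonal (total least squares) regression between two magnitude scales and estimates 95 % confidence intervals with a 1000-sample bootstrap. The data are synthetic stand-ins, not the catalogue's magnitude pairs, and the equal-error-variance assumption and the fitted values are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(0)

def orthogonal_regression(x, y):
    """Fit y = a*x + b by total least squares (orthogonal regression,
    error-variance ratio assumed to be 1)."""
    xm, ym = x.mean(), y.mean()
    sxx = np.sum((x - xm) ** 2)
    syy = np.sum((y - ym) ** 2)
    sxy = np.sum((x - xm) * (y - ym))
    a = (syy - sxx + np.sqrt((syy - sxx) ** 2 + 4 * sxy ** 2)) / (2 * sxy)
    return a, ym - a * xm

# Synthetic ML/Mw pairs standing in for the catalogue's magnitude pairs.
ml = rng.uniform(3.0, 6.5, 500)
mw = 0.9 * ml + 0.4 + rng.normal(0.0, 0.15, ml.size)
a_hat, b_hat = orthogonal_regression(ml, mw)

# Bootstrap 95 % confidence intervals with 1000 resamples.
boot = np.empty((1000, 2))
for i in range(1000):
    idx = rng.integers(0, ml.size, ml.size)
    boot[i] = orthogonal_regression(ml[idx], mw[idx])
lo, hi = np.percentile(boot, [2.5, 97.5], axis=0)
print(f"slope {a_hat:.3f} [{lo[0]:.3f}, {hi[0]:.3f}]")
print(f"intercept {b_hat:.3f} [{lo[1]:.3f}, {hi[1]:.3f}]")
```

The closed-form slope is the standard orthogonal-regression solution; a full treatment would also weight the two scales by their (unequal) measurement errors.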

2020 ◽  
Author(s):  
Onur Tan

Abstract. A new earthquake catalogue for Turkey and the surrounding region (32°–47° N, 20°–52° E) is compiled for the period 1900–2017. The earthquake parameters are obtained from the Bulletin of the International Seismological Centre, which was fully updated in 2020. New conversion equations between moment magnitude and the other scales (md, ML, mb, Ms and M) are determined using the general orthogonal regression method to build up a homogeneous catalogue, which is the essential dataset for seismic hazard studies. The 95 % confidence intervals are estimated using the bootstrap method with 1000 samples. The equivalent moment magnitudes (Mw*) for the entire catalogue are calculated using the magnitude relations to homogenise the catalogue. The magnitude of completeness is generally 2.9 Mw* for Turkey and 3.0–3.2 Mw* for Greece. The final dataset is not declustered or truncated at a threshold magnitude, in order to produce a widely usable catalogue. It contains not only Mw* but also the average and median of the observed magnitudes for each event. In contrast to the limited earthquake parameters in previous catalogues, 45 parameters for approximately 700 000 events that occurred in a wide area from the Balkans to the Caucasus are presented.


1992 ◽  
Vol 82 (1) ◽  
pp. 104-119
Author(s):  
Michèle Lamarre ◽  
Brent Townshend ◽  
Haresh C. Shah

Abstract This paper describes a methodology to assess the uncertainty in seismic hazard estimates at particular sites. A variant of the bootstrap statistical method is used to combine the uncertainty due to earthquake catalog incompleteness, earthquake magnitude, and recurrence and attenuation models used. The uncertainty measure is provided in the form of a confidence interval. Comparisons of this method applied to various sites in California with previous studies are used to confirm the validity of the method.
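
A minimal sketch of the idea of combining uncertainty sources in a single bootstrap loop: each replicate resamples the catalogue (sampling/incompleteness uncertainty) and perturbs every magnitude within an assumed measurement error, then re-estimates a hazard-related parameter. A Gutenberg–Richter b-value via Aki's maximum-likelihood estimator is used here as a stand-in for the paper's hazard estimate; the data and error sizes are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(4)

mmin = 4.0   # assumed completeness magnitude
b_true = 1.0
# Hypothetical catalogue: Gutenberg-Richter distributed magnitudes above mmin,
# each with an assumed measurement uncertainty of 0.2 magnitude units.
mags = mmin + rng.exponential(1.0 / (b_true * np.log(10)), 300)
sig = np.full(mags.size, 0.2)

# Each replicate combines two uncertainty sources: resample the events, then
# perturb each magnitude within its error, and re-estimate the b-value with
# Aki's maximum-likelihood formula b = log10(e) / (mean(M) - Mmin).
boot = np.empty(1000)
for i in range(1000):
    idx = rng.integers(0, mags.size, mags.size)
    m = mags[idx] + rng.normal(0.0, sig[idx])
    m = m[m >= mmin]                      # keep only events above completeness
    boot[i] = np.log10(np.e) / (m.mean() - mmin)
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"b = {boot.mean():.2f}, 95 % CI [{lo:.2f}, {hi:.2f}]")
```

The resulting percentile interval reflects both uncertainty sources at once, which is the spirit of the combined confidence-interval measure described above.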


2018 ◽  
Vol 56 (1) ◽  
pp. 65-72
Author(s):  
Sudhir Rajaure ◽  
Lalu Prasad Paudel

We have prepared a comprehensive earthquake catalogue for Nepal and its adjoining region. The catalogue contains magnitude-homogenized independent earthquakes of magnitude (Mw) between 4.0 and 8.5 that occurred between 1100 AD and 2018 AD. It contains the date, time, latitude, longitude, depth, and magnitude of earthquakes, which are required in the study of seismic activity, tectonics, and seismic hazard. Primary earthquake catalogues containing instrumentally recorded earthquake data dating back to 1900 AD were collected from the International Seismological Centre (ISC, 2015) and the United States Geological Survey (USGS). These primary catalogues of instrumentally recorded earthquakes were supplemented by historical earthquakes that occurred before 1900 AD, as reported in the published literature. The collected primary catalogues were compiled and processed to develop a comprehensive catalogue, which is expected to serve as a basic database for the study of seismic activity and seismic hazard in Nepal and the adjacent area.


CAUCHY ◽  
2018 ◽  
Vol 5 (3) ◽  
pp. 95
Author(s):  
Ovi Delviyanti Saputri ◽  
Ferra Yanuar ◽  
Dodi Devianto

<span lang="DE">Quantile regression is a regression method that separates or divides the data into particular quantiles by minimizing the sum of the absolute values of asymmetric errors, which makes it suitable when classical assumptions, including the absence of autocorrelation, are not fulfilled. The accuracy of the resulting model parameters is tested using the bootstrap method, a parameter estimation method that resamples from the original sample R times. The bootstrap confidence interval is then used as a consistency test for the estimator constructed by the quantile regression method, and to test the unbiasedness of the quantile regression method. The data in this test are replicated 10 times. The bias is calculated as the difference between the quantile estimate and the bootstrap estimate. The quantile estimation method is said to be unbiased if the standard deviation of the bias is less than the bootstrap standard deviation. This study shows that the estimates obtained with quantile regression lie within the bootstrap percentile confidence interval and that 10 replications produce better estimates than other replication sizes. The quantile regression method in this study is also able to produce unbiased parameter estimates.</span>
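
A minimal sketch of the procedure the abstract describes, on synthetic data: median (q = 0.5) regression by minimising the pinball (asymmetric absolute error) loss, followed by a bootstrap percentile interval and a bias check. The loss implementation, the starting point, and the choice of R = 200 resamples are assumptions for illustration, not the paper's settings.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)

def pinball(params, x, y, q):
    # Asymmetric absolute error: quantile regression's loss function.
    a, b = params
    r = y - (a * x + b)
    return np.sum(np.maximum(q * r, (q - 1) * r))

def quantile_fit(x, y, q=0.5):
    # Start from the least-squares line, then minimise the pinball loss.
    x0 = np.polyfit(x, y, 1)
    return minimize(pinball, x0, args=(x, y, q), method="Nelder-Mead").x

# Synthetic data with independent noise; true line is y = 2x + 1.
x = rng.uniform(0.0, 10.0, 200)
y = 2.0 * x + 1.0 + rng.normal(0.0, 1.0, x.size)
a_hat, b_hat = quantile_fit(x, y, q=0.5)

# Bootstrap: refit on R resamples, form a percentile interval, check bias.
R = 200
boot = np.empty((R, 2))
for i in range(R):
    idx = rng.integers(0, x.size, x.size)
    boot[i] = quantile_fit(x[idx], y[idx], q=0.5)
lo, hi = np.percentile(boot, [2.5, 97.5], axis=0)
bias = boot.mean(axis=0) - np.array([a_hat, b_hat])
print(f"slope {a_hat:.3f} in [{lo[0]:.3f}, {hi[0]:.3f}], bias {bias[0]:+.4f}")
```

The check at the end mirrors the abstract's criterion: the quantile estimate should lie inside the bootstrap percentile interval, with small bias relative to the bootstrap spread.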


2019 ◽  
Vol 1 (1) ◽  
pp. 724-729 ◽  
Author(s):  
Renata Dwornicka ◽  
Norbert Radek ◽  
Jacek Pietraszek

Abstract The paper considers the use of the bootstrap method to improve the determination of confidence intervals identified by the DOE (design of experiments) procedure. Two different approaches have been used: one appropriate for factorial designs and the other relevant to the response surface methodology. Both approaches were tested on real experimental datasets and compared with the results obtained from classical statistical expressions based on well-known asymptotic formulas derived from the t distribution.
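
The comparison the paper makes can be sketched on a toy sample: a classical 95 % interval for a mean response from the t distribution versus a bootstrap percentile interval. The data are hypothetical, not the paper's experimental datasets.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# A stand-in response sample from one experimental run (hypothetical data).
sample = rng.normal(10.0, 2.0, 30)

# Classical 95 % interval from the asymptotic t-distribution formula.
m = sample.mean()
se = sample.std(ddof=1) / np.sqrt(sample.size)
tcrit = stats.t.ppf(0.975, df=sample.size - 1)
t_lo, t_hi = m - tcrit * se, m + tcrit * se

# Bootstrap percentile interval: resample with replacement, take means.
boot = np.array([rng.choice(sample, sample.size).mean() for _ in range(2000)])
b_lo, b_hi = np.percentile(boot, [2.5, 97.5])
print(f"t-based:   [{t_lo:.2f}, {t_hi:.2f}]")
print(f"bootstrap: [{b_lo:.2f}, {b_hi:.2f}]")
```

For well-behaved data the two intervals nearly coincide; the bootstrap's advantage appears when the residual distribution departs from the assumptions behind the t-based formula.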


Author(s):  
D Spallarossa ◽  
M Cattaneo ◽  
D Scafidi ◽  
M Michele ◽  
L Chiaraluce ◽  
...  

Summary The 2016–17 central Italy earthquake sequence began with the first mainshock near the town of Amatrice on August 24 (MW 6.0) and was followed by two subsequent large events near Visso on October 26 (MW 5.9) and Norcia on October 30 (MW 6.5), plus a cluster of 4 events with MW &gt; 5.0 within a few hours on January 18, 2017. The affected area had been monitored before the sequence started by the permanent Italian National Seismic Network (RSNC) and was enhanced during the sequence by temporary stations deployed by the National Institute of Geophysics and Volcanology and the British Geological Survey. By the middle of September, there was a dense network of 155 stations, with a mean separation in the epicentral area of 6–10 km, comparable to the most likely earthquake depth range in the region. This network configuration was kept stable for an entire year, producing 2.5 TB of continuous waveform recordings. Here we describe how these data were used to develop a large and comprehensive earthquake catalogue using the Complete Automatic Seismic Processor (CASP) procedure. This procedure detected more than 450,000 events in the year following the first mainshock and determined their phase arrival times through an advanced picker engine (RSNI-Picker2), producing a set of about 7 million P- and 10 million S-wave arrival times. These were then used to locate the events using a non-linear location (NLL) algorithm, a 1D velocity model calibrated for the area, and station corrections, and then to compute their local magnitudes (ML). The procedure was validated by comparing the derived phase picks and earthquake parameters with a handpicked reference catalogue (hereinafter referred to as 'RefCat'). The automated procedure takes less than 12 hours on an Intel Core-i7 workstation to analyse the primary waveform data and to detect and locate 3000 events on the most seismically active day of the sequence.
This proves the concept that the CASP algorithm can provide effectively real-time data for input into daily operational earthquake forecasts. The results show significant improvements compared to RefCat, which was obtained for the same period using manual phase picks. The number of detected and located events is higher (from 84,401 to 450,000), the magnitude of completeness is lower (from ML 1.4 to 0.6), and the number of phase picks is greater, with an average of 72 picked arrivals for an ML = 1.4 event compared with 30 phases for RefCat using manual phase picking. These propagate into formal uncertainties of ±0.9 km in epicentral location and ±1.5 km in depth for the vast majority of events in the enhanced catalogue. Together, these provide a significant improvement in the resolution of fine structures such as local planar structures and clusters, in particular the identification of shallow events occurring in parts of the crust previously thought to be inactive. The lower completeness magnitude provides a rich dataset for the development and testing of techniques for analysing the evolution of seismic sequences, including real-time operational monitoring of the b-value, time-dependent hazard evaluation, and aftershock forecasting.


Universe ◽  
2021 ◽  
Vol 7 (1) ◽  
pp. 8
Author(s):  
Alessandro Montoli ◽  
Marco Antonelli ◽  
Brynmor Haskell ◽  
Pierre Pizzochero

A common way to calculate the glitch activity of a pulsar is an ordinary linear regression of the observed cumulative glitch history. This method, however, is likely to underestimate the errors on the activity, as it implicitly assumes a (long-term) linear dependence between glitch sizes and waiting times, as well as equal variance, i.e., homoscedasticity, in the fit residuals; neither assumption is well justified by pulsar data. In this paper, we review the extrapolation of the glitch activity parameter and explore two alternatives: relaxing the homoscedasticity hypothesis in the linear fit, and using the bootstrap technique. We find a larger uncertainty in the activity with respect to that obtained by ordinary linear regression, especially for those objects in which the activity can be significantly affected by a single glitch. We discuss how this affects the theoretical upper bound on the moment of inertia associated with the region of a neutron star containing the superfluid reservoir of angular momentum released in a stationary sequence of glitches. We find that this upper bound is less tight if one considers the uncertainty on the activity estimated with the bootstrap method and allows for models in which the superfluid reservoir is entirely in the crust.
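
A minimal sketch of the bootstrap alternative on a synthetic glitch history: the activity is the slope of an ordinary linear fit to the cumulative glitch sizes, and resampling the (waiting time, size) pairs shows how a single large glitch widens the interval. All numbers are illustrative assumptions, not pulsar data.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical glitch history: waiting times (yr) and glitch sizes (arbitrary
# units), with one outsized glitch in the middle of the record that dominates
# the activity estimate.
waits = rng.exponential(1.0, 25)
sizes = rng.exponential(1.0, 25)
sizes[12] = 15.0

t = np.cumsum(waits)
cum = np.cumsum(sizes)

# Ordinary linear regression of the cumulative glitch history:
# the fitted slope is the glitch activity.
activity = np.polyfit(t, cum, 1)[0]

# Bootstrap: resample (waiting time, size) pairs, rebuild the cumulative
# history and refit, giving a distribution of activity values.
boot = np.empty(1000)
for i in range(1000):
    idx = rng.integers(0, waits.size, waits.size)
    boot[i] = np.polyfit(np.cumsum(waits[idx]), np.cumsum(sizes[idx]), 1)[0]
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"activity {activity:.2f}, 95 % CI [{lo:.2f}, {hi:.2f}]")
```

Because the large glitch may appear zero, one, or several times in a resample, the bootstrap interval is much wider than the formal ordinary-least-squares error, which is the effect discussed in the abstract.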

