A Homogeneous Earthquake Catalogue for Turkey and Surrounding Region

2020 ◽  
Author(s):  
Onur Tan

Abstract. A new earthquake catalogue for Turkey and the surrounding region (32°–47° N, 20°–52° E) is compiled for the period 1900–2017. The earthquake parameters are obtained from the Bulletin of the International Seismological Centre, which was fully updated in 2020. New conversion equations between moment magnitude and the other scales (md, ML, mb, Ms and M) are determined using the general orthogonal regression method to build up a homogeneous catalogue, which is the essential dataset for seismic hazard studies. The 95 % confidence intervals are estimated using the bootstrap method with 1000 samples. The equivalent moment magnitudes (Mw*) for the entire catalogue are calculated using the magnitude relations to homogenise the catalogue. The magnitude of completeness is generally 2.9 Mw* for Turkey and 3.0–3.2 Mw* for Greece. The final dataset is not declustered or truncated at a threshold magnitude, in order to produce a widely usable catalogue. It contains not only Mw* but also the average and median of the observed magnitudes for each event. In contrast to the limited earthquake parameters in previous catalogues, 45 parameters are presented for approximately 700 000 events that occurred in a wide area from the Balkans to the Caucasus.

2021 ◽  
Vol 21 (7) ◽  
pp. 2059-2073
Author(s):  
Onur Tan

Abstract. A new homogenized earthquake catalogue for Turkey is compiled for the period 1900–2018. The earthquake parameters are obtained from the Bulletin of the International Seismological Centre, which was fully updated in 2020. New conversion equations between moment magnitude and the other scales (md, ML, mb, Ms, and M) are determined using the general orthogonal regression method to build up a homogeneous catalogue, which is the essential database for seismic hazard studies. The 95 % confidence intervals are estimated using the bootstrap method with 1000 samples. The equivalent moment magnitudes (Mw*) for the entire catalogue are calculated using the magnitude relations to homogenize the catalogue. The magnitude of completeness is 2.7 Mw*. The final catalogue is not declustered or truncated at a threshold magnitude, in order to remain widely usable. It contains not only Mw* but also the average and median of the observed magnitudes for each event. In contrast to the limited earthquake parameters in previous catalogues for Turkey, the 45 parameters of ∼378 000 events are presented in this study.
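The two steps described in the abstracts above, a general orthogonal regression between magnitude scales and a percentile bootstrap with 1000 resamples for the 95 % confidence intervals, can be sketched as follows. This is a minimal illustration on synthetic data: the paired magnitudes, the assumed relation Mw = 0.67 Ms + 2.0, and the noise levels are all invented for the example, not taken from the catalogue.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for paired magnitude observations (e.g. Ms and Mw);
# the "true" relation below is assumed purely for illustration.
n = 500
m_true = rng.uniform(3.0, 7.0, n)
mw = 0.67 * m_true + 2.0 + rng.normal(0.0, 0.15, n)
ms = m_true + rng.normal(0.0, 0.15, n)        # errors in both variables

def orthogonal_fit(x, y):
    """General orthogonal regression with equal error variances:
    the fitted line follows the first principal component."""
    cov = np.cov(x, y)
    _, eigvecs = np.linalg.eigh(cov)
    vx, vy = eigvecs[:, -1]                    # direction of largest variance
    slope = vy / vx
    return slope, y.mean() - slope * x.mean()

slope, intercept = orthogonal_fit(ms, mw)

# Percentile bootstrap with 1000 resamples, as in the abstract.
boot = np.empty((1000, 2))
for i in range(1000):
    idx = rng.integers(0, n, n)
    boot[i] = orthogonal_fit(ms[idx], mw[idx])
lo, hi = np.percentile(boot, [2.5, 97.5], axis=0)
print(f"slope     {slope:.3f}  95% CI [{lo[0]:.3f}, {hi[0]:.3f}]")
print(f"intercept {intercept:.3f}  95% CI [{lo[1]:.3f}, {hi[1]:.3f}]")
```

Unlike ordinary least squares, the orthogonal fit treats both magnitude scales as error-prone, which is why it is preferred for converting between scales.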


CAUCHY ◽  
2018 ◽  
Vol 5 (3) ◽  
pp. 95
Author(s):  
Ovi Delviyanti Saputri ◽  
Ferra Yanuar ◽  
Dodi Devianto

<span lang="DE">Quantile regression is a regression method that divides the data into quantiles and estimates parameters by minimizing a sum of asymmetrically weighted absolute errors, making it robust when classical assumptions, including the absence of autocorrelation, are not fulfilled. The accuracy of the resulting model parameters is tested using the bootstrap method, which re-estimates the parameters by resampling from the original sample R times. The bootstrap confidence interval is then used to test the consistency of the estimator constructed by the quantile regression method and to check whether the quantile regression estimates are unbiased. The data in this test are replicated 10 times. The bias is calculated as the difference between the quantile estimate and the bootstrap estimate, and the quantile estimation method is considered unbiased if this bias is smaller than the bootstrap standard deviation. This study shows that the values estimated by quantile regression lie within the bootstrap percentile confidence interval, and that 10 replications produce better estimates than other replication sizes. The quantile regression method in this study also produces unbiased parameter estimates.</span>
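The procedure described above, fitting a quantile regression and checking the estimate against a bootstrap percentile interval, can be sketched with a direct minimization of the pinball loss. Everything here is illustrative: the heteroscedastic data, the model y = 4 + 1.5x, and the choice of the median (τ = 0.5) are assumptions for the example, not the study's data.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)

# Illustrative heteroscedastic data (variance grows with x).
n = 200
x = rng.uniform(0.0, 10.0, n)
y = 4.0 + 1.5 * x + rng.normal(0.0, 1.0 + 0.3 * x)

def pinball(params, x, y, tau):
    """Asymmetric absolute-error (pinball) loss minimized by quantile regression."""
    a, b = params
    r = y - (a + b * x)
    return np.sum(np.where(r >= 0.0, tau * r, (tau - 1.0) * r))

def quantile_fit(x, y, tau):
    res = minimize(pinball, x0=[0.0, 1.0], args=(x, y, tau), method="Nelder-Mead")
    return res.x

a_hat, b_hat = quantile_fit(x, y, tau=0.5)     # median regression

# Bootstrap with R = 10 replications, mirroring the replication count in
# the abstract; the percentile bounds give the bootstrap interval.
R = 10
boot = np.empty((R, 2))
for i in range(R):
    idx = rng.integers(0, n, n)
    boot[i] = quantile_fit(x[idx], y[idx], tau=0.5)
lo, hi = np.percentile(boot, [2.5, 97.5], axis=0)
print(f"slope {b_hat:.3f}, bootstrap interval [{lo[1]:.3f}, {hi[1]:.3f}]")
```

In practice R = 10 is very small for a percentile interval; it is used here only to mirror the replication count reported in the abstract.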


Author(s):  
D Spallarossa ◽  
M Cattaneo ◽  
D Scafidi ◽  
M Michele ◽  
L Chiaraluce ◽  
...  

Summary The 2016–17 central Italy earthquake sequence began with the first mainshock near the town of Amatrice on August 24 (MW 6.0), and was followed by two subsequent large events near Visso on October 26 (MW 5.9) and Norcia on October 30 (MW 6.5), plus a cluster of 4 events with MW &gt; 5.0 within a few hours on January 18, 2017. The affected area had been monitored before the sequence started by the permanent Italian National Seismic Network (RSNC), and was enhanced during the sequence by temporary stations deployed by the National Institute of Geophysics and Volcanology and the British Geological Survey. By the middle of September, there was a dense network of 155 stations, with a mean separation in the epicentral area of 6–10 km, comparable to the most likely earthquake depth range in the region. This network configuration was kept stable for an entire year, producing 2.5 TB of continuous waveform recordings. Here we describe how these data were used to develop a large and comprehensive earthquake catalogue using the Complete Automatic Seismic Processor (CASP) procedure. This procedure detected more than 450,000 events in the year following the first mainshock, and determined their phase arrival times through an advanced picker engine (RSNI-Picker2), producing a set of about 7 million P- and 10 million S-wave arrival times. These were then used to locate the events using a non-linear location (NLL) algorithm, a 1D velocity model calibrated for the area, and station corrections, and then to compute their local magnitudes (ML). The procedure was validated by comparing the derived phase picks and earthquake parameters with a handpicked reference catalogue (hereinafter referred to as 'RefCat'). The automated procedure takes less than 12 hours on an Intel Core-i7 workstation to analyse the primary waveform data and to detect and locate 3000 events on the most seismically active day of the sequence.
This proves the concept that the CASP algorithm can provide effectively real-time data for input into daily operational earthquake forecasts. The results show significant improvements compared to RefCat, which was obtained for the same period using manual phase picks. The number of detected and located events is higher (from 84,401 to 450,000), the magnitude of completeness is lower (from ML 1.4 to 0.6), and the number of phase picks is greater, with an average of 72 picked arrivals for an ML = 1.4 event compared with 30 phases for RefCat using manual picking. These propagate into formal uncertainties of ±0.9 km in epicentral location and ±1.5 km in depth for the vast majority of events in the enhanced catalogue. Together, these provide a significant improvement in the resolution of fine structures such as local planar structures and clusters, and in particular the identification of shallow events occurring in parts of the crust previously thought to be inactive. The lower completeness magnitude provides a rich dataset for developing and testing techniques for analysing the evolution of seismic sequences, including real-time operational monitoring of the b-value, time-dependent hazard evaluation, and aftershock forecasting.
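The closing sentence mentions operational monitoring of the b-value, for which the standard tool is Aki's maximum-likelihood estimator with a binning correction. A minimal sketch on synthetic Gutenberg-Richter data follows; the catalogue itself is not reproduced, and only the completeness magnitude 0.6 is taken from the summary above.

```python
import numpy as np

def b_value_mle(mags, mc, dm=0.1):
    """Aki-style maximum-likelihood b-value for magnitudes binned at dm,
    keeping only events whose bin centre is at or above the completeness
    magnitude mc; the dm/2 term corrects for the binning."""
    m = np.asarray(mags)
    m = m[m >= mc - 1e-9]
    return np.log10(np.e) / (m.mean() - (mc - dm / 2.0))

# Synthetic Gutenberg-Richter sample with b = 1 (illustration only).
rng = np.random.default_rng(2)
mc, dm, true_b = 0.6, 0.1, 1.0
m = (mc - dm / 2.0) + rng.exponential(np.log10(np.e) / true_b, 100_000)
m = np.round(m / dm) * dm             # bin the magnitudes at 0.1 units
b_hat = b_value_mle(m, mc)
print(f"estimated b-value: {b_hat:.3f}")
```

Lowering the completeness magnitude, as the CASP catalogue does, increases the number of events entering the mean and therefore tightens estimates of this kind.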


Universe ◽  
2021 ◽  
Vol 7 (1) ◽  
pp. 8
Author(s):  
Alessandro Montoli ◽  
Marco Antonelli ◽  
Brynmor Haskell ◽  
Pierre Pizzochero

A common way to calculate the glitch activity of a pulsar is an ordinary linear regression of the observed cumulative glitch history. This method, however, is likely to underestimate the errors on the activity, as it implicitly assumes a (long-term) linear dependence between glitch sizes and waiting times, as well as equal variance (i.e., homoscedasticity) in the fit residuals; neither assumption is well justified by pulsar data. In this paper, we review the extrapolation of the glitch activity parameter and explore two alternatives: the relaxation of the homoscedasticity hypothesis in the linear fit and the use of the bootstrap technique. We find a larger uncertainty in the activity with respect to that obtained by ordinary linear regression, especially for those objects in which it can be significantly affected by a single glitch. We discuss how this affects the theoretical upper bound on the moment of inertia associated with the region of a neutron star containing the superfluid reservoir of angular momentum released in a stationary sequence of glitches. We find that this upper bound is less tight if one considers the uncertainty on the activity estimated with the bootstrap method, and allows for models in which the superfluid reservoir is entirely in the crust.
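The contrast drawn above, an ordinary-least-squares slope versus a bootstrap spread of refitted slopes, can be sketched as follows. The glitch history is invented for illustration, with one dominant glitch to mimic the single-glitch sensitivity the abstract describes, and the pairwise resampling scheme is a simplified assumption rather than the paper's exact procedure.

```python
import numpy as np

rng = np.random.default_rng(3)

# Illustrative glitch history: waiting times (yr) and glitch sizes in
# arbitrary units; one glitch is made much larger than the rest, the case
# in which ordinary least squares most underestimates the uncertainty.
n_glitches = 12
waits = rng.exponential(1.5, n_glitches)
sizes = rng.exponential(1.0, n_glitches)
sizes[5] *= 20.0                              # a single dominant glitch

def activity(waits, sizes):
    """Slope of cumulative glitch size versus time (ordinary least squares)."""
    t = np.cumsum(waits)
    c = np.cumsum(sizes)
    A = np.vstack([t, np.ones_like(t)]).T
    coef, *_ = np.linalg.lstsq(A, c, rcond=None)
    return coef[0]

a_hat = activity(waits, sizes)

# Bootstrap: resample (waiting time, size) pairs, rebuild the cumulative
# history, and refit; the spread of the slopes estimates the uncertainty.
R = 2000
boot = np.empty(R)
for i in range(R):
    idx = rng.integers(0, n_glitches, n_glitches)
    boot[i] = activity(waits[idx], sizes[idx])
print(f"activity = {a_hat:.3f}, bootstrap sd = {boot.std(ddof=1):.3f}")
```

Because the dominant glitch appears in some resamples several times and in others not at all, the bootstrap spread is typically much wider than the formal OLS error, which is the paper's central point.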


1998 ◽  
Vol 217 (1) ◽  
Author(s):  
Hans Schneeberger

Summary With Efron's law-school example, the bootstrap method is compared with an alternative method, called doubling. It is shown that the mean deviation of the estimator is always smaller for the doubling method.


1992 ◽  
Vol 82 (1) ◽  
pp. 104-119
Author(s):  
Michéle Lamarre ◽  
Brent Townshend ◽  
Haresh C. Shah

Abstract This paper describes a methodology to assess the uncertainty in seismic hazard estimates at particular sites. A variant of the bootstrap statistical method is used to combine the uncertainty due to earthquake catalog incompleteness, earthquake magnitude, and recurrence and attenuation models used. The uncertainty measure is provided in the form of a confidence interval. Comparisons of this method applied to various sites in California with previous studies are used to confirm the validity of the method.


2008 ◽  
Vol 33 (3) ◽  
pp. 257-278 ◽  
Author(s):  
Yuming Liu ◽  
E. Matthew Schulz ◽  
Lei Yu

A Markov chain Monte Carlo (MCMC) method and a bootstrap method were compared in the estimation of standard errors of item response theory (IRT) true score equating. Three test form relationships were examined: parallel, tau-equivalent, and congeneric. Data were simulated based on Reading Comprehension and Vocabulary tests of the Iowa Tests of Basic Skills®. For parallel and congeneric test forms within valid IRT true score ranges, the pattern and magnitude of standard errors of IRT true score equating estimated by the MCMC method were very close to those estimated by the bootstrap method. For tau-equivalent test forms, the pattern of standard errors estimated by the two methods was also similar. Bias and mean square errors of equating produced by the MCMC method were smaller than those produced by the bootstrap method; however, standard errors were larger. In educational testing, the MCMC method may be used as an additional or alternative procedure to the bootstrap method when evaluating the precision of equating results.


Balcanica ◽  
2014 ◽  
pp. 203-219
Author(s):  
Jelena Milojkovic-Djuric

At the beginning of his diplomatic career in Constantinople in 1835, David Urquhart was instrumental in promoting the British cause by endorsing its political grand design and mercantile interests in Turkey, Greece, the Caucasian region, Crimea, Serbia and adjacent Balkan principalities. While observing the complexities of the Eastern Question, Urquhart recognized the underlying importance that Serbia had attained in the context of competing imperial interests in the Balkans. His engaged commentaries on the crucial changes in Serbian political discourse also elucidated his understanding of Serbian history and culture, past and present. Urquhart discerned a correspondence between Serbian political affairs and the situation in the region of the Caucasus and Circassia.

