Early Forecast of Maximum Amplitude due to Aftershocks by Applying Extreme Value Statistics to a Single Continuous Seismogram

Author(s):  
Kaoru Sawazaki

ABSTRACT To evaluate the exceedance probability of maximum amplitude (EPMA) due to aftershocks, I developed a forecasting scheme based on extreme value statistics applied to a single continuous seismogram of early aftershocks. By combining the general laws of aftershock activity (the Gutenberg–Richter and Omori–Utsu laws) and a ground-motion prediction equation (including source, path, and site factors), I verified that the interval maximum amplitude of a continuous seismogram of aftershocks follows the non-stationary Fréchet distribution (NFD), one of the extreme value distributions. The NFD parameters are expressed explicitly in terms of parameters commonly used in seismology. By optimizing the NFD parameters through the maximum-likelihood method and using the maximum-likelihood estimates and their covariance values, I derived the EPMA due to aftershocks based on a Bayesian approach. The performance of the EPMA was examined with Monte Carlo simulations and real seismograms. The EPMA predicted the numerically generated maximum amplitudes well, even during periods of intense seismicity in which many waveforms overlap in a seismogram. This performance was also robust for real seismograms of aftershocks of the 2008 Iwate–Miyagi Nairiku, Japan, earthquake: the maximum amplitudes observed over four days fell mostly within the 10% and 90% EPMA curves issued within 3 hr of the mainshock. The proposed method does not need to evaluate source, path, and site factors, because these factors are absorbed into the estimated NFD parameters. Because the method allows single-station processing, a seismic “network” is not required, so the proposed algorithm should be easy to implement in seismic observation systems installed at important facilities. In addition, the NFD parameters, which are estimated robustly at early lapse times, may provide important knowledge about early aftershocks.
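The core statistical step can be sketched in stationary form. The following hedged illustration (not the paper's non-stationary formulation, and with purely illustrative parameter values) fits a Fréchet distribution to simulated interval maxima by maximum likelihood and reads off an exceedance probability; `scipy.stats.invweibull` implements the Fréchet (type II extreme value) law.

```python
# Minimal sketch: maximum-likelihood fit of a stationary Frechet distribution
# to interval maximum amplitudes, then an exceedance probability from the fit.
# scipy.stats.invweibull is the Frechet (type II extreme value) distribution.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulate interval maximum amplitudes from a Frechet law with shape c = 2.
true_shape = 2.0
ima = stats.invweibull.rvs(true_shape, loc=0.0, scale=1.0, size=5000,
                           random_state=rng)

# Maximum-likelihood fit; location is fixed at 0 (amplitudes are positive).
shape_hat, loc_hat, scale_hat = stats.invweibull.fit(ima, floc=0.0)
print(f"shape {shape_hat:.2f}, scale {scale_hat:.2f}")

# Exceedance probability of a given amplitude under the fitted model.
p_exceed = stats.invweibull.sf(3.0, shape_hat, loc=loc_hat, scale=scale_hat)
print(f"P(interval maximum > 3.0) = {p_exceed:.3f}")
```

In the paper's scheme the Fréchet parameters additionally depend on lapse time through the Omori–Utsu decay, which this stationary sketch omits.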

2021 ◽  
Author(s):  
Kaoru Sawazaki

<p>Waveforms from the many aftershocks occurring immediately after a large earthquake tend to overlap in a seismogram, which makes it difficult to pick their P- and S-wave phases. Determining the hypocenters and magnitudes of the aftershocks therefore becomes difficult, and the earthquake catalog deteriorates. Using such a deteriorated catalog may lead to misevaluation of the ongoing aftershock activity. Since aftershock activity is usually most intense in the early period after a large earthquake, the need for early aftershock forecasts is most urgent exactly when the catalog is most degraded.</p><p>Several forecasting methods have been proposed to overcome this situation, using either a deteriorated automatic earthquake catalog (Omi et al., 2016, 2019) or continuous seismic envelopes (Lippiello et al., 2016). In this study, I propose another method that evaluates the exceedance probability of maximum amplitude (EPMA) due to aftershocks from a continuous seismogram. The proposed method is based on extreme value statistics, which provides the probability distribution of maximum amplitudes within constant time intervals. From the Gutenberg-Richter and Omori-Utsu laws and a conventional ground motion prediction equation (GMPE), I derived that this interval maximum amplitude (IMA) follows the Frechet distribution (or type II extreme-value distribution). Using a Monte-Carlo-based approach, I verified that this distribution describes IMAs well and can be used to forecast maximum amplitudes even when many seismograms overlap.</p><p>Applying the Frechet distribution to the first 3 hr of seismograms of the 2008 Iwate-Miyagi Nairiku earthquake (M<sub>W</sub> 6.9), Japan, I computed the EPMAs for 4 days at 4 stations. The maximum amplitudes of the aftershocks that actually occurred fell mostly within the 10 % to 90 % EPMA curves. This performance may be acceptable for practical use.</p><p>Unlike catalog-based methods, the proposed method is almost unaffected by overlapping seismograms, even at early lapse times. Because it is based on single-station processing, not even a seismic “network” is required, and it can easily be deployed at locations with poor seismic network coverage. So far, the method is strictly applicable only to a typical mainshock-aftershock (Omori-Utsu-like) sequence. Potentially, however, it could be extended to multiple sequences, including secondary aftershocks and remotely triggered earthquakes.</p>
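The two catalog-side ingredients named above can be combined in a standard way: the Omori-Utsu law gives the expected number of aftershocks in a time window, and the Gutenberg-Richter law gives the fraction above a magnitude threshold, so a Poisson assumption yields an exceedance probability. All parameter values in this sketch are illustrative, not from the study.

```python
# Hedged sketch: probability of at least one aftershock above magnitude M in
# a time window, from the Omori-Utsu and Gutenberg-Richter laws under a
# Poisson assumption. K, c, p, b, Mmin below are illustrative values only.
import math

def omori_utsu_count(t1, t2, K=100.0, c=0.05, p=1.1):
    """Expected number of aftershocks (M >= Mmin) between t1 and t2 days."""
    # Closed-form integral of K / (t + c)^p over [t1, t2], valid for p != 1.
    f = lambda t: (t + c) ** (1.0 - p) / (1.0 - p)
    return K * (f(t2) - f(t1))

def prob_exceed(mag, t1, t2, b=1.0, m_min=3.0):
    """Poisson probability of >= 1 aftershock with magnitude >= mag."""
    n = omori_utsu_count(t1, t2) * 10.0 ** (-b * (mag - m_min))
    return 1.0 - math.exp(-n)

# Window from 3 hr (0.125 days) to 4 days after the mainshock.
print(prob_exceed(5.0, 0.125, 4.0))
print(prob_exceed(6.0, 0.125, 4.0))
```

The proposed method sidesteps estimating these catalog parameters individually, since their combined effect is absorbed into the fitted Frechet parameters.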


Author(s):  
Vijitashwa Pandey ◽  
Deborah Thurston

Design for disassembly and reuse focuses on developing methods to minimize the difficulty of disassembly for maintenance or reuse. These methods can gain substantially if the relationship between component attributes (material mix, ease of disassembly, etc.) and their likelihood of reuse or disposal is understood. For products already in the marketplace, a feedback approach that evaluates the willingness of manufacturers or customers (decision makers) to reuse a component can reveal how the attributes of a component affect reuse decisions. This paper introduces some metrics and combines them with ones proposed in the literature into a measure that captures the overall value of a decision made by the decision makers. The premise is that the decision makers choose the decision with the maximum value. Four decisions are considered regarding a component's fate after recovery, ranging from direct reuse to disposal. A method along the lines of discrete choice theory is used, in which maximum likelihood estimation determines the parameters that define the value function. The maximum likelihood method can take as input actual decisions made by the decision makers to assess the value function. This function can then be used to determine the likelihood that a component takes a certain path (one of the four decisions) given its attributes, which can facilitate long-range planning and also help determine ways in which reuse decisions can be influenced.
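A minimal discrete-choice sketch of this idea (not the authors' exact formulation): each recovered component has an attribute vector x, the value of decision k is a linear function of x, decision makers pick a decision with softmax probability, and the value-function weights are recovered from observed decisions by maximum likelihood. All data here are synthetic.

```python
# Hedged multinomial-logit sketch: recover decision-value weights from
# observed end-of-life decisions (reuse ... disposal) by maximum likelihood.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
n, d, K = 2000, 3, 4          # components, attributes, decisions
X = rng.normal(size=(n, d))   # attributes, e.g. material mix, ease of disassembly
W_true = rng.normal(size=(K, d))
W_true[0] = 0.0               # decision 0 is the reference alternative

def choice_prob(W, X):
    U = X @ W.T                            # decision values, shape (n, K)
    U = U - U.max(axis=1, keepdims=True)   # numerical stability
    P = np.exp(U)
    return P / P.sum(axis=1, keepdims=True)

# Simulate observed decisions from the true value function.
y = np.array([rng.choice(K, p=p) for p in choice_prob(W_true, X)])

def neg_log_lik(w_flat):
    W = np.vstack([np.zeros(d), w_flat.reshape(K - 1, d)])
    P = choice_prob(W, X)
    return -np.log(P[np.arange(n), y]).sum()

res = minimize(neg_log_lik, np.zeros((K - 1) * d), method="BFGS")
W_hat = np.vstack([np.zeros(d), res.x.reshape(K - 1, d)])
print("max abs weight error:", np.abs(W_hat - W_true).max())
```

Once fitted, `choice_prob(W_hat, X_new)` gives the likelihood of each of the four paths for a new component, which is the planning quantity the abstract describes.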


Author(s):  
V.A. Simakhin ◽  
L.G. Shamanaeva ◽  
A.E. Avdyushina ◽  
...  

In the present work, a weighted maximum likelihood method (WMLM) is proposed for obtaining robust estimates when processing experimental data containing outliers. The method yields asymptotically unbiased and efficient robust estimates in the presence of not only external but also internal asymmetric and symmetric outliers. Algorithms for obtaining robust WMLM estimates are considered at the parametric level of a priori uncertainty. It is demonstrated that these estimates converge to the maximum likelihood estimates of an inhomogeneous sample for each distribution from the Tukey supermodel.
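The weighting idea can be illustrated with a generic likelihood-weighted iteration for a normal mean under asymmetric contamination; this is a simplified stand-in, not the authors' WMLM algorithm. Points that are unlikely under the current fit receive small weights, so outliers are progressively discounted.

```python
# Generic weighted-likelihood iteration (illustrative, not the paper's WMLM):
# estimate a normal mean robustly by down-weighting low-likelihood points.
import numpy as np

rng = np.random.default_rng(2)
# 450 clean N(0,1) points plus 10% asymmetric outliers centered at 8.
x = np.concatenate([rng.normal(0.0, 1.0, 450), rng.normal(8.0, 1.0, 50)])

# Initial robust location and scale from median / median absolute deviation.
mu = np.median(x)
sigma = 1.4826 * np.median(np.abs(x - mu))

for _ in range(30):
    w = np.exp(-0.5 * ((x - mu) / sigma) ** 2)   # likelihood-based weights
    mu = np.sum(w * x) / np.sum(w)               # weighted mean update

print(f"robust mean {mu:.3f} vs plain mean {x.mean():.3f}")
```

The plain sample mean is pulled toward the outliers (about 0.8 here), while the weighted estimate stays near the true center of the clean data.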


Author(s):  
Fiaz Ahmad Bhatti ◽  
G. G. Hamedani ◽  
Haitham M. Yousof ◽  
Azeem Ali ◽  
Munir Ahmad

A flexible lifetime distribution with increasing, decreasing, inverted-bathtub, and modified-bathtub hazard rates, called the Modified Burr XII-Inverse Weibull (MBXII-IW) distribution, is introduced and studied. The density function of the MBXII-IW distribution can be exponential, left-skewed, right-skewed, or symmetric in shape. Descriptive measures based on quantiles, moments, order statistics, and reliability measures are established theoretically. The MBXII-IW distribution is characterized via different techniques. Its parameters are estimated using the maximum likelihood method, and a simulation study illustrates the performance of the maximum likelihood estimates (MLEs). The potential of the MBXII-IW distribution is demonstrated by applications to real data sets: serum-reversal times and quarterly earnings.
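The MBXII-IW density is not reproduced here; as a minimal illustration of the kind of simulation study described (assessing bias and RMSE of MLEs over repeated samples), the following checks parameter recovery for the plain inverse Weibull component, which scipy provides directly.

```python
# Hedged sketch of an MLE simulation study, using the inverse Weibull
# component only (the full MBXII-IW density is not implemented here).
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
true_c = 2.5
errors = []
for _ in range(100):
    sample = stats.invweibull.rvs(true_c, size=200, random_state=rng)
    c_hat, _, _ = stats.invweibull.fit(sample, floc=0.0)
    errors.append(c_hat - true_c)

bias = np.mean(errors)
rmse = np.sqrt(np.mean(np.square(errors)))
print(f"shape MLE: bias {bias:.3f}, RMSE {rmse:.3f}")
```

In a paper-style simulation study the same loop would be repeated across several sample sizes to show bias and RMSE shrinking as n grows.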


2007 ◽  
Vol 13 (45) ◽  
pp. 256
Author(s):  
ظافر حسين رشيد ◽  
انتصار عريبي الدوري

This research examines five different distributions, namely Probit, Logistic, Arc sine, extreme value, and One hit, for estimating the dose-response model using two methods, the maximum likelihood method and the probit method, by specifying different weights for each distribution, and also derives all of the indicators of the corresponding life models.
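Two of the links named above can be compared in a short maximum-likelihood fit of a dose-response model to binomial data. The doses and response counts below are illustrative, not from the study; the probit link uses the normal CDF and the logistic link uses the logistic CDF.

```python
# Hedged dose-response sketch: fit probit and logistic links by maximum
# likelihood to synthetic binomial data and compare the implied LD50.
import numpy as np
from scipy import optimize, stats

dose = np.log10([1.0, 2.0, 4.0, 8.0, 16.0])   # log10 dose levels
n = np.array([50, 50, 50, 50, 50])            # subjects per dose
k = np.array([4, 13, 25, 38, 47])             # responders per dose

def neg_log_lik(params, link_cdf):
    a, b = params
    p = np.clip(link_cdf(a + b * dose), 1e-9, 1 - 1e-9)
    return -(k * np.log(p) + (n - k) * np.log(1 - p)).sum()

results = {}
for name, cdf in [("probit", stats.norm.cdf), ("logit", stats.logistic.cdf)]:
    res = optimize.minimize(neg_log_lik, [0.0, 1.0], args=(cdf,),
                            method="Nelder-Mead")
    a, b = res.x
    results[name] = 10 ** (-a / b)            # dose at 50% response
    print(f"{name}: LD50 = {results[name]:.2f}")
```

With well-behaved data the two links give very similar LD50 estimates; they differ mainly in the tails, which is where the choice of distribution matters.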


2021 ◽  
Author(s):  
Felix Fauer ◽  
Jana Ulrich ◽  
Oscar E. Jurado ◽  
Uwe Ulbrich ◽  
Henning W. Rust

<p>Intensity-Duration-Frequency (IDF) curves describe the main statistical characteristics of extreme precipitation events. Providing information on the exceedance probability or return period of certain precipitation intensities for a range of durations, IDF curves are an important tool for the design of hydrological structures.</p><p>Although the Generalized-Extreme-Value (GEV) distribution is an adequate model for annual precipitation maxima of a certain duration, the core problem of extreme value statistics remains: the limited data availability. Hence, it is reasonable to use a model that can describe all durations simultaneously. This reduces the total number of parameters and makes more efficient use of the data. The idea of implementing a duration dependence directly into the parameters of the extreme value distribution, and therefore obtaining a single distribution for a range of durations, was proposed by Koutsoyiannis et al. (1998). However, while the use of the GEV is justified by a strong theoretical basis, only empirical models exist for the dependence of the parameters on duration.</p><p>In this study, we compare different models for the dependence of the GEV parameters on duration, with the aim of finding a model for a wide duration range (1 min - 5 days). We use a combination of existing model features, especially curvature for small durations and multi-scaling for all durations, and extend them with a new feature that allows flattening of the IDF curves for long durations. Using the quantile score in a cross-validation setting, we provide detailed information on the duration and probability ranges for which specific features, or a systematic combination of features, lead to improved modeling skill.</p><p>Our results show that allowing curvature or multi-scaling improves the model only for very short or long durations, respectively, but leads to disadvantages in modeling the other duration ranges.
In contrast, allowing flattening of the IDF curves leads to an improvement for medium durations between 1 hour and 1 day without affecting other duration regimes.</p>
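A minimal duration-dependent GEV sketch in the spirit of Koutsoyiannis et al. (1998): the GEV location and scale decay with duration d as (d + theta)^(-eta), so one parameter set yields a consistent quantile (IDF) curve for every return period. The parameter values below are illustrative, not fitted to data, and the extra flattening feature studied in the abstract is not included.

```python
# Hedged d-GEV sketch: duration-dependent location and scale give IDF curves
# from a single parameter set. Parameter values are illustrative only.
import numpy as np
from scipy import stats

mu0, sigma0, xi = 10.0, 4.0, 0.1   # GEV parameters at the reference duration
theta, eta = 0.1, 0.7              # duration dependence (curvature, scaling)

def idf_intensity(duration_h, return_period_y):
    s = (duration_h + theta) ** (-eta)
    mu, sigma = mu0 * s, sigma0 * s
    p = 1.0 - 1.0 / return_period_y
    # scipy's genextreme uses c = -xi relative to the usual GEV convention.
    return stats.genextreme.ppf(p, -xi, loc=mu, scale=sigma)

for d in (1.0, 24.0):
    print(f"{d:5.1f} h, 100-year intensity: {idf_intensity(d, 100):.2f} mm/h")
```

The curves behave as IDF curves should: intensity decreases with duration and increases with return period.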


1998 ◽  
Vol 28 (9) ◽  
pp. 1286-1294 ◽  
Author(s):  
F Soria ◽  
F Basurco ◽  
G Toval ◽  
L Silió ◽  
M C Rodriguez ◽  
...  

A Bayesian procedure coupled with Gibbs sampling, implemented to obtain inferences about genetic parameters and breeding values for height and diameter of 7-year-old Eucalyptus globulus Labill., is described. The data set consisted of 21 708 trees from 260 open-pollinated families taken from 10 different Australian provenances, from one Spanish population, and from two clones. The trees are distributed over eight sites in the south of Spain, with 20 blocks per site. Data were corrected for heterogeneity of phenotypic variances between blocks. In the analysis, a self-pollination rate of 30% for the open-pollinated families is assumed in the relationship matrix. The posterior means (and standard deviations) of the heritabilities of height and diameter and of the genetic and phenotypic correlations were 0.217 (0.014), 0.128 (0.084), 0.768 (0.028), and 0.799 (0.003). Results from the standard restricted maximum likelihood method were 0.173, 0.113, 0.759, and 0.798, respectively. Most of the discrepancy in heritability estimates between the two methods can be attributed to the adjustment of the residual maximum likelihood estimates to the assumed self-pollination rate, which ignores the presence of clones in the trial. The effect of the method of prediction of breeding values (best linear unbiased prediction or Bayesian techniques) on the genetic superiority of the selected trees was not important. Differences in breeding value among provenances and among families were evident for both traits.
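The Gibbs-sampling machinery behind such an analysis can be shown on a toy scale. This is a hedged sketch of a balanced one-way family model with a zero overall mean; the actual analysis used a full animal model with a relationship matrix and self-pollination adjustment, and the simulated variances below are illustrative only.

```python
# Toy Gibbs sampler for variance components in a balanced one-way family
# model (illustrative; not the paper's animal model).
import numpy as np

rng = np.random.default_rng(5)
n_fam, n_per = 200, 20
sig_a2, sig_e2 = 0.5, 2.0                  # simulated family / residual variance
a = rng.normal(0.0, np.sqrt(sig_a2), n_fam)
y = a[:, None] + rng.normal(0.0, np.sqrt(sig_e2), (n_fam, n_per))

sa2, se2 = 1.0, 1.0
samples = []
for it in range(2000):
    # Family effects given variances: conjugate normal full conditional.
    prec = n_per / se2 + 1.0 / sa2
    mean = (y.sum(axis=1) / se2) / prec
    a_s = rng.normal(mean, np.sqrt(1.0 / prec))
    # Variances given effects: scaled inverse-chi-square full conditionals.
    sa2 = np.sum(a_s ** 2) / rng.chisquare(n_fam - 2)
    se2 = np.sum((y - a_s[:, None]) ** 2) / rng.chisquare(n_fam * n_per - 2)
    if it >= 500:                          # discard burn-in
        samples.append(sa2 / (sa2 + se2))

print(f"posterior mean variance ratio: {np.mean(samples):.3f}")
```

The posterior mean of the variance ratio recovers the simulated value (0.2 here); in the genetic setting the analogous ratio, scaled by the relationship structure, is the heritability reported in the abstract.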


2012 ◽  
Author(s):  
Fadhilah Y. ◽  
Zalina Md. ◽  
Nguyen V–T–V. ◽  
Suhaila S. ◽  
Zulkifli Y.

In determining the best-fit model for the hourly rainfall amounts at the twelve stations in the Wilayah Persekutuan, four distributions, namely the Exponential, Gamma, Weibull, and Mixed-Exponential, were used. Parameters for each distribution were estimated using the maximum likelihood method. The best-fit model was chosen based on the minimum error produced by the goodness-of-fit tests used in this study. The tests were further supported by the exceedance probability plot. The Mixed-Exponential was found to be the most appropriate distribution for describing the hourly rainfall amounts. From the parameter estimates for the Mixed-Exponential distribution, it can be inferred that most of the recorded hourly rainfall amount was received from heavy rainfall, even though light rainfall occurred more frequently. Key words: Hourly rainfall amount, goodness-of-fit test, exceedance probability, maximum likelihood
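The mixed (two-component) exponential distribution is usually fitted by maximum likelihood via the EM algorithm. The following sketch does this on synthetic rainfall-like data; the light-rain and heavy-rain components and their parameters are illustrative, not the study's estimates.

```python
# Hedged EM sketch: maximum-likelihood fit of a two-component mixed
# exponential to positive rainfall amounts. All values are synthetic.
import numpy as np

rng = np.random.default_rng(8)
# Simulate hourly rainfall: frequent light rain plus occasional heavy rain.
x = np.concatenate([rng.exponential(0.5, 8000), rng.exponential(6.0, 2000)])

w, m1, m2 = 0.5, np.quantile(x, 0.25), np.quantile(x, 0.9)  # initial guess
for _ in range(300):
    # E-step: responsibility of the light-rain component for each value.
    d1 = w * np.exp(-x / m1) / m1
    d2 = (1 - w) * np.exp(-x / m2) / m2
    r = d1 / (d1 + d2)
    # M-step: re-estimate the mixing weight and the two mean amounts.
    w = r.mean()
    m1 = np.sum(r * x) / np.sum(r)
    m2 = np.sum((1 - r) * x) / np.sum(1 - r)

print(f"weight {w:.2f}, light mean {m1:.2f}, heavy mean {m2:.2f}")
```

The fitted weight and means reproduce the abstract's qualitative finding: light rain dominates in frequency, but the heavy-rain component accounts for most of the total amount.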


2012 ◽  
Vol 53 ◽  
Author(s):  
Leonidas Sakalauskas ◽  
Ingrida Vaičiulytė

The present paper describes an empirical Bayesian approach applied to the estimation of several small rates. In modeling the probabilities of several rare events by the empirical Bayesian approach, it is assumed that the event frequencies follow Poisson's law with different parameters, which are correlated Gaussian random values. The unknown parameters are estimated by the maximum likelihood method, with the integrals that arise computed by Gauss–Hermite quadratures. Equations satisfied by the maximum likelihood estimates of the model parameters are derived.
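A hedged single-rate sketch of this approach: when a Poisson count has a Gaussian log-rate, the marginal likelihood is an integral over the Gaussian, which Gauss-Hermite quadrature evaluates; the hyperparameters are then found by maximum likelihood. The paper's full model treats several correlated rates jointly, which this sketch does not.

```python
# Hedged sketch: marginal maximum likelihood for Poisson counts with a
# Gaussian log-rate, using Gauss-Hermite quadrature for the integral.
import numpy as np
from scipy import optimize, special

rng = np.random.default_rng(9)
mu_true, tau_true = 0.0, 0.8
z = rng.normal(size=500)
counts = rng.poisson(np.exp(mu_true + tau_true * z))

nodes, weights = np.polynomial.hermite.hermgauss(40)

def neg_log_lik(params):
    mu, log_tau = params
    # Change of variables: integral against N(0,1) via Gauss-Hermite nodes.
    lam = np.exp(mu + np.exp(log_tau) * np.sqrt(2.0) * nodes)   # (40,)
    logpmf = (counts[:, None] * np.log(lam)[None, :] - lam[None, :]
              - special.gammaln(counts + 1)[:, None])           # (n, 40)
    lik = (np.exp(logpmf) * weights).sum(axis=1) / np.sqrt(np.pi)
    return -np.log(lik).sum()

res = optimize.minimize(neg_log_lik, [0.0, np.log(0.5)], method="Nelder-Mead")
mu_hat, tau_hat = res.x[0], np.exp(res.x[1])
print(f"mu {mu_hat:.2f} (true {mu_true:.2f}), "
      f"tau {tau_hat:.2f} (true {tau_true:.2f})")
```

Parameterizing the scale as log(tau) keeps the optimizer in the valid region; the overdispersion of the counts relative to a plain Poisson is what identifies tau.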

