Contemporary Earthquake Hazards in the West-Northwest Himalaya: A Statistical Perspective through Natural Times

2020 ◽  
Vol 91 (6) ◽  
pp. 3358-3369
Author(s):  
Sumanta Pasari ◽  
Yogendra Sharma

Abstract Himalayan earthquakes have deep societal and economic impact. In this article, we implement a surrogate method of nowcasting (Rundle et al., 2016) to determine the current state of seismic hazard from large earthquakes in 14 populous cities of India and Pakistan that belong to the west-northwest part of the Himalayan orogeny. For this, we (1) perform statistical inference of natural times, the interspersed counts of small-magnitude events between pairs of successive large events, based on a set of eight probability distributions; (2) compute the earthquake potential score (EPS) of the 14 cities from the best-fit cumulative distribution of natural times; and (3) carry out sensitivity testing of the parameters, namely the threshold magnitude and the area of the city region. The formulation of natural time (Varotsos et al., 2005), based on frequency-magnitude power-law statistics, essentially avoids the daunting task of seismicity declustering in hazard estimation. A retrospective analysis of natural-time counts corresponding to M≥6 events yields an EPS (%) for the Indian cities of New Delhi (56), Chandigarh (86), Dehradun (83), Jammu (99), Ludhiana (89), Moradabad (84), and Shimla (87), whereas the Pakistani cities show an EPS (%) of Islamabad (99), Faisalabad (88), Gujranwala (99), Lahore (89), Multan (98), Peshawar (38), and Rawalpindi (99). The estimated nowcast values, ranging from 38% to as high as 99%, lead to a rapid yet useful ranking of the cities in terms of their present progression through the regional earthquake cycle of magnitude ≥6.0 events. The analysis encourages scientists and engineers from government and industry to join hands for better policymaking toward land-use planning, insurance, and disaster preparation in the west-northwest part of the active Himalayan belt.
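The counting step behind the EPS can be sketched as follows: tally small events between successive M≥6 events (the natural times), then evaluate the distribution of those tallies at the current open count. The catalogue below is synthetic and the function names are illustrative assumptions; the paper fits eight parametric distributions, whereas this sketch uses the empirical CDF.

```python
import numpy as np

def natural_time_counts(mags, m_small=4.0, m_large=6.0):
    """Counts of small events between successive large events (natural times)."""
    counts, n = [], 0
    for m in mags:
        if m >= m_large:
            counts.append(n)  # close the current cycle
            n = 0
        elif m >= m_small:
            n += 1
    return np.array(counts), n  # past cycles, current (open) count

def earthquake_potential_score(past_counts, current_count):
    """EPS = CDF of past natural times, evaluated at the current open count."""
    return 100.0 * np.mean(past_counts <= current_count)

# synthetic magnitude catalogue, illustrative only (not real seismicity)
rng = np.random.default_rng(0)
mags = rng.uniform(4.0, 6.5, size=500)
past, current = natural_time_counts(mags)
eps = earthquake_potential_score(past, current)
print(f"EPS = {eps:.0f}%")
```

A city with a high EPS has accumulated more small events since its last large earthquake than most historical cycles contained, which is the basis of the ranking in the abstract.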

Author(s):  
RONALD R. YAGER

We look at the issue of obtaining a variance-like measure associated with probability distributions over ordinal sets; we call these dissonance measures. We specify some general properties desired of these dissonance measures and point out the centrality of the cumulative distribution function in formulating the concept of dissonance. We introduce some specific examples of measures of dissonance.
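One concrete CDF-based candidate in this spirit (an illustrative example, not necessarily Yager's own definition) sums F_j(1 - F_j) over the interior cumulative probabilities: it vanishes when all mass sits on a single category and peaks when mass is split between the extremes of the ordinal scale.

```python
import numpy as np

def dissonance(p):
    """A variance-like dissonance for a probability vector over an ordered set:
    sum of F_j * (1 - F_j) over the interior cumulative probabilities F_j.
    Zero when all mass is on one category; maximal for mass at opposite ends."""
    F = np.cumsum(p)[:-1]  # interior cumulative probabilities
    return float(np.sum(F * (1.0 - F)))

print(dissonance([1.0, 0.0, 0.0]))  # 0.0: a single category, no dissonance
print(dissonance([0.5, 0.0, 0.5]))  # 0.5: mass split between the extremes
```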


2018 ◽  
Vol 146 (12) ◽  
pp. 4079-4098 ◽  
Author(s):  
Thomas M. Hamill ◽  
Michael Scheuerer

Abstract Hamill et al. described a multimodel ensemble precipitation postprocessing algorithm that is used operationally by the U.S. National Weather Service (NWS). This article describes further changes that produce improved, reliable, and skillful probabilistic quantitative precipitation forecasts (PQPFs) for single or multimodel prediction systems. For multimodel systems, final probabilities are produced through the linear combination of PQPFs from the constituent models. The new methodology is applied to each prediction system. Prior to adjustment of the forecasts, parametric cumulative distribution functions (CDFs) of model and analyzed climatologies are generated using the previous 60 days’ forecasts and analyses and supplemental locations. The CDFs, which can be stored with minimal disk space, are then used for quantile mapping to correct state-dependent bias for each member. In this stage, the ensemble is also enlarged using a stencil of forecast values from the 5 × 5 surrounding grid points. Different weights and dressing distributions are assigned to the sorted, quantile-mapped members, with generally larger weights for outlying members and broader dressing distributions for members with heavier precipitation. Probability distributions are generated from the weighted sum of the dressing distributions. The NWS Global Ensemble Forecast System (GEFS), the Canadian Meteorological Centre (CMC) global ensemble, and the European Centre for Medium-Range Weather Forecasts (ECMWF) ensemble forecast data are postprocessed for April–June 2016. Single prediction system postprocessed forecasts are generally reliable and skillful. Multimodel PQPFs are roughly as skillful as the ECMWF system alone. Postprocessed guidance was generally more skillful than guidance using the Gamma distribution approach of Scheuerer and Hamill, with coefficients generated from data pooled across the United States.
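The quantile-mapping stage of such a scheme can be sketched with empirical climatologies (the operational algorithm uses parametric CDFs and supplemental locations; the variable names and gamma parameters below are illustrative assumptions):

```python
import numpy as np

def quantile_map(forecast, model_climo, analyzed_climo):
    """Map a forecast value to the analyzed climatology via its quantile
    in the model climatology (state-dependent bias correction)."""
    q = np.searchsorted(np.sort(model_climo), forecast) / len(model_climo)
    return np.quantile(analyzed_climo, np.clip(q, 0.0, 1.0))

rng = np.random.default_rng(1)
model_climo = rng.gamma(2.0, 3.0, 1000)    # wet-biased model climatology
analyzed_climo = rng.gamma(2.0, 2.0, 1000)  # drier analyzed climatology
fcst = 12.0
corrected = quantile_map(fcst, model_climo, analyzed_climo)
print(f"{fcst:.1f} mm -> {corrected:.1f} mm")
```

Because the model climatology here is wetter than the analysis, the mapped value is pulled downward, which is exactly the bias correction the CDF pair encodes.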


Author(s):  
Chi-Hua Chen ◽  
Fangying Song ◽  
Feng-Jang Hwang ◽  
Ling Wu

To generate a probability density function (PDF) for fitting the probability distributions of real data, this study proposes a deep learning method that consists of two stages: (1) a training stage for estimating the cumulative distribution function (CDF) and (2) a performing stage for predicting the corresponding PDF. The CDFs of common probability distributions can be adopted as activation functions in the hidden layers of the proposed deep learning model for learning actual cumulative probabilities, and the derivative of the trained deep learning model can be used to estimate the PDF. To evaluate the proposed method, numerical experiments with single and mixed distributions are performed. The experimental results show that the values of both the CDF and the PDF can be precisely estimated by the proposed method.
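A minimal sketch of the idea, shrunk to a single unit with a logistic-CDF activation and fitted with scipy rather than a deep learning framework (all names and parameters are illustrative): the model is fitted to the empirical CDF, and its analytic derivative then serves as the PDF estimate.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

def cdf_model(x, mu, s):
    """One logistic-CDF 'activation' acting as the whole model."""
    return 1.0 / (1.0 + np.exp(-(x - mu) / s))

def pdf_model(x, mu, s):
    """Analytic derivative of cdf_model: the implied PDF estimate."""
    c = cdf_model(x, mu, s)
    return c * (1.0 - c) / s

# fit the model to the empirical CDF of standard-normal samples
rng = np.random.default_rng(2)
data = np.sort(rng.normal(0.0, 1.0, 2000))
ecdf = (np.arange(1, data.size + 1) - 0.5) / data.size
(mu, s), _ = curve_fit(cdf_model, data, ecdf, p0=[0.0, 1.0])
print(pdf_model(0.0, mu, s), norm.pdf(0.0))  # fitted vs. true density at 0
```

A real instance of the method would stack many such units in hidden layers and differentiate the trained network, but the CDF-as-activation idea is the same.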


Author(s):  
Md. Mahabubur Rahman ◽  
Bander Al-Zahrani ◽  
Saman Hanif Shahbaz ◽  
Muhammad Qaiser Shahbaz

Transmutation is the functional composition of the cumulative distribution function (cdf) of one distribution with the inverse cumulative distribution function (quantile function) of another. Shaw and Buckley (2007) first applied this concept and introduced the quadratic transmuted family of distributions. In this article, we present a review of the transmuted families of distributions. We also list the transmuted distributions available in the literature, along with some concluding remarks.
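The quadratic transmuted family has the closed form G(x) = (1 + λ)F(x) − λF(x)², with |λ| ≤ 1, and pdf g(x) = f(x)[(1 + λ) − 2λF(x)]. The sketch below checks these formulas on an exponential base distribution (the base and the value λ = 0.5 are illustrative choices):

```python
import numpy as np

def transmuted_cdf(F, lam):
    """Quadratic transmuted CDF: G = (1 + lam) F - lam F^2, |lam| <= 1."""
    return (1.0 + lam) * F - lam * F**2

def transmuted_pdf(F, f, lam):
    """Corresponding PDF: g = f [(1 + lam) - 2 lam F]."""
    return f * ((1.0 + lam) - 2.0 * lam * F)

# transmuted exponential (rate = 1) as the base example
x = np.linspace(0.0, 5.0, 501)
F = 1.0 - np.exp(-x)   # base CDF
f = np.exp(-x)         # base PDF
G = transmuted_cdf(F, lam=0.5)
g = transmuted_pdf(F, f, lam=0.5)
print(G[0], G[-1])  # a valid CDF: starts at 0, approaches 1
```

Setting λ = 0 recovers the base distribution, so the family nests its parent, which is why transmutation is a popular route to more flexible models.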


Geosciences ◽  
2020 ◽  
Vol 10 (11) ◽  
pp. 471
Author(s):  
Sambit Prasanajit Naik ◽  
Ohsang Gwon ◽  
Sabina Porfido ◽  
Kiwoong Park ◽  
Kwangmin Jin ◽  
...  

The earthquake environmental effects (EEEs) around the epicentral area of the Pohang earthquake (Mw 5.4) that occurred on 15 November 2017 have been collected and classified using the Environmental Seismic Intensity scale (ESI-07) proposed by the International Union for Quaternary Research (INQUA) focus group. The shallow-focus Pohang earthquake did not produce any surface rupture but caused extensive secondary environmental effects and damage to lifeline structures, making it one of the most damaging earthquakes of the instrumental seismic era on the Korean Peninsula. The EEEs included extensive liquefaction, ground cracks, ground settlement, localized rockfall, and variation of the water table. The main objective of this paper was to carry out a comparative assessment of the Pohang earthquake's intensity based on traditional macroseismic scales and the ESI-07 scale. With that objective, this study will also make a substantial contribution to any future revision of the ESI-07 scale, which mostly comprises case studies from Europe and South America. The comparison of the ESI-07 scale with traditional intensity scales, such as the one used by the Korean Meteorological Administration, showed differences of 1–2 degrees of intensity for the epicentral areas. Moreover, the ESI scale provided a clearer picture of the intensity around the epicentral area, which is mostly agricultural land with few urban units or buildings. This study urges the integration of traditional scales and the ESI-07 scale for such small-magnitude earthquakes, in the Korean Peninsula and around the world; such integration will allow seismic intensity to be estimated more precisely and hence provide more effective seismic hazard estimation, particularly in areas of low seismic activity. The present study will also provide a useful and reliable tool for the seismic hazard assessment of similar earthquakes around the study area and for land-use planning at a local scale considering the secondary effects.


2014 ◽  
Vol 2014 ◽  
pp. 1-11
Author(s):  
Sergey V. Gurov ◽  
Lev V. Utkin

A new load-share reliability model of systems under changeable load is proposed in this paper. The load is assumed to be a piecewise smooth function, which can be regarded as an extension of both piecewise constant and continuous load functions. The condition of residual lifetime conservation, meaning continuity of the cumulative distribution function of time to failure, is adopted in the proposed model. A general algorithm for computing reliability measures is provided. Simple expressions for determining the survivor functions under the assumption of a Weibull probability distribution of time to failure are given. Various numerical examples illustrate the proposed model with different forms of the system load and different probability distributions of time to failure.
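The residual-lifetime-conservation condition can be sketched for a single load step under a Weibull model: continuity of the CDF at the switch time fixes an equivalent age on the new load curve. The shape and scale values below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

BETA = 1.5  # common Weibull shape across both load levels (assumed)

def surv(t, eta):
    """Weibull survivor function with scale eta."""
    return np.exp(-(t / eta) ** BETA)

def piecewise_survival(t, t_switch, eta1, eta2):
    """Survivor function under a load step at t_switch (scale eta1 -> eta2),
    with the CDF kept continuous via an equivalent-age shift."""
    if t <= t_switch:
        return surv(t, eta1)
    # equivalent age t_star solves surv(t_star, eta2) = surv(t_switch, eta1)
    t_star = t_switch * eta2 / eta1
    return surv(t - t_switch + t_star, eta2)

# load increases at t = 10 (scale drops from 20 to 12):
# survival decays faster after the step, but with no jump at t = 10
print(piecewise_survival(10.0, 10.0, 20.0, 12.0))
print(piecewise_survival(15.0, 10.0, 20.0, 12.0))
```

For a common shape parameter, the equivalent age reduces to the simple rescaling t* = t_switch · η₂/η₁, which is what makes the Weibull case yield the "simple expressions" the abstract mentions.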


2011 ◽  
Vol 133 (2) ◽  
Author(s):  
Kais Zaman ◽  
Mark McDonald ◽  
Sankaran Mahadevan

This paper develops and illustrates a probabilistic approach for uncertainty representation and propagation in system analysis, when the information on the uncertain input variables and/or their distribution parameters may be available as either probability distributions or simply intervals (single or multiple). A unique aggregation technique is used to combine multiple interval data and to compute rigorous bounds on the system response cumulative distribution function. The uncertainty described by interval data is represented through a flexible family of probability distributions. Conversion of interval data to a probabilistic format enables the use of computationally efficient methods for probabilistic uncertainty propagation. Two methods are explored for the implementation of the proposed approach, based on (1) sampling and (2) optimization. The sampling-based strategy is more expensive and tends to underestimate the output bounds. The optimization-based methodology improves both aspects. The proposed methods are used to develop new solutions to challenge problems posed by the Sandia epistemic uncertainty workshop (Oberkampf et al., 2004, “Challenge Problems: Uncertainty in System Response Given Uncertain Parameters,” Reliab. Eng. Syst. Saf., 85, pp. 11–19). Results for the challenge problems are compared with earlier solutions.
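The sampling-based strategy can be sketched for a toy case, a normal response whose mean is known only as an interval: scanning the interval and taking the pointwise envelope of the resulting CDFs approximates the bounds on the response CDF (a probability box). All numbers below are illustrative assumptions.

```python
import numpy as np
from scipy.stats import norm

# interval-valued mean, precisely known standard deviation
mu_lo, mu_hi, sigma = 1.0, 2.0, 0.5
x = np.linspace(-1.0, 4.0, 200)

mus = np.linspace(mu_lo, mu_hi, 50)                    # scan the interval
cdfs = np.array([norm.cdf(x, m, sigma) for m in mus])  # one CDF per sample
lower, upper = cdfs.min(axis=0), cdfs.max(axis=0)      # pointwise envelope
print(float((upper - lower).max()))  # maximum width of the CDF band
```

For a pure location parameter the envelope is attained at the interval endpoints, so the scan is exact here; for general response functions the sampled envelope can sit inside the true bounds, which is the underestimation the abstract attributes to the sampling-based strategy.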


2021 ◽  
Vol 3 (1) ◽  
pp. 16-25
Author(s):  
Siti Mariam Norrulashikin ◽  
Fadhilah Yusof ◽  
Siti Rohani Mohd Nor ◽  
Nur Arina Bazilah Kamisan

Modeling meteorological variables is a vital aspect of climate change studies. Awareness of the frequency and magnitude of climate change is a critical concern for mitigating the associated risks. Probability distribution models are valuable tools for frequency analysis of climate variables, since they quantify how well a candidate distribution fits the data series. Monthly meteorological data, including average temperature, wind speed, and rainfall, were analyzed in order to determine the most suitable probability distribution model for the Kuala Krai district. The probability distributions used in the analysis were the Beta, Burr, Gamma, Lognormal, and Weibull distributions. To estimate the parameters of each distribution, maximum likelihood estimation (MLE) was employed. Goodness-of-fit tests, namely the Kolmogorov-Smirnov and Anderson-Darling tests, were conducted to identify the best-suited model and to assess its reliability. The statistical results indicate that the Burr distribution best characterizes the meteorological data in our study. Graphs of the probability density function and cumulative distribution function, as well as Q-Q plots, are presented.
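The fit-then-test workflow can be sketched with scipy on synthetic data (the "rainfall" series, candidate set, and parameters below are illustrative assumptions; the Burr distribution from the paper would enter as scipy.stats.burr12):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
data = rng.gamma(2.0, 3.0, 500)  # synthetic monthly rainfall series

candidates = {"gamma": stats.gamma,
              "lognorm": stats.lognorm,
              "weibull": stats.weibull_min}
results = {}
for name, dist in candidates.items():
    params = dist.fit(data)                        # MLE parameter estimates
    ks = stats.kstest(data, dist.cdf, args=params) # Kolmogorov-Smirnov test
    results[name] = ks.statistic                   # smaller = better fit

best = min(results, key=results.get)
print(best, results)
```

Ranking by the KS statistic (or the Anderson-Darling statistic, via scipy.stats.anderson for supported families) is how a single "best-suited" distribution is selected from the candidate set.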


2014 ◽  
Vol 39 ◽  
pp. 69-73 ◽  
Author(s):  
M. Jiménez ◽  
S. Castanedo ◽  
Z. Zhou ◽  
G. Coco ◽  
R. Medina ◽  
...  

Abstract. Long-term simulations (3000 yr) of an idealized basin using different tidal ranges (1, 2 and 3 m) and grain sizes (120, 480 and 960 μm) have been performed in order to cover a range of hydrodynamic and sedimentary conditions. Two different cell sizes (50 and 100 m) have been used to study the impact of cell size on tidal network development. The probability distributions of the drainage area and the drainage volume have been computed for every simulation (during an ebb and a flood phase). Power-law distributions are observed for both drainage area and drainage volume. As objective estimation of the exponent of a power law remains an open issue, the different approaches proposed by White et al. (2008) (linear binning, normalized logarithmic binning, the cumulative distribution function, and maximum likelihood) have been used to carry out a sensitivity analysis. Our findings indicate that although all methods result in high and significant correlation coefficients, more work is needed to develop a universal, objective estimation of the exponent.
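Of the approaches compared, the maximum-likelihood estimator has a closed form for continuous data, α̂ = 1 + n / Σ ln(x_i / x_min). The sketch below checks it on synthetic power-law samples drawn by inverse-transform sampling (all values are illustrative):

```python
import numpy as np

def powerlaw_mle(x, x_min):
    """Continuous maximum-likelihood estimate of a power-law exponent:
    alpha = 1 + n / sum(ln(x_i / x_min)) over the x_i >= x_min."""
    x = np.asarray(x, dtype=float)
    x = x[x >= x_min]
    return 1.0 + x.size / np.sum(np.log(x / x_min))

# draw from a known power law (alpha = 2.5) via inverse-transform sampling
rng = np.random.default_rng(4)
alpha_true, x_min = 2.5, 1.0
u = rng.random(20000)
x = x_min * (1.0 - u) ** (-1.0 / (alpha_true - 1.0))
alpha_hat = powerlaw_mle(x, x_min)
print(alpha_hat)  # close to 2.5
```

Unlike binning-based fits, this estimator needs no bin-width choice, which is one reason likelihood methods are usually preferred when an objective exponent estimate is the goal.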

