Discrete Pareto Distributions

2014 ◽  
Vol 29 (2) ◽  
Author(s):  
Amrutha Buddana ◽  
Tomasz J. Kozubowski

Abstract We review several common discretization schemes and study a particular class of power-tail probability distributions on integers, obtained by discretizing the continuous Pareto II (Lomax) distribution through one of them. Our results include expressions for the density and cumulative distribution functions, probability generating function, moments and related parameters, stability and divisibility properties, stochastic representations, and limiting distributions of random sums with a discrete-Pareto number of terms. We also briefly discuss issues of simulation and estimation and extensions to the multivariate setting.
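A minimal sketch (not taken from the paper) of one common discretization scheme: assign to each nonnegative integer k the Lomax probability mass on [k, k+1), so that P(X = k) = S(k) − S(k+1), with S the Lomax survival function. The parameter names below are illustrative.

```python
import numpy as np

def lomax_sf(x, alpha, sigma):
    """Survival function of the continuous Pareto II (Lomax) distribution."""
    return (1.0 + x / sigma) ** (-alpha)

def discrete_pareto_pmf(k, alpha, sigma=1.0):
    """P(X = k): the Lomax mass on [k, k+1), one common discretization scheme."""
    k = np.asarray(k, dtype=float)
    return lomax_sf(k, alpha, sigma) - lomax_sf(k + 1.0, alpha, sigma)

def discrete_pareto_cdf(k, alpha, sigma=1.0):
    """P(X <= k) = 1 - S(k + 1) under the same scheme."""
    return 1.0 - lomax_sf(np.asarray(k, dtype=float) + 1.0, alpha, sigma)

# sanity check: the pmf sums to (essentially) 1 over a long support
ks = np.arange(0, 2_000_000)
print(discrete_pareto_pmf(ks, alpha=2.5).sum())   # ~= 1
```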

2018 ◽  
Vol 146 (12) ◽  
pp. 4079-4098 ◽  
Author(s):  
Thomas M. Hamill ◽  
Michael Scheuerer

Abstract Hamill et al. described a multimodel ensemble precipitation postprocessing algorithm that is used operationally by the U.S. National Weather Service (NWS). This article describes further changes that produce improved, reliable, and skillful probabilistic quantitative precipitation forecasts (PQPFs) for single or multimodel prediction systems. For multimodel systems, final probabilities are produced through the linear combination of PQPFs from the constituent models. The new methodology is applied to each prediction system. Prior to adjustment of the forecasts, parametric cumulative distribution functions (CDFs) of model and analyzed climatologies are generated using the previous 60 days’ forecasts and analyses and supplemental locations. The CDFs, which can be stored with minimal disk space, are then used for quantile mapping to correct state-dependent bias for each member. In this stage, the ensemble is also enlarged using a stencil of forecast values from the 5 × 5 surrounding grid points. Different weights and dressing distributions are assigned to the sorted, quantile-mapped members, with generally larger weights for outlying members and broader dressing distributions for members with heavier precipitation. Probability distributions are generated from the weighted sum of the dressing distributions. The NWS Global Ensemble Forecast System (GEFS), the Canadian Meteorological Centre (CMC) global ensemble, and the European Centre for Medium-Range Weather Forecasts (ECMWF) ensemble forecast data are postprocessed for April–June 2016. Single prediction system postprocessed forecasts are generally reliable and skillful. Multimodel PQPFs are roughly as skillful as the ECMWF system alone. Postprocessed guidance was generally more skillful than guidance using the Gamma distribution approach of Scheuerer and Hamill, with coefficients generated from data pooled across the United States.
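A minimal sketch of the quantile-mapping step using simple empirical climatological CDFs; the function and variable names are illustrative and this is not the operational NWS implementation.

```python
import numpy as np

def quantile_map(fcst_value, fcst_climo, anal_climo):
    """Map a forecast value through the forecast-climatology CDF and then through
    the inverse of the analyzed-climatology CDF (empirical version)."""
    fcst_climo = np.sort(np.asarray(fcst_climo))
    anal_climo = np.sort(np.asarray(anal_climo))
    # non-exceedance probability of the forecast value in its own climatology
    p = np.searchsorted(fcst_climo, fcst_value, side="right") / fcst_climo.size
    # corresponding quantile of the analyzed climatology
    return np.quantile(anal_climo, np.clip(p, 0.0, 1.0))

# toy example: a wet-biased forecast climatology built from past forecast/analysis pairs
rng = np.random.default_rng(0)
fcst_climo = rng.gamma(shape=0.5, scale=8.0, size=3600)   # forecasts at a point plus supplemental locations
anal_climo = rng.gamma(shape=0.5, scale=5.0, size=3600)   # analyses for the same samples
print(quantile_map(12.0, fcst_climo, anal_climo))          # quantile-mapped (bias-reduced) value
```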


2021 ◽  
Vol 71 (2) ◽  
pp. 475-490
Author(s):  
Shokofeh Zinodiny ◽  
Saralees Nadarajah

Abstract Matrix variate generalizations of Pareto distributions are proposed. Several properties of these distributions, including cumulative distribution functions, characteristic functions, and relationships to the matrix variate beta type I and matrix variate beta type II distributions, are studied.


Symmetry ◽  
2020 ◽  
Vol 12 (12) ◽  
pp. 2108
Author(s):  
Weaam Alhadlaq ◽  
Abdulhamid Alzaid

Archimedean copulas form a very wide subclass of symmetric copulas, and most of the popular copulas are members of this class. These copulas are obtained using real functions known as Archimedean generators. In this paper, we observe that, under certain conditions, cumulative distribution functions on (0, 1) and probability generating functions can be used as Archimedean generators. It is shown that most of the well-known Archimedean copulas can be generated using such distributions. Further, we introduce new Archimedean copulas.
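For context, the generic bivariate Archimedean construction is C(u, v) = φ⁻¹(φ(u) + φ(v)) for a suitable generator φ. The sketch below illustrates the mechanics with the classical Clayton generator rather than the CDF- or PGF-based generators proposed in the paper.

```python
def clayton_generator(t, theta):
    """Classical Clayton generator phi(t) = (t**-theta - 1) / theta."""
    return (t ** (-theta) - 1.0) / theta

def clayton_generator_inv(s, theta):
    """Inverse generator phi^{-1}(s) = (1 + theta * s)**(-1/theta)."""
    return (1.0 + theta * s) ** (-1.0 / theta)

def archimedean_copula(u, v, generator, generator_inv, **kw):
    """Generic Archimedean copula C(u, v) = phi^{-1}(phi(u) + phi(v))."""
    return generator_inv(generator(u, **kw) + generator(v, **kw), **kw)

# Clayton copula at (0.3, 0.7); equals (0.3**-2 + 0.7**-2 - 1)**(-1/2)
print(archimedean_copula(0.3, 0.7, clayton_generator, clayton_generator_inv, theta=2.0))
```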


Author(s):  
Q. Liu ◽  
L. S Chiu ◽  
X. Hao

The abundance or lack of rainfall affects people's lives and activities. As a major component of the global hydrological cycle (Chokngamwong & Chiu, 2007), accurate representation of rainfall at various spatial and temporal scales is crucial for many decision-making processes. Climate models show a warmer and wetter climate due to increases in greenhouse gases (GHGs). However, the models' resolutions are often too coarse to be directly applicable to the local scales that are useful for mitigation purposes. Hence, disaggregation (downscaling) procedures are needed to transfer the coarse-scale products to higher spatial and temporal resolutions. The aim of this paper is to examine the changes in the statistical parameters of rainfall at various spatial and temporal resolutions. The TRMM Multi-satellite Precipitation Analysis (TMPA) 0.25-degree, 3-hourly gridded rainfall data for a summer are aggregated to 0.5, 1.0, 2.0, and 2.5 degrees and to 6-, 12-, and 24-hourly, pentad (five-day), and monthly resolutions. The probability density functions (PDFs) and cumulative distribution functions (CDFs) of rain amount at these resolutions are computed and modeled as a mixed distribution. Parameters of the PDFs are compared using the Kolmogorov–Smirnov (KS) test, both for the mixed and the marginal distributions. These distributions are shown to be distinct. The marginal distributions are fitted with Lognormal and Gamma distributions, and it is found that the Gamma distribution fits much better than the Lognormal.
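A minimal sketch of the mixed-distribution fitting and comparison step, assuming SciPy and synthetic rain amounts in place of the TMPA data; names and parameter values are illustrative.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# toy grid-box rain amounts: a point mass at zero mixed with a skewed positive part
rain = np.where(rng.random(5000) < 0.6, 0.0, rng.gamma(shape=0.7, scale=6.0, size=5000))

p_dry = np.mean(rain == 0.0)          # mixed distribution: probability of no rain
wet = rain[rain > 0.0]                # marginal (conditional) positive amounts

# fit candidate marginal distributions to the wet amounts
g_shape, g_loc, g_scale = stats.gamma.fit(wet, floc=0.0)
l_shape, l_loc, l_scale = stats.lognorm.fit(wet, floc=0.0)

# one-sample Kolmogorov-Smirnov statistics: smaller means a closer fit
ks_gamma = stats.kstest(wet, "gamma", args=(g_shape, g_loc, g_scale)).statistic
ks_lognorm = stats.kstest(wet, "lognorm", args=(l_shape, l_loc, l_scale)).statistic
print(p_dry, ks_gamma, ks_lognorm)
```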


2018 ◽  
Vol 70 (2) ◽  
pp. 136-154
Author(s):  
Suchandan Kayal ◽  
S. M. Sunoj ◽  
B. Vineshkumar

Several statistical models, such as the Govindarajulu, various forms of the lambda, and the power-Pareto distributions, have explicit quantile functions but do not have manageable cumulative distribution functions. Thus, to study reliability measures for such distributions, a quantile-based tool is essential. In this article, we consider quantile versions of some well-known reliability measures in the reversed time scale. We study stochastic orders based on the reversed hazard quantile function and the mean inactivity quantile time function. Further, we discuss the relative reversed hazard quantile function order, the likelihood quantile ratio order, and the elasticity quantile order. Connections between the newly proposed orders and the existing stochastic orders are established. AMS 2010 Subject Classification: 60E15, 62E10
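As a brief illustration of the quantile-based approach, the reversed hazard quantile function can be written as Λ(u) = 1/(u q(u)), where q(u) = Q′(u) is the quantile density function. The sketch below evaluates it symbolically for a power-Pareto-type quantile function; the parameterization Q(u) = C u^λ1 (1 − u)^(−λ2) is an assumption used only for illustration.

```python
import sympy as sp

# symbols for the quantile variable and the (assumed) power-Pareto parameters
u, C, l1, l2 = sp.symbols("u C lambda1 lambda2", positive=True)

Q = C * u**l1 * (1 - u)**(-l2)          # quantile function with no closed-form CDF
q = sp.diff(Q, u)                       # quantile density function q(u) = Q'(u)

# reversed hazard quantile function: Lambda(u) = 1 / (u * q(u))
reversed_hazard_q = sp.simplify(1 / (u * q))
print(reversed_hazard_q)
```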


Author(s):  
Michael T. Tong

A probabilistic approach is described for aeropropulsion system assessment. To demonstrate this approach, the technical performance of a wave rotor-enhanced gas turbine engine (i.e. engine net thrust, specific fuel consumption, and engine weight) is assessed. The assessment accounts for the uncertainties in component efficiencies/flows and mechanical design variables, using probability distributions. The results are presented in the form of cumulative distribution functions (CDFs) and sensitivity analyses, and are compared with those from the traditional deterministic approach. The comparison shows that the probabilistic approach provides a more realistic and systematic way to assess an aeropropulsion system.
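A toy Monte Carlo sketch of the general approach; the input distributions and the response function below are invented for illustration and are not the wave-rotor engine model assessed in the paper.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(42)
n = 20_000

# illustrative uncertain inputs: component efficiencies and a design variable
eta_comp = rng.normal(0.88, 0.01, n)               # compressor efficiency
eta_turb = rng.normal(0.90, 0.01, n)               # turbine efficiency
weight = rng.triangular(950.0, 1000.0, 1080.0, n)  # engine weight, kg

# toy response: specific fuel consumption improves with better efficiencies
sfc = 0.60 * (0.88 / eta_comp) * (0.90 / eta_turb) + 1e-5 * (weight - 1000.0)

# empirical CDF of the response
x = np.sort(sfc)
cdf = np.arange(1, n + 1) / n
print("P(sfc <= 0.60) ~", np.interp(0.60, x, cdf))

# crude sensitivity ranking: rank correlation of each input with the response
for name, v in [("eta_comp", eta_comp), ("eta_turb", eta_turb), ("weight", weight)]:
    print(name, round(spearmanr(v, sfc).correlation, 3))
```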


2018 ◽  
Author(s):  
David M. Hyman ◽  
Andrea Bevilacqua ◽  
Marcus I. Bursik

Abstract. The study of volcanic mass flow hazards in a probabilistic framework centers around systematic experimental numerical modelling of the hazardous phenomenon and the subsequent generation and interpretation of a probabilistic hazard map (PHM). For a given volcanic flow (e.g., lava flow, lahar, pyroclastic flow, etc.), the PHM is typically interpreted as the point-wise probability of flow material inundation. In the current work, we present new methods for calculating spatial representations of the mean, standard deviation, median, and modal locations of the hazard's boundary as ensembles of many deterministic runs of a physical model. By formalizing its generation and properties, we show that a PHM may be used to construct these statistical measures of the hazard boundary which have been unrecognized in previous probabilistic hazard analyses. Our formalism shows that a typical PHM for a volcanic mass flow not only gives the point-wise inundation probability, but also represents a set of cumulative distribution functions for the location of the inundation boundary with a corresponding set of probability density functions. These distributions run over curves of steepest ascent on the PHM. Consequently, 2D space curves can be constructed on the map which represent the mean, median and modal locations of the likely inundation boundary. These curves give well-defined answers to the question of the likely boundary location of the area impacted by the hazard. Additionally, methods of calculation for higher moments including the standard deviation are presented which take the form of map regions surrounding the mean boundary location. These measures of central tendency and variance add significant value to spatial probabilistic hazard analyses, giving a new statistical description of the probability distributions underlying PHMs. The theory presented here may be used to construct improved hazard maps, which could prove useful for planning and emergency management purposes. This formalism also allows for application to simplified processes describable by analytic solutions. In that context, the connection between the PHM, its moments, and the underlying parameter variation is explicit, allowing for better source parameter estimation from natural data, yielding insights about natural controls on those parameters.


2019 ◽  
Vol 19 (7) ◽  
pp. 1347-1363 ◽  
Author(s):  
David M. Hyman ◽  
Andrea Bevilacqua ◽  
Marcus I. Bursik

Abstract. The study of volcanic flow hazards in a probabilistic framework centers around systematic experimental numerical modeling of the hazardous phenomenon and the subsequent generation and interpretation of a probabilistic hazard map (PHM). For a given volcanic flow (e.g., lava flow, lahar, pyroclastic flow, ash cloud), the PHM is typically interpreted as the point-wise probability of inundation by flow material. In the current work, we present new methods for calculating spatial representations of the mean, standard deviation, median, and modal locations of the hazard's boundary as ensembles of many deterministic runs of a physical model. By formalizing its generation and properties, we show that a PHM may be used to construct these statistical measures of the hazard boundary which have been unrecognized in previous probabilistic hazard analyses. Our formalism shows that a typical PHM for a volcanic flow not only gives the point-wise inundation probability, but also represents a set of cumulative distribution functions for the location of the inundation boundary with a corresponding set of probability density functions. These distributions run over curves of steepest probability gradient ascent on the PHM. Consequently, 2-D space curves can be constructed on the map which represent the mean, median, and modal locations of the likely inundation boundary. These curves give well-defined answers to the question of the likely boundary location of the area impacted by the hazard. Additionally, methods of calculation for higher moments including the standard deviation are presented, which take the form of map regions surrounding the mean boundary location. These measures of central tendency and variance add significant value to spatial probabilistic hazard analyses, giving a new statistical description of the probability distributions underlying PHMs. The theory presented here may be used to aid construction of improved hazard maps, which could prove useful for planning and emergency management purposes. This formalism also allows for application to simplified processes describable by analytic solutions. In that context, the connection between the PHM, its moments, and the underlying parameter variation is explicit, allowing for better source parameter estimation from natural data, yielding insights about natural controls on those parameters.
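A minimal sketch of constructing a PHM from an ensemble of inundation footprints and extracting its 0.5 level curve, which under the formalism described above traces the median boundary location; the synthetic circular footprints and the use of scikit-image are assumptions for illustration.

```python
import numpy as np
from skimage import measure   # assumed available for contour extraction

rng = np.random.default_rng(7)
ny, nx, nruns = 200, 200, 500
yy, xx = np.mgrid[0:ny, 0:nx]

# stand-in for an ensemble of deterministic flow-model runs: circular footprints
radii = rng.normal(60.0, 10.0, nruns)
footprints = np.stack([(xx - 100) ** 2 + (yy - 100) ** 2 <= r ** 2 for r in radii])

# probabilistic hazard map: point-wise inundation probability
phm = footprints.mean(axis=0)

# along a steepest-ascent curve the PHM value is one minus the CDF of the boundary
# location, so the 0.5 contour traces the median boundary of the inundated area
median_boundary = measure.find_contours(phm, 0.5)
print(len(median_boundary), "curve(s); first has", len(median_boundary[0]), "vertices")
```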


2015 ◽  
Vol 28 (8) ◽  
pp. 3289-3310 ◽  
Author(s):  
Liang Ning ◽  
Emily E. Riddle ◽  
Raymond S. Bradley

Projections of historical and future changes in climate extremes are examined by applying the bias-correction spatial disaggregation (BCSD) statistical downscaling method to five general circulation models (GCMs) from phase 5 of the Coupled Model Intercomparison Project (CMIP5). For this analysis, 11 extreme temperature and precipitation indices that are relevant across multiple disciplines (e.g., agriculture and conservation) are chosen. Over the historical period, the simulated means, variances, and cumulative distribution functions (CDFs) of each of the 11 indices are first compared with observations, and the performance of the downscaling method is quantitatively evaluated. For the future period, the ensemble average of the five GCM simulations points to more warm extremes, fewer cold extremes, and more precipitation extremes with greater intensities under all three scenarios. The changes are larger under higher emissions scenarios. The inter-GCM uncertainties and changes in probability distributions are also assessed. Changes in the probability distributions indicate an increase in both the number and interannual variability of future climate extreme events. The potential deficiencies of the method in projecting future extremes are also discussed.
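A toy sketch of the index-based evaluation step: compute one illustrative extreme-temperature index from daily series and compare its observed and downscaled distributions. The index definition, threshold, and synthetic data are assumptions, not one of the 11 indices used in the paper.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

def warm_days_index(daily_tmax, threshold):
    """Annual count of days with Tmax above a fixed threshold (toy 365-day years)."""
    return (daily_tmax.reshape(-1, 365) > threshold).sum(axis=1)

# synthetic 30-year daily Tmax series: observations vs. a slightly warm-biased model
obs = rng.normal(25.0, 6.0, 30 * 365)
gcm = rng.normal(25.6, 6.3, 30 * 365)

idx_obs = warm_days_index(obs, 32.0)
idx_gcm = warm_days_index(gcm, 32.0)

# compare means, variances, and empirical CDFs of the index
print(idx_obs.mean(), idx_gcm.mean(), idx_obs.var(), idx_gcm.var())
print(stats.ks_2samp(idx_obs, idx_gcm))
```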


2020 ◽  
Vol 501 (1) ◽  
pp. 994-1001
Author(s):  
Suman Sarkar ◽  
Biswajit Pandey ◽  
Snehasish Bhattacharjee

ABSTRACT We use an information theoretic framework to analyse data from the Galaxy Zoo 2 project and study if there are any statistically significant correlations between the presence of bars in spiral galaxies and their environment. We measure the mutual information between the barredness of galaxies and their environments in a volume-limited sample (Mr ≤ −21) and compare it with the same in data sets where (i) the bar/unbar classifications are randomized and (ii) the spatial distribution of galaxies is shuffled on different length scales. We assess the statistical significance of the differences in the mutual information using a t-test and find that both randomization of morphological classifications and shuffling of spatial distribution do not alter the mutual information in a statistically significant way. The non-zero mutual information between the barredness and environment arises due to the finite and discrete nature of the data set and can be entirely explained by mock Poisson distributions. We also separately compare the cumulative distribution functions of the barred and unbarred galaxies as a function of their local density. Using a Kolmogorov–Smirnov test, we find that the null hypothesis cannot be rejected even at the 75 per cent confidence level. Our analysis indicates that environments do not play a significant role in the formation of a bar, which is largely determined by the internal processes of the host galaxy.
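A small sketch of the two ingredients of the analysis: a plug-in mutual-information estimate between a binary morphology label and a binned environment measure, and a two-sample Kolmogorov–Smirnov comparison, using synthetic data in place of the Galaxy Zoo 2 sample.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
n = 20_000

# toy data: bar/unbar labels and a local-density measure, independent by construction
barred = rng.random(n) < 0.3
log_density = rng.normal(0.0, 1.0, n)

def mutual_information(labels, values, nbins=10):
    """Plug-in estimate of I(labels; binned values) in bits."""
    edges = np.quantile(values, np.linspace(0.0, 1.0, nbins + 1))
    cells = np.digitize(values, edges[1:-1])
    joint = np.histogram2d(labels.astype(int), cells, bins=[2, nbins])[0]
    joint /= joint.sum()
    px = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    nz = joint > 0
    return float(np.sum(joint[nz] * np.log2(joint[nz] / (px @ py)[nz])))

# small but non-zero purely from the finite, discrete nature of the sample
print(mutual_information(barred, log_density))

# two-sample KS test on the density distributions of barred vs. unbarred galaxies
print(stats.ks_2samp(log_density[barred], log_density[~barred]))
```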

