Shape Factor Asymptotic Analysis I

2020 ◽  
Vol 11 (2) ◽  
pp. 108
Author(s):  
Frank Xuyan WANG

We proposed using the shape factor to distinguish probability distributions, and using relative minimum or maximum values of the shape factor to locate distribution parameter allowable ranges for distribution fitting, in our previous study. In this paper, shape factor asymptotic analysis is employed to study such conditional minima or maxima, to cross-validate results found from the numerical study and the empirical formula we obtained and published earlier. The shape factor, defined as kurtosis divided by skewness squared, K/S², is characterized as the unique maximal choice of the exponent p among all factors of the form K/|S|^p that are greater than or equal to 1 for all probability distributions. For all distributions from a specific distribution family, there may exist p > 2 such that K/|S|^p ≥ 1. The least upper bound of all such p is defined as the distribution family's characteristic number. The useful extreme values of the shape factor that were previously found numerically for various distributions, the Beta, Kumaraswamy, Weibull, and GB2 distributions, are derived using asymptotic analysis. The match between the numerical and the analytical results may arguably be considered proof of each other. The characteristic numbers of these distributions are also calculated. The study of the extreme values of the shape factor, or shape factor asymptotic analysis, helps reveal properties of the original shape factor and relationships between distributions, such as that between the Kumaraswamy distribution and the Weibull distribution.
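As a quick numerical illustration of the K/S² ≥ 1 property for the Beta family, the shape factor can be sketched with SciPy (a hypothetical helper, not the paper's code; note that SciPy reports *excess* kurtosis, so 3 is added back to obtain the Pearson kurtosis K):

```python
from scipy import stats

def shape_factor(dist):
    # Shape factor K/S^2: Pearson kurtosis over squared skewness.
    # scipy's stats() returns excess kurtosis, so add 3 to recover K.
    s, excess_k = dist.stats(moments='sk')
    return (float(excess_k) + 3.0) / float(s) ** 2

sf = shape_factor(stats.beta(2.0, 5.0))  # Beta(2, 5): equals 8.1 analytically
```

The same helper applied over a grid of (a, b) parameters reproduces the kind of numerical extreme-value search the paper cross-validates analytically.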

2013 ◽  
Vol 397-400 ◽  
pp. 12-15 ◽  
Author(s):  
Xiang Fei Ma ◽  
Di Liang ◽  
Yu Cheng Pan ◽  
Hua Dong Wang

Based on the intermittent production line balancing problem, the Minitab statistical tool is used in this paper for fitting random distributions to data, and the car body production line of one company is modeled and simulated in the Flexsim environment, which allows line imbalance problems to be found quickly and accurately. By designing an optimal program based on project management network analysis and carrying out simulation and evaluation, productivity is improved effectively and the time needed to balance and optimize the production line is shortened greatly.
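For context, line-imbalance diagnosis commonly rests on the standard balance-rate formula (total station work content over stations times bottleneck cycle time); a minimal sketch with hypothetical station times:

```python
def balance_rate(station_times):
    # Balance efficiency = sum(t_i) / (n * max(t_i));
    # 1.0 means a perfectly balanced line, lower values mean idle capacity.
    n = len(station_times)
    bottleneck = max(station_times)
    return sum(station_times) / (n * bottleneck)

rate = balance_rate([48.0, 52.0, 60.0, 45.0])  # hypothetical cycle times in seconds
```

A simulation such as the Flexsim model described above essentially measures these station times under stochastic processing and arrival distributions rather than assuming them fixed.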


Author(s):  
Valentin Raileanu

The article briefly describes the history and fields of application of extreme value theory, including climatology. The data format, the Generalized Extreme Value (GEV) probability distributions with Block Maxima, the Generalized Pareto (GP) distributions with Peaks Over Threshold (POT), and the analysis methods are presented. The distribution parameters are estimated using the Maximum Likelihood Estimation (MLE) method. Installation of the free R software, the minimum set of required commands, and the in2extRemes graphical GUI package are described. As an example, the results of a GEV analysis of a simulated data set in in2extRemes are presented.
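A comparable block-maxima exercise can be sketched outside in2extRemes with SciPy's `genextreme` on simulated data (all parameter values below are hypothetical; note that SciPy's shape parameter `c` carries the opposite sign to the climatological ξ):

```python
import numpy as np
from scipy import stats

# simulate annual maxima from a known GEV, then refit by maximum likelihood
true_c, true_loc, true_scale = -0.1, 30.0, 5.0
maxima = stats.genextreme.rvs(true_c, loc=true_loc, scale=true_scale,
                              size=500, random_state=np.random.default_rng(0))
c_hat, loc_hat, scale_hat = stats.genextreme.fit(maxima)  # MLE estimates
```

Comparing the fitted triple against the known truth mirrors the simulated-data example the article walks through in in2extRemes.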


2014 ◽  
Vol 52 (1) ◽  
pp. 25-40
Author(s):  
Vitor Augusto Ozaki ◽  
Ricardo Olinda ◽  
Priscila Neves Faria ◽  
Rogério Costa Campos

In any agricultural insurance program, accurate quantification of the probability of loss is of great importance. In order to estimate this quantity, it is necessary to assume some parametric probability distribution. The objective of this work is to estimate the probability of loss using extreme value theory, modeling the left tail of the distribution. The estimated values are then compared to the values estimated under the normality assumption. Finally, we discuss the implications of assuming a symmetrical distribution instead of a more flexible family of distributions when estimating the probability of loss and pricing insurance contracts. Results show that, for the selected regions, the probability distributions present a noticeable degree of skewness. As a consequence, the probabilities of loss are quite different from those estimated under the Normal distribution, commonly used by Brazilian insurers.
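The gap between a tail-based and a normal-based loss probability can be illustrated on simulated skewed yields (all distributions and numbers below are hypothetical stand-ins, not the paper's data):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# left-skewed crop yields, as the paper's regional data suggest
yields = stats.skewnorm.rvs(a=-5.0, loc=3.0, scale=0.6, size=2000, random_state=rng)

guarantee = np.quantile(yields, 0.10)            # a 10th-percentile guarantee level
p_empirical = float(np.mean(yields < guarantee)) # tail probability from the data itself
# the same event priced under a fitted Normal
p_normal = float(stats.norm.cdf(guarantee, loc=yields.mean(), scale=yields.std(ddof=1)))
```

The mismatch between `p_empirical` and `p_normal` is exactly the mispricing channel the abstract warns about when a symmetric distribution is imposed on skewed yields.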


2016 ◽  
Vol 79 (7) ◽  
pp. 1221-1233 ◽  
Author(s):  
Jurgen Chardon ◽  
Arno Swart

In the consumer phase of a typical quantitative microbiological risk assessment (QMRA), mathematical equations identify data gaps. To acquire useful data, we designed a food consumption and food handling survey (2,226 respondents) for QMRA applications that is especially aimed at obtaining quantitative data. For a broad spectrum of food products, the survey covered the following topics: processing status at retail, consumer storage, preparation, and consumption. Questions were designed to facilitate distribution fitting. In the statistical analysis, special attention was given to selecting the most adequate distribution to describe the data. Bootstrap procedures were used to describe uncertainty. The final result is a coherent quantitative consumer-phase food survey and parameter estimates for food handling and consumption practices in The Netherlands, including variation over individuals and uncertainty estimates.
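The bootstrap step mentioned above can be sketched as follows, with hypothetical storage-time data standing in for the survey responses:

```python
import numpy as np

rng = np.random.default_rng(2)
storage_days = rng.gamma(shape=2.0, scale=1.5, size=300)  # hypothetical consumer storage times

# nonparametric bootstrap: resample with replacement, recompute the summary,
# and read the uncertainty off the percentiles of the bootstrap replicates
boot_means = np.array([rng.choice(storage_days, storage_days.size, replace=True).mean()
                       for _ in range(2000)])
ci_low, ci_high = np.percentile(boot_means, [2.5, 97.5])
```

The same resampling loop applies unchanged to fitted distribution parameters instead of the mean, which is how uncertainty in a fitted storage-time distribution would be described.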


Extreme hydrological events constantly disturb the earth's activities and life. To anticipate such extreme events we need a system that raises an alarm well in time and recognizes the expected danger; to prepare such systems one must know the significant factors that are actively responsible for such extreme situations and have a reliable statistical technique for building a useful model. In this paper we investigate historical peak flood data from several gauging stations on the river Jhelum in Kashmir, India. A reliable estimation technique (L-moments) is applied for parameter estimation of the probability distributions, and reliable testing techniques are used to check the goodness of fit of the distributions; in addition, the L-moment ratio diagram (LMRD) is used to impart information about the fit of the distributions. The Log-Pearson Type III distribution shows better results and satisfies the distribution fitting tests; the same probability distribution is globally accepted for flood forecasting.
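The sample L-moments underlying the L-moment method can be sketched via probability-weighted moments (a generic textbook estimator in Hosking's form, not the authors' code):

```python
import numpy as np

def sample_l_moments(x):
    # Unbiased sample L-moments l1, l2 and L-skewness tau3,
    # computed from probability-weighted moments b0, b1, b2.
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    j = np.arange(n)
    b0 = x.mean()
    b1 = np.sum(j / (n - 1.0) * x) / n
    b2 = np.sum(j * (j - 1.0) / ((n - 1.0) * (n - 2.0)) * x) / n
    l1 = b0                   # L-location (mean)
    l2 = 2.0 * b1 - b0        # L-scale
    tau3 = (6.0 * b2 - 6.0 * b1 + b0) / l2  # L-skewness ratio
    return l1, l2, tau3

l1, l2, tau3 = sample_l_moments(np.random.default_rng(3).normal(0.0, 1.0, 5000))
```

Plotting (tau3, tau4) pairs from gauging-station records against the theoretical curves of candidate distributions is what the LMRD step in the abstract refers to.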


Author(s):  
Marion Duclercq ◽  
Daniel Broc

This paper deals with a vibratory fluid-structure interaction problem. It considers the two-dimensional case of a rigid, smooth, circular cylinder undergoing transverse sinusoidal oscillations while immersed in a viscous fluid otherwise at rest. Our work focuses on the in-line force acting on the cylinder in unsteady laminar flow. The aim is to understand how the force varies with time according to the configuration of the physical system. For that, the analysis also uses an energetic approach based on the power balance. The physical system can be characterized by two non-dimensional numbers: the Reynolds number (Re) compares the importance of the fluid viscosity to its inertia, and the Keulegan-Carpenter number (Kc) measures the amplitude of the cylinder displacement relative to its diameter. First, the incompressible Navier-Stokes equations are solved numerically by means of a finite element method. The flow structure is analyzed by determining the evolution, in time and throughout the computational domain, of flow quantities such as the pressure field, the vorticity field, and streamlines. We also calculate the values versus time of the different terms occurring in the mean force balance and power balance. We compare these results for several pairs (Kc, Re) of "extreme" values. Three characteristic configurations thus appear: the inertial Euler case (Kc≪1 and inviscid fluid), the Stokes case (Kc≪1 and Re≪1), and the drag case (Kc≫1). For these three reference configurations the physical mechanisms operating in the system are identified. But in intermediate cases, particularly when Kc>1, all of these mechanisms interact. Consequently, the evolution with time of the force acting on the cylinder is more complex and its interpretation becomes less straightforward. That is why a quantitative energetic analysis is carried out. We define a coefficient measuring the dissipative energy present in the flow, and then compare its values for different cases throughout the (Kc, Re) map.
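The two governing numbers follow directly from the cylinder kinematics; a minimal sketch with hypothetical dimensions (A the oscillation amplitude, T the period, D the diameter, nu the kinematic viscosity):

```python
import math

def kc_re(amplitude, period, diameter, nu):
    # For x(t) = A sin(2*pi*t/T), the peak velocity is Um = 2*pi*A/T.
    um = 2.0 * math.pi * amplitude / period
    kc = um * period / diameter   # Keulegan-Carpenter number, equals 2*pi*A/D
    re = um * diameter / nu       # Reynolds number based on Um and D
    return kc, re

kc, re = kc_re(amplitude=0.01, period=1.0, diameter=0.1, nu=1.0e-6)  # water-like nu
```

Sweeping A, T, and nu over such a grid is how the "extreme" (Kc, Re) pairs compared in the paper would be set up.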


2019 ◽  
Vol 2 (1) ◽  
pp. 79-91
Author(s):  
Amy Price ◽  
Maria Yulmetova ◽  
Sarah Khalil

Ice management is critical for safe and efficient operations in ice-covered waters; thus, it is important to understand the impact of the operator's experience on effective ice management performance. This study evaluated the confidence intervals of the mean and the probability distributions of two different sample groups, novice cadets and experienced seafarers, to assess whether there was a difference in effective ice management depending on the operator's level of experience. Ice management effectiveness, in this study, is represented by the "clearing-to-distance ratio": the ratio between the area of cleared ice (km²) and the distance travelled by an ice management vessel (km) to maintain that cleared area. The data analysed in this study were obtained from a recent study conducted by Memorial University's "Safety at Sea" research group. With the distribution fitting analysis providing inconclusive results regarding the normality of the data, the confidence intervals of the dataset means were obtained using both parametric approaches, such as the t-test, Cox's method, and the Johnson t-approach, and non-parametric methods, namely the Jackknife and Bootstrap methods, to examine whether the assumption of normality was valid. The comparison of the obtained confidence intervals demonstrates that the mean efficiency of the cadets is more consistent, while it is more varied among seafarers. A noticeable difference in ice management performance between the cadet and seafarer sample groups is revealed, thus showing that crew experience positively influences ice management effectiveness.
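The parametric-versus-nonparametric interval comparison can be sketched with SciPy on hypothetical ratio data (the study's actual dataset is not reproduced here):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
ratios = rng.lognormal(mean=0.0, sigma=0.4, size=40)  # hypothetical clearing-to-distance ratios

# parametric: Student-t interval for the mean (assumes approximate normality)
m, se = ratios.mean(), stats.sem(ratios)
t_low, t_high = stats.t.interval(0.95, df=ratios.size - 1, loc=m, scale=se)

# nonparametric: bootstrap percentile interval (no normality assumption)
boot = np.array([rng.choice(ratios, ratios.size, replace=True).mean() for _ in range(5000)])
b_low, b_high = np.percentile(boot, [2.5, 97.5])
```

When the two intervals roughly agree, as the study found across its methods, the normality assumption is unlikely to be driving the conclusions.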


Author(s):  
M. M. JOGLEKAR ◽  
N. RAMAKRISHNAN ◽  
SUDHAKAR INGUVA ◽  
SAMPATH K. VANIMISETTI ◽  
RYAN B. MOULLIET

A numerical study is performed on the cyclic capacity degradation of a lithium manganese oxide (LMO) cell, under 21 different combinations of temperature and state of charge (SOC), based on the phenomenological model developed earlier. Out of the 21 sets, six are used for fitting in order to establish the degradation parameters of the model, and the rest could be predicted with an average accuracy of about 90%. Two optimization algorithms (a genetic algorithm and Nelder-Mead) are used and the consistency of the convergence is verified. The discussion includes a sensitivity analysis of a selected set of degradation parameters. In addition, an analysis of the evolution of the solid electrolyte interphase (SEI) and isolation (islanding) mechanisms under varying conditions of SOC and temperature is performed, which brings out the relative contribution of each mechanism to the overall capacity fade.
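The fit-then-predict workflow can be illustrated with a generic square-root-of-cycles fade law (an assumption for illustration only; the paper's phenomenological model is richer and mechanism-specific):

```python
import numpy as np
from scipy.optimize import curve_fit

def fade(n_cycles, a):
    # generic capacity-fade law: fraction of initial capacity after n cycles
    return 1.0 - a * np.sqrt(n_cycles)

cycles = np.arange(0.0, 501.0, 50.0)
rng = np.random.default_rng(5)
capacity = fade(cycles, 2.0e-3) + rng.normal(0.0, 1.0e-3, cycles.size)  # synthetic data

(a_hat,), _ = curve_fit(fade, cycles, capacity, p0=[1.0e-3])  # fit on the "training" sets
predicted = fade(np.arange(501.0, 1001.0, 50.0), a_hat)       # predict the held-out range
```

Fitting on a subset of conditions and predicting the remainder, as in the six-of-21 split above, is the basic validation pattern this sketch mimics.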

