Application of Univariate Probability Distributions Fitting With Monte Carlo Simulation

2022 ◽  
Vol 5 (4) ◽  
Author(s):  
Muhammad Ilyas ◽  
Shaheen Abbas ◽  
Afzal Ali

In this study, we present a univariate probability distribution family through the fitting of three sub- and super-exponential tails, ranging from heavier-longer to lighter-shorter. This family includes the Lognormal, Gamma, and Weibull distributions; the adequacy of the distribution tails is assessed through goodness-of-fit tests and descriptive criteria. The approach emphasizes tail values and is independent of the number of intervals. For this purpose, time series of the logarithm of population data for the Karachi region over the last three centuries (1729 to 1946 and 1951 to 2018) are used, containing both irregular and regular lengths and peaks. The peak/tail fitting is attained through validation methods and normality tests and is characterized by a stochastic depiction. The Weibull and Lognormal distribution tails are found to be heavier by two validation tests (maximum likelihood estimation and probability of correct selection). In the final section, the fitted univariate probability distributions are used in a Monte Carlo simulation to generate synthetic population data; the results indicate that the heavy-tailed Lognormal and Weibull distributions fit better than the more commonly used lighter-tailed Gamma distribution. Accordingly, the Monte Carlo simulation applies the appropriate Lognormal and Weibull distributions to the irregular and regular data and generates data values (298 and 69) for the periods 1729 to 2020 and 1951 to 2020, respectively.
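To make the fitting-and-simulation workflow concrete, here is a minimal Python sketch of the same idea: fit the three candidate distributions by maximum likelihood, compare them with a Kolmogorov-Smirnov goodness-of-fit test, and regenerate a synthetic series from the best fit by Monte Carlo. The placeholder data, parameter values, and SciPy-based pipeline are assumptions for illustration, not the authors' exact procedure or the Karachi data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Placeholder for the log-population series; the Karachi data are not reproduced here.
data = rng.lognormal(mean=1.0, sigma=0.6, size=298)

candidates = {
    "lognorm": stats.lognorm,
    "gamma": stats.gamma,
    "weibull_min": stats.weibull_min,
}

results = {}
for name, dist in candidates.items():
    params = dist.fit(data)                      # maximum likelihood fit
    ks = stats.kstest(data, name, args=params)   # goodness-of-fit test
    results[name] = (params, ks.statistic)
    print(f"{name:12s} KS statistic = {ks.statistic:.4f}")

# Monte Carlo step: regenerate a synthetic series from the best-fitting model.
best = min(results, key=lambda k: results[k][1])
params, _ = results[best]
synthetic = candidates[best].rvs(*params, size=298, random_state=rng)
```

On genuinely heavy-tailed data, the lognormal or Weibull fit would typically outperform the gamma, mirroring the abstract's finding.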

2017 ◽  
Vol 3 (1) ◽  
pp. 27-38
Author(s):  
Zvonko Merkaš ◽  
Davor Perkov ◽  
Petra Miličević

The paper analyzes the possibilities of using statistical simulation in the measurement of macroeconomic risks. At the global level, macroeconomic risks have increased significantly due to excessive imbalances. Using analytical statistical methods and Monte Carlo simulation, the authors interpret the collected data sets and compare and analyze them in order to mitigate potential risks. The empirical part of the study is a qualitative case study that uses statistical methods and Monte Carlo simulation to manage macroeconomic risks, which is the central theme of this work. Statistical simulation is necessary because the system for which the model must be specified is too complex for an analytical approach. The objective of the paper is to highlight the need to consider significant macroeconomic risks, particularly the number of unemployed in a society, the movement of gross domestic product, and the country's credit rating, and to use data previously processed by statistical methods, through statistical simulation, to analyze the existing model of managing macroeconomic risks and to suggest elements of a management model that would minimize the probability and consequences of emerging macroeconomic risks. The stochastic character of the system, with random input variables defined by probability distributions, requires a large number of iterations over which the model output is recorded and mathematical expectations are calculated. The paper expounds the basic procedures and techniques of discrete statistical simulation applied to systems characterized by events that represent a change in the system's state, and the possibility of applying this approach to the assessment of macroeconomic risks. The method has few practical limitations: it can be used to study very complex systems with the help of special computer programs. It uses reasonable estimates for the important economic inputs to determine a set of results rather than a single outcome at one point in time, yielding a multiple-possibility estimate of risk performance across the range of economic variables used in the model. Influencing, and being influenced by, certain macroeconomic risks is one of the primary imperatives in today's economy; this paper therefore examines the mutual correlations among such risks and the application of statistical simulation to their measurement for better prevention and remediation.
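As a toy illustration of the discrete simulation loop described above, the following Python sketch draws macroeconomic inputs from assumed probability distributions, iterates many times, and estimates an expectation and a tail probability. All distributions, parameter values, and the risk indicator are hypothetical; the paper's actual data and model are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100_000  # number of Monte Carlo iterations

# Hypothetical input distributions for the macroeconomic drivers discussed above.
gdp_growth = rng.normal(loc=2.0, scale=1.5, size=N)          # % per year
unemployment = rng.lognormal(mean=2.0, sigma=0.25, size=N)   # % of labor force

# Toy risk indicator combining the drivers; purely illustrative.
risk_score = 0.6 * unemployment - 0.4 * gdp_growth

print("expected risk score:", risk_score.mean())
print("P(recession & high unemployment):",
      ((gdp_growth < 0) & (unemployment > 9)).mean())
```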


2012 ◽  
Vol 53 ◽  
Author(s):  
Gintautas Jakimauskas ◽  
Leonidas Sakalauskas

The efficiency of adding an auxiliary regression variable to the logit model for estimating small probabilities in large populations is considered. Consider two models for the distribution of the unknown probabilities: the probabilities follow a gamma distribution (model (A)), or the logits of the probabilities follow a Gaussian distribution (model (B)). In a modification of model (B), an additional regression variable is used for the Gaussian mean (model (BR)). Real data were selected from the Database of Indicators of Statistics Lithuania: working-age persons recognized as disabled for the first time, by administrative territory, in 2010 (number of populations K = 60). Additionally, average annual population data by administrative territory were used. The auxiliary regression variable was based on the number of hospital discharges by administrative territory in 2010. Initial parameters were obtained using simple iterative procedures for models (A), (B), and (BR). At the second stage, various tests were performed using Monte Carlo simulation with models (A), (B), and (BR). The main goal was to select an appropriate model and to propose recommendations for using gamma and logit models (with or without the auxiliary regression variable) for Bayesian estimation. The results show that the Monte Carlo simulation method enables determining which estimation model is preferable.
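A minimal sketch of the two data-generating models may help fix the ideas: under model (A) the small probabilities are drawn from a gamma distribution, while under model (B) their logits are Gaussian. All parameter values and population sizes below are hypothetical; the Lithuanian disability counts are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(1)
K = 60                                      # number of populations, as in the paper
n = rng.integers(5_000, 50_000, size=K)     # hypothetical population sizes

# Model (A): event probabilities follow a gamma distribution
# (parameters chosen so the probabilities stay small).
p_A = rng.gamma(shape=2.0, scale=0.001, size=K)

# Model (B): logits of the probabilities are Gaussian.
logits = rng.normal(loc=-6.0, scale=0.5, size=K)
p_B = 1.0 / (1.0 + np.exp(-logits))

# Simulated counts under each model; in a model-selection study these would be
# compared against observed counts over many Monte Carlo replications.
y_A = rng.binomial(n, p_A)
y_B = rng.binomial(n, p_B)
print("mean rate, model (A):", (y_A / n).mean())
print("mean rate, model (B):", (y_B / n).mean())
```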


2020 ◽  
Vol 41 (2) ◽  
pp. 219-229 ◽  
Author(s):  
Ricardo Hideaki Miyajima ◽  
Paulo Torres Fenner ◽  
Gislaine Cristina Batistela ◽  
Danilo Simões

The processing of Eucalyptus logs is a stage that follows the full-tree system in mechanized forest harvesting, commonly performed by a grapple saw. This activity carries associated uncertainties, especially regarding technical and silvicultural factors that can affect productivity and production costs. To address this problem, Monte Carlo simulation can be applied: a technique for quantifying the probabilities of values of factors under uncertainty, to which probability distributions are attributed. The objective of this study was to apply the Monte Carlo method to determine probabilistic technical-economic coefficients of log processing using two different grapple saw models. Field data were obtained from a planted Eucalyptus forest located in the State of São Paulo, Brazil. For the technical analysis, a time study protocol was applied using continuous reading of the operational cycle elements, from which productivity was obtained. The cost per scheduled hour was estimated using methods recommended by the Food and Agriculture Organization of the United Nations. Uncertainty was incorporated by applying the Monte Carlo simulation method, generating 100,000 random values. The results showed that the empty crane movement is the operational element with the greatest impact on total log processing time; the variables that most influence productivity are specific to each grapple saw model; and a difference of USD 0.04 per m3 in production costs was observed between processors with gripping areas of 0.58 m2 and 0.85 m2. The Monte Carlo method proved to be an applicable tool for mechanized wood harvesting, as it provides ranges of probability of occurrence for the operational elements and for the production cost.
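The following Python sketch shows the uncertainty-propagation step in miniature: assumed distributions for cycle time, log volume, and hourly cost are sampled 100,000 times (the same number of random values as in the study), yielding distributions of productivity and unit cost. All distributions and parameter values are illustrative, not the field-measured ones.

```python
import numpy as np

rng = np.random.default_rng(7)
N = 100_000  # draws, matching the number of random values in the study

# Hypothetical input distributions for one grapple saw model.
cycle_time_s = rng.lognormal(mean=np.log(30), sigma=0.25, size=N)    # s per cycle
volume_m3 = rng.normal(loc=0.35, scale=0.05, size=N).clip(min=0.05)  # m3 per cycle
hourly_cost = rng.triangular(left=95, mode=105, right=120, size=N)   # USD per scheduled hour

productivity = volume_m3 * 3600.0 / cycle_time_s   # m3 per hour
unit_cost = hourly_cost / productivity             # USD per m3

print("mean productivity [m3/h]:", productivity.mean())
print("P(unit cost > 10 USD/m3):", (unit_cost > 10).mean())
```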


2020 ◽  
Vol 10 (12) ◽  
pp. 4229 ◽  
Author(s):  
Alexander Heilmeier ◽  
Michael Graf ◽  
Johannes Betz ◽  
Markus Lienkamp

Applying an optimal race strategy is a decisive factor in achieving the best possible result in a motorsport race. This mainly implies timing the pit stops perfectly and choosing the optimal tire compounds. Strategy engineers use race simulations to assess the effects of different strategic decisions (e.g., early vs. late pit stop) on the race result before and during a race. However, in reality, races rarely run as planned and are often decided by random events, for example, accidents that cause safety car phases. In addition, the course of a race is affected by many smaller probabilistic influences, for example, variability in lap times. Consequently, these events and influences should be modeled within the race simulation if real races are to be simulated and a robust race strategy is to be determined. This paper therefore presents how state-of-the-art and new approaches can be combined to model the most important probabilistic influences on motorsport races: accidents and failures, full-course-yellow and safety car phases, the drivers' starting performance, and variability in lap times and pit stop durations. The modeling uses customized probability distributions as well as a novel "ghost car" approach, which allows a realistic consideration of the effect of safety cars within the race simulation. The interaction of all influences is evaluated with the Monte Carlo method. The results demonstrate the validity of the models and show how Monte Carlo simulation enables assessing the robustness of race strategies. Knowing the robustness improves the basis for a reasonable determination of race strategies by strategy engineers.
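A stripped-down version of such a probabilistic race simulation can be sketched in a few lines of Python: lap time variability is a Gaussian perturbation, safety car phases occur with a small per-lap probability and slow the affected laps, and the race is replayed many times by Monte Carlo. The lap times, probabilities, and pit stop duration are assumptions for illustration and do not reflect the paper's calibrated models or its ghost car mechanism.

```python
import numpy as np

rng = np.random.default_rng(11)
N_RACES, N_LAPS = 10_000, 55
BASE_LAP = 90.0          # s, hypothetical base lap time
P_SC_PER_LAP = 0.01      # hypothetical per-lap safety car probability

race_times = np.empty(N_RACES)
for i in range(N_RACES):
    laps = BASE_LAP + rng.normal(0.0, 0.4, size=N_LAPS)  # lap time variability
    sc_laps = rng.random(N_LAPS) < P_SC_PER_LAP          # safety car phases
    laps[sc_laps] *= 1.3                                  # slower laps under safety car
    race_times[i] = laps.sum() + 22.0                     # one pit stop of 22 s

print("mean race time [s]:", race_times.mean())
print("5th-95th percentile spread [s]:",
      np.percentile(race_times, 95) - np.percentile(race_times, 5))
```

The spread of simulated race times across replays is exactly what makes robustness statements about a strategy possible.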


1991 ◽  
Vol 113 (3) ◽  
pp. 253-259
Author(s):  
A. B. Dunwoody

A method is presented for calculating the reliability of a structure against drifting ice, subject to restrictions on the form of the ice load model and on the form of the probability distributions of the ice feature characteristics. The ice load model must have the form that the ice load is proportional to the product of the characteristics of the impacting ice feature, each raised to an individual power. Results from a Monte Carlo simulation program demonstrate that the ice loads for a number of useful ice interaction scenarios can be modeled by an equation of this form. The probability distributions of the ice feature characteristics must be from the log-normal family. A realistic example using publicly available ice data and an ice load model is presented.
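Because the load is a product of log-normally distributed characteristics raised to fixed powers, its logarithm is Gaussian, so the failure probability admits a closed form; the Monte Carlo sketch below illustrates the same computation numerically. All characteristics, exponents, and the capacity value are invented for illustration, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(3)
N = 1_000_000

# Hypothetical ice feature characteristics, each log-normal as the method requires.
thickness = rng.lognormal(mean=np.log(1.2), sigma=0.35, size=N)   # m
velocity = rng.lognormal(mean=np.log(0.5), sigma=0.40, size=N)    # m/s
width = rng.lognormal(mean=np.log(40.0), sigma=0.50, size=N)      # m

# Load model of the required multiplicative form: F = C * h^a * v^b * w^c
# (coefficients are illustrative, not from the paper).
C, a, b, c = 2.0, 1.4, 0.8, 0.6
load_MN = C * thickness**a * velocity**b * width**c

capacity_MN = 60.0  # hypothetical structural capacity
print("P(load > capacity) per impact:", (load_MN > capacity_MN).mean())
```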


2019 ◽  
Vol 42 (2) ◽  
pp. 143-166 ◽  
Author(s):  
Renato Santos Silva ◽  
Fernando Ferraz Nascimento

Extreme Value Theory (EVT) is an important tool for predicting extreme gains and losses. Its main areas of application are economics and environmental science. Initially, such events were modeled with standard parametric distributions such as the Normal and Gamma. However, economic and environmental data are, in most cases, heavy-tailed, in contrast to those distributions, which makes it very difficult to model extreme events; in particular, conventional models can hardly be used to make predictions about unobserved events that exceed the maximum of the observations. In some situations, EVT is used to analyse only the maxima of a dataset, which provides few observations; in those cases, it is more effective to use the r largest order statistics. This paper proposes Bayesian estimators for the parameters of the r largest order statistics model. Monte Carlo simulation was used to analyze the data and to study properties of these estimators, such as the mean, variance, bias, and root mean square error (RMSE). The parameter estimation provided inference for the parameters and the return levels. The paper also presents a procedure for choosing the optimal r for the r largest order statistics, based on a Bayesian approach applying Markov chain Monte Carlo (MCMC). Simulation results reveal that the Bayesian approach performs similarly to maximum likelihood estimation, and the applications, developed using the Bayesian approach, showed a gain in accuracy compared with other estimators.
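To fix ideas, the sketch below extracts the r largest order statistics from each block of a simulated heavy-tailed series and fits a GEV to the block maxima as a baseline; the full Bayesian r-largest analysis with MCMC is only indicated in a comment. The block structure, sample sizes, and Student-t data generator are assumptions for illustration.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
n_blocks, block_size, r = 50, 365, 3   # e.g., the 3 largest values per year

# Hypothetical heavy-tailed daily observations grouped into blocks (years).
data = rng.standard_t(df=4, size=(n_blocks, block_size))

# r largest order statistics per block: more information than the maxima alone.
r_largest = np.sort(data, axis=1)[:, -r:]

# Baseline: MLE of the GEV on the block maxima only (scipy's genextreme).
shape, loc, scale = stats.genextreme.fit(r_largest[:, -1])
print("GEV fit on maxima: shape=%.3f loc=%.3f scale=%.3f" % (shape, loc, scale))
# A Bayesian r-largest analysis would build the joint r-largest likelihood and
# sample its posterior with MCMC; that machinery is beyond this sketch.
```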


2015 ◽  
Vol 2015 ◽  
pp. 1-12
Author(s):  
Mohammed Alguraibawi ◽  
Habshah Midi ◽  
A. H. M. Rahmatullah Imon

Identification of high leverage points is crucial because they are responsible for inaccurate predictions and invalid inferential statements, as they have a large impact on the computed values of various estimates. It is essential to classify high leverage points into good and bad leverage points because only the bad leverage points have an undue effect on the parameter estimates. It is now evident that when a group of high leverage points is present in a data set, the existing robust diagnostic plot fails to classify them correctly, owing to masking and swamping effects. In this paper, we propose a new robust diagnostic plot to correctly classify good and bad leverage points by reducing both masking and swamping effects. The formulation of the proposed plot is based on the Modified Generalized Studentized Residuals. We investigate the performance of the proposed method through a Monte Carlo simulation study and several well-known data sets. The results indicate that the proposed method improves the detection rate of bad leverage points and reduces swamping and masking effects.
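The following sketch conveys the classification idea with off-the-shelf tools: robust Mahalanobis distances from the minimum covariance determinant (MCD) estimator flag high leverage points, and large residuals separate the bad ones from the good ones. It substitutes ordinary standardized residuals for the paper's Modified Generalized Studentized Residuals, which are not reimplemented here; the simulated data and cutoffs are illustrative.

```python
import numpy as np
from scipy.stats import chi2
from sklearn.covariance import MinCovDet
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(9)
n, p = 100, 2
X = rng.normal(size=(n, p))
X[:10] += 6.0                          # ten high leverage points in X-space
y = X @ np.array([2.0, -1.0]) + rng.normal(scale=0.5, size=n)
y[5:10] += 15.0                        # five are also outlying in y: bad leverage

# Robust leverage: squared Mahalanobis distances from the MCD estimator.
rd2 = MinCovDet(random_state=0).fit(X).mahalanobis(X)
high_leverage = rd2 > chi2.ppf(0.975, df=p)

# Ordinary standardized residuals stand in for the paper's Modified
# Generalized Studentized Residuals.
res = y - LinearRegression().fit(X, y).predict(X)
outlying = np.abs(res / res.std()) > 2.5

print("good leverage points:", np.sum(high_leverage & ~outlying))
print("bad leverage points: ", np.sum(high_leverage & outlying))
```

Using a robust covariance estimator for the distances is what keeps groups of leverage points from masking one another.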


2014 ◽  
Vol 687-691 ◽  
pp. 1198-1201
Author(s):  
Bin Liu ◽  
Yi Min Shi ◽  
Jing Cai ◽  
Mo Chen

A Type-II generalized progressively hybrid censoring scheme with masked data is presented. Based on masked system lifetime data, and using the expectation-maximization algorithm together with a quasi-Newton method, we obtain the maximum likelihood estimates (MLEs) of the component distribution parameters in the Weibull case. Finally, a Monte Carlo simulation is presented to illustrate the effectiveness of the method.
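As a simplified stand-in for the estimation step, the sketch below maximizes a censored Weibull log-likelihood with a quasi-Newton (BFGS) optimizer. It uses ordinary Type-II censoring and no masking or EM step, so it only gestures at the paper's generalized progressively hybrid scheme; all sample sizes and true parameters are invented.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(13)
n, m = 50, 35                       # n units on test, stop at the m-th failure (Type-II)
true_shape, true_scale = 1.8, 100.0
t = np.sort(rng.weibull(true_shape, size=n) * true_scale)
obs, cens = t[:m], np.full(n - m, t[m - 1])  # censored units survive past t_(m)

def neg_loglik(theta):
    k, lam = np.exp(theta)          # log-parameterization keeps k, lam > 0
    logf = np.log(k / lam) + (k - 1) * np.log(obs / lam) - (obs / lam) ** k
    logS = -(cens / lam) ** k       # Weibull log-survival for censored units
    return -(logf.sum() + logS.sum())

# Quasi-Newton (BFGS) maximization, echoing the abstract's estimation step.
fit = minimize(neg_loglik, x0=np.log([1.0, np.median(obs)]), method="BFGS")
k_hat, lam_hat = np.exp(fit.x)
print(f"shape = {k_hat:.2f}, scale = {lam_hat:.1f}")
```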


Author(s):  
RS Sinha ◽  
AK Mukhopadhyay

The primary crusher is essential equipment employed for comminuting the mineral in processing plants. Any failure of its components will accordingly hinder the performance of the plant. Therefore, to minimize sudden failures, analysis should be undertaken to improve the performance and operational reliability of the crushers and their components. This paper considers methods for analyzing the failure rates of a jaw crusher and its critical components in a mineral processing plant, applying a two-parameter Weibull distribution fitted and assessed using statistical tests such as goodness-of-fit tests and maximum likelihood estimation. Monte Carlo simulation, analysis of variance (ANOVA), and an artificial neural network are also applied. The two-parameter Weibull distribution is found to be the best-fit distribution according to the Kolmogorov–Smirnov test. The maximum likelihood estimation method is used to find the shape and scale parameters of the two-parameter Weibull distribution. Monte Carlo simulation generates 40 values each of the shape parameter, scale parameter, and time. These 40 sets of Weibull distribution parameters are then evaluated to examine the failure rate, significant differences, and the regression coefficient using ANOVA. An artificial neural network with the back-propagation algorithm is used to determine R2, which is compared with the ANOVA result.
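A compact Python sketch of the core fitting pipeline: a two-parameter Weibull MLE (location fixed at zero), a Kolmogorov-Smirnov check, and a Monte Carlo step producing 40 shape/scale pairs, here via bootstrap resampling since the paper's exact generation scheme is not specified. The failure data are simulated placeholders, not the crusher records.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(17)
# Hypothetical time-between-failure data [h].
tbf = rng.weibull(1.3, size=60) * 400.0

# Two-parameter Weibull MLE (location fixed at zero) and Kolmogorov-Smirnov test.
shape, loc, scale = stats.weibull_min.fit(tbf, floc=0)
ks = stats.kstest(tbf, "weibull_min", args=(shape, loc, scale))
print(f"shape={shape:.3f} scale={scale:.1f} KS p-value={ks.pvalue:.3f}")

# Monte Carlo step: refit 40 bootstrap resamples to obtain 40 parameter pairs,
# mirroring the paper's 40 simulated shape/scale values.
params = []
for _ in range(40):
    sample = rng.choice(tbf, size=tbf.size, replace=True)
    s, _, sc = stats.weibull_min.fit(sample, floc=0)
    params.append((s, sc))
    # failure rate at time t for each pair: h(t) = (s/sc) * (t/sc)**(s-1)

print("mean bootstrap shape:", np.mean([s for s, _ in params]))
```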

