On Effect of Model Parameters on Departure Process in a Production System with Failures

2014 ◽  
Vol 1036 ◽  
pp. 927-932 ◽  
Author(s):  
Wojciech M. Kempa ◽  
Iwona Paprocka ◽  
Cezary Grabowik ◽  
Krzysztof Kalinowski

A queueing system of the M/M/1/N type with cyclic failure-free and repair times is used as a model of a single-machine manufacturing line. Jobs arrive according to a Poisson process and are served with exponentially distributed processing times. Successive working (failure-free) and repair times also have exponential distributions. Based on a system of integral equations for the double transforms of the conditional probability distributions of the number of jobs completely processed before a fixed time (the departure process), a comprehensive numerical analysis of the impact of system parameters on the mean number of departures before a fixed epoch T>0 is carried out.
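The transform-based analysis can be cross-checked with a direct Monte Carlo sketch of the same model. All rates below (arrival rate lam, service rate mu, failure rate eta, repair rate theta) are illustrative placeholders, not values from the paper, and the simulation is only a sanity check, not the authors' analytic method.

```python
import random

def mean_departures(lam, mu, eta, theta, N, T, runs=300, seed=1):
    """Estimate the mean number of jobs completed before time T in an
    M/M/1/N queue whose server alternates between exponential working
    periods (failure at rate eta) and repair periods (repair at rate theta)."""
    rng = random.Random(seed)
    total = 0
    for _ in range(runs):
        t, q, up, done = 0.0, 0, True, 0
        next_switch = rng.expovariate(eta)   # machine starts in a working period
        while True:
            rates = lam + (mu if up and q > 0 else 0.0)
            dt = rng.expovariate(rates)
            if min(t + dt, next_switch) >= T:
                break                        # no further event before T
            if next_switch < t + dt:         # failure or repair comes first
                t = next_switch
                up = not up
                next_switch = t + rng.expovariate(eta if up else theta)
                continue
            t += dt
            if up and q > 0 and rng.random() < mu / rates:
                q -= 1                       # service completion: a departure
                done += 1
            elif q < N:
                q += 1                       # arrival accepted (buffer not full)
        total += done
    return total / runs
```

Throughput should visibly drop as repairs get slower, e.g. comparing theta=5.0 (fast repair) against theta=0.2 (slow repair) with the other parameters held fixed.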

1997 ◽  
Vol 161 ◽  
pp. 197-201 ◽  
Author(s):  
Duncan Steel

Abstract. Whilst lithopanspermia depends upon massive impacts occurring at a speed above some limit, the intact delivery of organic chemicals or other volatiles to a planet requires the impact speed to be below some other limit such that a significant fraction of that material escapes destruction. Thus the two opposite ends of the impact speed distributions are the regions of interest in the bioastronomical context, whereas much modelling work on impacts delivers, or makes use of, only the mean speed. Here the probability distributions of impact speeds upon Mars are calculated for (i) the orbital distribution of known asteroids; and (ii) the expected distribution of near-parabolic cometary orbits. It is found that cometary impacts are far more likely to eject rocks from Mars (over 99 percent of the cometary impacts are at speeds above 20 km/sec, but at most 5 percent of the asteroidal impacts); paradoxically, the objects impacting at speeds low enough to make organic/volatile survival possible (the asteroids) are those which are depleted in such species.


1995 ◽  
Vol 46 (1) ◽  
pp. 359 ◽  
Author(s):  
J. Persson ◽ 
L. Håkanson

Bottom dynamic conditions (areas of accumulation, erosion or transportation) in aquatic ecosystems influence the dispersal, sedimentation and recirculation of most substances, such as metals, organic toxins and nutrients. The aim of the present work was to establish a simple and general method to predict sediment types/bottom dynamic conditions in Baltic coastal areas. As a working hypothesis, it is proposed that the morphometry and the absence or presence of an archipelago outside a given coastal area regulate what factors determine the prevailing bottom dynamic conditions. Empirical data on the proportion of accumulation bottoms (BA) were collected from 38 relatively small (1-14 km²) and enclosed coastal areas in the Baltic Sea. Morphometric data were obtained by using a digital technique to transfer information from standard bathymetric maps into a computer. Data were processed by means of multivariate statistical methods. In the first model, based on data from all 38 areas, 55% of the variation in BA among the areas was statistically explained by five morphometric parameters. The data set was then divided into two parts: areas in direct connection with the open sea, and areas inside an archipelago. In the second model, based on data from 15 areas in direct connection with the open sea, 77% of the variation in BA was statistically explained by the mean depth of the deep water (the water mass below 10 m) and the mean slope. In the third model, based on data from 23 areas inside an archipelago, 70% of the variation in BA was statistically explained by the mean slope, the topographic form factor, the proportion of islands and the mean filter factor (which is a relative measure of the impact of winds and waves from outside the area). The model parameters describe the sediment trapping capacity of the areas investigated.
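The multivariate models described above are, in essence, least-squares regressions of BA on morphometric predictors. The sketch below illustrates that structure on fabricated data; the predictor names, coefficients and noise level are invented for illustration and are not the study's values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Fabricated stand-ins for two of the study's morphometric predictors
# (the real second model used 15 areas in direct connection with the open sea):
n = 15
slope = rng.uniform(1.0, 10.0, n)        # mean slope
deep_depth = rng.uniform(10.0, 40.0, n)  # mean depth of the deep water (m)

# Fabricated response: proportion of accumulation bottoms BA (%),
# with coefficients chosen only for illustration.
BA = 20.0 + 1.5 * deep_depth - 2.0 * slope + rng.normal(0.0, 5.0, n)

# Ordinary least squares fit of BA on the two predictors.
X = np.column_stack([np.ones(n), deep_depth, slope])
coef, *_ = np.linalg.lstsq(X, BA, rcond=None)

# Share of variance "statistically explained", in the sense the abstract
# reports per model (e.g. 77% for the open-sea areas).
resid = BA - X @ coef
r2 = 1.0 - resid.var() / BA.var()
```

The fitted coefficients recover the signs built into the synthetic data, and r2 plays the role of the explained-variance percentages quoted in the abstract.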


2011 ◽  
Vol 31 (4) ◽  
pp. 530-539 ◽  
Author(s):  
Karen M. Kuntz ◽  
Iris Lansdorp-Vogelaar ◽  
Carolyn M. Rutter ◽  
Amy B. Knudsen ◽  
Marjolein van Ballegooijen ◽  
...  

Background. As the complexity of microsimulation models increases, concerns about model transparency are heightened. Methods. The authors conducted model “experiments” to explore the impact of variations in “deep” model parameters using 3 colorectal cancer (CRC) models. All natural history models were calibrated to match observed data on adenoma prevalence and cancer incidence but varied in their underlying specification of the adenocarcinoma process. The authors projected CRC incidence among individuals with an underlying adenoma or preclinical cancer v. those without any underlying condition and examined the impact of removing adenomas. They calculated the percentage of simulated CRC cases arising from adenomas that developed within 10 or 20 years prior to cancer diagnosis and estimated dwell time—defined as the time from the development of an adenoma to symptom-detected cancer in the absence of screening among individuals with a CRC diagnosis. Results. The 20-year CRC incidence among 55-year-old individuals with an adenoma or preclinical cancer was 7 to 75 times greater than in the condition-free group. The removal of all adenomas among the subgroup with an underlying adenoma or cancer resulted in a reduction of 30% to 89% in cumulative incidence. Among CRCs diagnosed at age 65 years, the proportion arising from adenomas formed within 10 years ranged between 4% and 67%. The mean dwell time varied from 10.6 to 25.8 years. Conclusions. Models that all match observed data on adenoma prevalence and cancer incidence can produce quite different dwell times and very different answers with respect to the effectiveness of interventions. When conducting applied analyses to inform policy, using multiple models provides a sensitivity analysis on key (unobserved) “deep” model parameters and can provide guidance about specific areas in need of additional research and validation.


Author(s):  
Yuri Popkov ◽  
Yuri Dubnov ◽  
Alexey Popkov

The paper is devoted to forecasting the COVID-19 epidemic with the novel method of randomized machine learning. This method is based on the idea of estimating the probability distributions of model parameters and noises from real data. Entropy-optimal distributions correspond to the state of maximum uncertainty, which allows the resulting forecasts to be used as forecasts of the most "negative" scenario of the process under study. The resulting estimates of parameters and noises, being probability distributions, are sampled to obtain an ensemble of trajectories that is then analyzed by statistical methods. In this work, for the purposes of such an analysis, the mean and median trajectories over the ensemble are calculated, as well as the trajectory corresponding to the mean over the distribution of the model parameters. The proposed approach is used to predict the total number of infected people using a three-parameter logistic growth model. The experiment is based on real COVID-19 epidemic data from several countries of the European Union. Its main goal is to demonstrate the entropy-randomized approach for predicting the epidemic process from real data near the peak. The significant uncertainty contained in the available real data is modeled by additive noise of up to 30%, which is used at both the training and prediction stages. The hyperparameters of the model are tuned on a testing dataset, with subsequent retraining of the model. It is shown that, with the same datasets, the proposed approach predicts the development of the epidemic more efficiently than the standard approach based on the least-squares method.
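The least-squares baseline against which the randomized method is compared can be sketched as follows: fit the three-parameter logistic curve to noisy cumulative counts. The data here are synthetic (parameter values and noise draws are invented), and a coarse grid search stands in for a proper optimizer; this illustrates only the baseline, not the entropy-randomized method itself.

```python
import numpy as np

def logistic(t, K, r, t0):
    """Three-parameter logistic growth curve for the cumulative count."""
    return K / (1.0 + np.exp(-r * (t - t0)))

rng = np.random.default_rng(42)
t = np.arange(60.0)
true_K, true_r, true_t0 = 1.0e5, 0.2, 30.0   # illustrative, not real data

# Synthetic "observations" with multiplicative noise of up to 30%,
# mimicking the uncertainty level assumed in the paper.
y = logistic(t, true_K, true_r, true_t0) * (1.0 + rng.uniform(-0.3, 0.3, t.size))

# Least-squares fit by a coarse grid search over (K, r, t0).
best, best_sse = None, np.inf
for K in np.linspace(5.0e4, 2.0e5, 21):
    for r in np.linspace(0.05, 0.5, 21):
        for t0 in np.linspace(10.0, 50.0, 21):
            sse = float(np.sum((logistic(t, K, r, t0) - y) ** 2))
            if sse < best_sse:
                best, best_sse = (K, r, t0), sse
K_hat, r_hat, t0_hat = best
```

Even with 30% noise the saturation level K is recovered reasonably well, which is why the interesting comparison in the paper is forecasting from data near the peak, where the plateau is not yet visible.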


Mathematics ◽  
2021 ◽  
Vol 9 (24) ◽  
pp. 3283
Author(s):  
Mustafa Demircioglu ◽  
Herwig Bruneel ◽  
Sabine Wittevrongel

Queueing models with disasters can be used to evaluate the impact of a breakdown or a system reset in a service facility. In this paper, we consider a discrete-time single-server queueing system with general independent arrivals and general independent service times and we study the effect of the occurrence of disasters on the queueing behavior. Disasters occur independently from time slot to time slot according to a Bernoulli process and result in the simultaneous removal of all customers from the queueing system. General probability distributions are allowed for both the number of customer arrivals during a slot and the length of the service time of a customer (expressed in slots). Using a two-dimensional Markovian state description of the system, we obtain expressions for the probability generating functions, the mean values, variances and tail probabilities of both the system content and the sojourn time of an arbitrary customer under a first-come-first-served policy. The customer loss probability due to a disaster occurrence is derived as well. Some numerical illustrations are given.
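A minimal slot-by-slot sketch of such a system is given below. The within-slot ordering (disaster, then arrivals, then one slot of service) and the example arrival and service distributions are assumptions for illustration, not the paper's exact convention or its analytic generating-function method.

```python
import random

def mean_content(arr_dist, srv_dist, d, slots=100000, seed=7):
    """Discrete-time single-server queue with Bernoulli disasters.
    Each slot: with probability d a disaster removes all customers;
    then arr_dist(rng) customers arrive; then the head-of-line service
    (of length srv_dist(rng) slots) progresses by one slot.
    Returns the time-averaged system content."""
    rng = random.Random(seed)
    q, remaining, acc = 0, 0, 0
    for _ in range(slots):
        if rng.random() < d:              # disaster: flush the whole system
            q, remaining = 0, 0
        q += arr_dist(rng)                # general independent arrivals
        if q > 0:
            if remaining == 0:
                remaining = srv_dist(rng)  # start a new service
            remaining -= 1
            if remaining == 0:
                q -= 1                    # service completion: a departure
        acc += q
    return acc / slots

def bern_arrivals(rng):                   # 0 or 1 arrival per slot (rate 0.3)
    return 1 if rng.random() < 0.3 else 0

def geo_service(rng):                     # geometric service time, mean 2 slots
    s = 1
    while rng.random() < 0.5:
        s += 1
    return s
```

Comparing d=0 with a small positive d shows the qualitative effect studied in the paper: disasters lower the mean system content at the price of lost customers.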


2019 ◽  
Vol 2 (1) ◽  
pp. 79-91
Author(s):  
Amy Price ◽  
Maria Yulmetova ◽  
Sarah Khalil

Abstract. Ice management is critical for safe and efficient operations in ice-covered waters; it is therefore important to understand the impact of the operator's experience on effective ice management performance. This study evaluated the confidence intervals of the mean and the probability distributions of two sample groups, novice cadets and experienced seafarers, to determine whether the operator's level of experience makes a difference to effective ice management. Ice management effectiveness is represented here by the "clearing-to-distance ratio": the ratio between the area of cleared ice (km²) and the distance travelled by an ice management vessel (km) to maintain that cleared area. The data analysed in this study were obtained from a recent study conducted by Memorial University's "Safety at Sea" research group. With the distribution fitting analysis providing inconclusive results regarding the normality of the data, the confidence intervals of the dataset means were obtained using both parametric approaches, such as the t-test, Cox's method and the Johnson t-approach, and non-parametric methods, namely the jackknife and the bootstrap, to examine whether the assumption of normality was valid. The comparison of the resulting confidence intervals shows that the mean efficiency of the cadets is more consistent, while it is more varied among the seafarers. A noticeable difference in ice management performance between the cadet and seafarer groups is revealed, indicating that crew experience positively influences ice management effectiveness.
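Of the methods listed, the non-parametric bootstrap is the easiest to sketch: resample the data with replacement and take percentiles of the resampled means. The clearing-to-distance ratios below are hypothetical numbers, not the study's data.

```python
import random
import statistics

def bootstrap_ci(data, level=0.95, reps=5000, seed=3):
    """Percentile bootstrap confidence interval for the mean, one of the
    non-parametric methods compared in the study (alongside the jackknife
    and the parametric t-based intervals)."""
    rng = random.Random(seed)
    n = len(data)
    means = sorted(
        statistics.fmean(rng.choices(data, k=n)) for _ in range(reps)
    )
    lo = means[int((1 - level) / 2 * reps)]   # lower percentile of resampled means
    hi = means[int((1 + level) / 2 * reps)]   # upper percentile
    return lo, hi

# Hypothetical clearing-to-distance ratios (km² per km) for one group:
ratios = [0.42, 0.55, 0.38, 0.61, 0.47, 0.52, 0.44, 0.58, 0.40, 0.49]
lo, hi = bootstrap_ci(ratios)
```

Because no normality assumption is needed, agreement (or disagreement) between this interval and the t-based one is exactly the check the study uses to probe whether the normality assumption was valid.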


2018 ◽  
Author(s):  
Alex G. Libardoni ◽  
Chris E. Forest ◽  
Andrei P. Sokolov ◽  
Erwan Monier

Abstract. For over twenty years, the Massachusetts Institute of Technology Earth System Model (MESM) has been used extensively for climate change research. The model is under continuous development, with components being added or updated. To provide transparency in the model development, we perform a baseline evaluation of the newest version by comparing model behavior and properties to the previous model version. In particular, the impacts resulting from updates to the land surface model component and the input forcings used in historical simulations of climate change are investigated. We run an 1800-member ensemble of MESM historical climate simulations where the model parameters that set climate sensitivity, ocean heat uptake, and the net anthropogenic aerosol forcing are systematically varied. By comparing model output to observed patterns of surface temperature changes, the linear trend in the increase in ocean heat content, and upper-air temperature changes, we derive probability distributions for the three model parameters. Furthermore, we run a 372-member ensemble of transient climate simulations where all model forcings are held fixed except carbon dioxide concentrations, which increase at a rate of 1 % per year. From these runs, we derive a response surface for transient climate response and thermosteric sea level rise as a function of climate sensitivity and ocean heat uptake. We compare the probability distributions and response surfaces derived using the current version of MESM to the preceding version to evaluate the impact of the updated land surface model and forcing suite. We show that the probability distributions shift towards higher climate sensitivities and weaker aerosol forcing in response to the new forcing suite. The climate response surfaces are relatively unchanged between model versions, indicating that the updated land surface model has limited impact on temperature evolution in the model.


2014 ◽  
Vol 2014 ◽  
pp. 1-10 ◽  
Author(s):  
Veena Goswami

This paper analyzes customers' impatience in a Markovian queueing system with multiple working vacations and Bernoulli-schedule vacation interruption, where customers' impatience is due to the server's vacation. During the working vacation period, if there are customers in the queue, the vacation can be interrupted at a service completion instant: the server begins a regular busy period with probability 1-q or continues the vacation with probability q. We obtain the probability generating functions of the stationary state probabilities and deduce explicit expressions for the system sizes when the server is in a normal service period and in a Bernoulli-schedule vacation interruption, respectively. Various performance measures, such as the mean system size, the proportion of customers served, the rate of abandonment due to impatience, and the mean sojourn time of a served customer, are derived. We obtain stochastic decomposition structures of the queue length and the waiting time. Finally, some numerical results showing the impact of model parameters on the performance measures of the system are presented.
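The model's dynamics can be sketched with an event-driven simulation. All rates below are invented for illustration, and details such as which customers are subject to impatience follow one plausible reading of the model, not necessarily the paper's exact assumptions; the paper's own results are analytic, via generating functions.

```python
import random

def mean_system_size(lam, mu_b, mu_v, theta, xi, q, T=20000.0, seed=5):
    """Event-driven sketch of an M/M/1 queue with multiple working vacations,
    Bernoulli-schedule vacation interruption and impatient customers.
    Rates: arrivals lam, normal service mu_b, vacation-period service mu_v,
    vacation end theta, per-customer abandonment xi (during vacations only).
    At a service completion during a vacation with customers remaining, the
    vacation is interrupted with probability 1-q, as in the model above.
    Returns the time-averaged system size."""
    rng = random.Random(seed)
    t, n, on_vac, area = 0.0, 0, True, 0.0
    while t < T:
        rates = {'arr': lam}
        if n > 0:
            rates['srv'] = mu_v if on_vac else mu_b
        if on_vac:
            rates['vend'] = theta
            if n > 0:
                rates['aband'] = n * xi       # impatience during the vacation
        total = sum(rates.values())
        dt = rng.expovariate(total)
        area += n * min(dt, T - t)            # time-average accumulator
        t += dt
        if t >= T:
            break
        u, ev = rng.random() * total, 'arr'
        for name, r in rates.items():         # pick the event that fired
            if u < r:
                ev = name
                break
            u -= r
        if ev == 'arr':
            n += 1
        elif ev == 'srv':
            n -= 1
            if on_vac:
                if n > 0 and rng.random() < 1 - q:
                    on_vac = False            # Bernoulli interruption
            elif n == 0:
                on_vac = True                 # empty system: take a vacation
        elif ev == 'aband':
            n -= 1                            # an impatient customer leaves
        elif ev == 'vend':
            on_vac = (n == 0)                 # multiple-vacation policy
    return area / T
```

Setting xi=0 switches impatience off, so comparing the two runs shows the qualitative effect the paper quantifies: abandonment during vacations reduces the mean system size.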


2014 ◽  
Vol 1036 ◽  
pp. 846-851 ◽  
Author(s):  
Wojciech M. Kempa ◽  
Iwona Paprocka ◽  
Krzysztof Kalinowski ◽  
Cezary Grabowik

A finite-buffer queueing system of the M/M/1/N type is used for modeling the operation of a single-machine production line with cyclic failure-free and repair periods. Jobs arrive randomly according to a Poisson process and are processed individually, with service times having a common exponential distribution. After an exponentially distributed working period, a breakdown of the machine occurs, starting an exponentially distributed repair time during which the service process is stopped. At the completion epoch of the repair time a new working period begins, and so on. A system of integral equations for the conditional probability distributions of the number of jobs completely processed before a fixed time t (the departure process) is built, using the concept of an embedded Markov chain and the law of total probability. Applying a linear-algebraic approach, a compact-form solution of the corresponding system, written for the double transforms of the departure process, is found.


2012 ◽  
Vol 9 (8) ◽  
pp. 2889-2904 ◽  
Author(s):  
I. G. Enting ◽  
P. J. Rayner ◽  
P. Ciais

Abstract. Characterisation of estimates of regional carbon budgets and processes is inherently a statistical task. In full form this means that almost all quantities used or produced are realizations or instances of probability distributions. We usually compress the description of these distributions by using some kind of location parameter (e.g. the mean) and some measure of spread or uncertainty (e.g. the standard deviation). Characterising and calculating these uncertainties, and their structure in space and time, is as important as the location parameter, but uncertainties are both hard to calculate and hard to interpret. In this paper we describe the various classes of uncertainty that arise in a process like RECCAP and describe how they interact in formal estimation procedures. We also point out the impact these uncertainties will have on the various RECCAP synthesis activities.

