Experimental Error Measurement in Monte Carlo Simulation

Author(s):  
Lucia Cassettari
Roberto Mosca
Roberto Revetria

This chapter describes the sequence of set-up steps developed by the Genoa Research Group on Production System Simulation in the early 1980s. Through this sequence it is possible first to validate the simulator statistically, then to identify the variables that genuinely affect the different target functions, then to obtain, through regression meta-models, the relations linking the independent variables to the dependent ones (the target functions), and finally to determine the optimal operating conditions. The authors pay particular attention to the treatment, evaluation, and control of the experimental error, in the form of the Mean Square Pure Error (MSPE), a quantity that is routinely neglected in traditional experimentation on simulation models but whose magnitude can substantially invalidate the results obtained from the model.
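
As a rough illustration of the MSPE idea (a minimal sketch, not the Genoa group's actual procedure), the following Python snippet replicates a toy stochastic simulator at a single design point, changing only the random seed between runs, and computes the pure-error mean square from the replicates. The `run_simulation` function and its parameters are hypothetical stand-ins for any simulation model whose output varies with the seed.

```python
import random
import statistics

def run_simulation(seed, n_customers=1000):
    """Toy stochastic simulator: mean of exponential service times.
    A stand-in for any model whose output depends on the random seed."""
    rng = random.Random(seed)
    return statistics.fmean(rng.expovariate(1.0) for _ in range(n_customers))

def mspe(replicates):
    """Mean Square Pure Error: sample variance of replicated runs taken
    at the same design point, differing only in the random seed."""
    mean = statistics.fmean(replicates)
    return sum((y - mean) ** 2 for y in replicates) / (len(replicates) - 1)

# Replicate the model at one design point with independent seeds
ys = [run_simulation(seed) for seed in range(10)]
print(f"replicate mean = {statistics.fmean(ys):.4f}, MSPE = {mspe(ys):.6f}")
```

If the MSPE is large relative to the effects being estimated, conclusions drawn from the regression meta-models are suspect, which is the chapter's central point.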

Methodology
2012
Vol 8 (3)
pp. 97-103
Author(s):
Constance A. Mara
Robert A. Cribbie
David B. Flora
Cathy LaBrish
Laura Mills
...  

Randomized pretest, posttest, follow-up (RPPF) designs are often used for evaluating the effectiveness of an intervention. These designs typically address two primary research questions: (1) Do the treatment and control groups differ in the amount of change from pretest to posttest? and (2) Do the treatment and control groups differ in the amount of change from posttest to follow-up? This study presents a model for answering these questions and compares it, using Monte Carlo simulation, to the models for analyzing RPPF designs recently proposed by Mun, von Eye, and White (2009). The proposed model provides increased power over the previous models for evaluating group differences in RPPF designs.
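
A minimal Python sketch of the kind of Monte Carlo power study described here, assuming a simple two-group design analyzed with an independent-samples t-test on pre-to-post change scores rather than the authors' actual model; the effect size, sample size, and simulation count are illustrative choices, not values from the study.

```python
import numpy as np
from scipy import stats

def simulate_rppf_power(n_per_group=50, effect=0.5, n_sims=2000,
                        alpha=0.05, seed=1):
    """Monte Carlo power estimate for research question (1):
    do treatment and control differ in pre-to-post change?"""
    rng = np.random.default_rng(seed)
    rejections = 0
    for _ in range(n_sims):
        # Common pretest distribution for both groups
        pre_t = rng.normal(0, 1, n_per_group)
        pre_c = rng.normal(0, 1, n_per_group)
        # Treatment group changes by `effect` more than control on average
        post_t = pre_t + rng.normal(effect, 1, n_per_group)
        post_c = pre_c + rng.normal(0.0, 1, n_per_group)
        # Independent-samples t-test on the change scores
        _, p = stats.ttest_ind(post_t - pre_t, post_c - pre_c)
        rejections += p < alpha
    return rejections / n_sims

print(f"estimated power: {simulate_rppf_power():.3f}")
```

Repeating such a simulation under each candidate analysis model, with the same generated data, is how power comparisons of the kind reported in the paper are made.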


2005
Vol 108 (3-4)
pp. 199-205
Author(s):
S. Karsten
G. Rave
J. Krieter

1988
Vol 55 (4)
pp. 911-917
Author(s):
L. G. Paparizos
W. D. Iwan

The nature of the response of strongly yielding systems subjected to random excitation is examined. Special attention is given to the drift response, defined as the sum of the yield increments associated with inelastic response. Based on the properties of discrete Markov process models of the yield increment process, it is suggested that for many cases of practical interest the drift can be considered a Brownian motion. The approximately Gaussian distribution and the linearly divergent mean square value of the process are obtained, as well as an expression for the probability distribution of the peak drift response. These properties are validated by means of a Monte Carlo simulation study.
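
To illustrate the claimed Brownian-motion behaviour, here is a short Python sketch (a deliberate simplification, not the authors' hysteretic model): it sums i.i.d. yield increments to form a random walk, the discrete analogue of Brownian motion, and checks that the mean square value of the drift grows linearly with the number of steps.

```python
import numpy as np

rng = np.random.default_rng(0)

# Model the drift as a sum of i.i.d. unit-variance yield increments
n_paths, n_steps = 5000, 1000
increments = rng.normal(0.0, 1.0, size=(n_paths, n_steps))
drift = np.cumsum(increments, axis=1)

# For Brownian motion, E[drift(k)^2] = k * Var(increment): the ratio
# of mean square to step count should stay roughly constant
for k in (100, 400, 1000):
    ms = np.mean(drift[:, k - 1] ** 2)
    print(f"steps={k:4d}  mean square={ms:8.1f}  ratio to k={ms / k:.3f}")
```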


2015
Vol 2 (1)
pp. 97
Author(s):
Robert Anderson
Zhou Wei
Ian Cox
Malcolm Moore
Florence Kussener

Design of Experiments (DoE) is widely used in design, manufacturing and quality management. The resulting data is usually analysed with multiple linear regression to generate polynomial equations that describe the relationship between process inputs and outputs. These equations enable us to understand how input values affect the predicted value of one or more outputs and to find good set points for the inputs. However, to develop robust manufacturing processes, we also need to understand how variation in these inputs appears as variation in the output. This understanding allows us to define set points and control tolerances for the inputs that will keep the outputs within their required specification windows. Tolerance analysis provides a powerful way of finding input settings and ranges that minimise output variation, producing a robust process. In many practical applications, tolerance analysis exploits Monte Carlo simulation of the polynomial model generated from the DoE. This paper briefly describes tolerance analysis and then shows how Monte Carlo simulation experiments using space-filling designs can be used to find the input settings that result in a robust process. Using this approach, engineers can quickly and easily identify the key inputs responsible for transferring undesired variation to their process outputs, and identify the set points and ranges that make their process as robust as possible. If the process is not sufficiently robust, they can rationally investigate different strategies to improve it. A case study approach is used to aid explanation and understanding.
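
A minimal Python sketch of Monte Carlo tolerance analysis as described above, assuming a hypothetical second-order polynomial fitted from a DoE; the coefficients, set points, and tolerances below are invented for illustration, not taken from the paper's case study.

```python
import numpy as np

rng = np.random.default_rng(42)

def output(x1, x2):
    """Hypothetical second-order polynomial from a DoE regression,
    relating two process inputs to one output."""
    return 50 + 3.0 * x1 - 2.0 * x2 + 1.5 * x1 * x2 - 0.8 * x1**2

def tolerance_analysis(set_point, tolerances, n=100_000):
    """Propagate input variation through the model: sample inputs
    around their set points and summarise the output spread."""
    x1 = rng.normal(set_point[0], tolerances[0] / 3, n)  # tol taken as 3 sigma
    x2 = rng.normal(set_point[1], tolerances[1] / 3, n)
    y = output(x1, x2)
    return y.mean(), y.std()

# Compare two candidate set points under the same input tolerances:
# the flatter region of the polynomial transmits less variation
for sp in [(1.0, 1.0), (0.2, 1.5)]:
    mu, sd = tolerance_analysis(sp, tolerances=(0.3, 0.3))
    print(f"set point {sp}: output mean={mu:.2f}, std={sd:.3f}")
```

Running such a simulation over a space-filling design of candidate set points, rather than two hand-picked ones, is the extension the paper develops.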


2020
Vol 98 (Supplement_4)
pp. 190-191
Author(s):
Jaelyn Whaley
Warrie Means
John Ritten
Tom Murphy
Cody Gifford
...  

Carcass characteristics and estimates of the economic impact of over-finished lambs on the processing sector were evaluated in two commercial Intermountain West abattoirs. Lamb carcasses were surveyed throughout the year using digital images and imaging software (n = 9,532). Estimates of abattoir costs and returns included loading labor, downtime cost, price of fat, and live and carcass trucking costs from the two largest lamb processors in the Intermountain West. Profitability comparisons were made using Monte Carlo simulation models that replicated live and carcass prices from distributions based on historical pricing data, assessing the overall profitability of a carcass in the ideal weight range (29.5 - 39.0 kg) versus a carcass that exceeds the ideal weight (> 39.0 kg). Overall means show that the average lamb carcass exceeded the packer-preferred hot carcass weight (40.76 ± 9.29 kg) and the industry-acceptable 12th-rib fat thickness (8.17 ± 3.79 mm). There were seasonal differences in hot carcass weight and fat measurements, with carcasses being lighter (P = 0.05) and trimmer (P = 0.05) in the summer months. The Monte Carlo simulation found that the additional yield from heavier carcasses offset the costs of harvesting them; however, factors such as machine wear and increased labor turnover rates, although difficult to quantify, should also be considered. Collectively, the current study shows that U.S. lamb carcasses are too heavy and excessively fat, but that this has only a minor effect on processor profitability.
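
The following Python sketch shows the general shape of such a profitability simulation; every price, weight range, and cost in it is a made-up placeholder, not a figure from the study, and the real models included many more cost terms (loading labor, downtime, trucking).

```python
import numpy as np

rng = np.random.default_rng(7)

# Placeholder price distribution standing in for one fitted to
# historical carcass pricing data
n = 50_000
price_per_kg = rng.normal(8.0, 1.2, n)       # $/kg carcass price draws

# Weight draws for the two carcass classes compared in the study
ideal_weight = rng.uniform(29.5, 39.0, n)    # kg, ideal range
heavy_weight = rng.uniform(39.0, 50.0, n)    # kg, over-finished

trim_cost_heavy = 6.0                        # assumed extra fat-trim cost, $
revenue_ideal = price_per_kg * ideal_weight
revenue_heavy = price_per_kg * heavy_weight - trim_cost_heavy

print(f"mean return, ideal-weight carcass: ${revenue_ideal.mean():.2f}")
print(f"mean return, heavy carcass:        ${revenue_heavy.mean():.2f}")
```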


2021
Vol 59 (12)
pp. 921-925
Author(s):
Jeongkwon Kwak
Boravy Muth
Hyeon-Woo Yang
Chang Je Park
Woo Seung Kang
...  

Radiation causes damage to the human body, the environment, and electronic equipment. Shielding against neutron and gamma rays is particularly difficult because of their strong ability to penetrate materials. Conventional gamma-ray shields are typically made of materials containing Pb. However, these pose problems in that Pb is a heavy metal, and human poisoning and/or pollution can result from the manufacture, use, and disposal of such materials. Neutron rays, in turn, are shielded by hydrogen-rich materials or by concrete, the latter of which is expensive to manufacture. Thus, it is necessary to develop a new multilayer structure that can shield against both neutron and gamma rays. We set up a simulation model of a multilayered structure consisting of metal hydrides and heavy metals, and then evaluated it using the Monte Carlo N-Particle (MCNP) transport code. Monte Carlo simulation is an accurate method for modelling the interaction between radiation and materials, and can be applied to the transport of radiation particles to predict quantities such as flux, energy spectrum, and energy deposition. The results indicated that a multilayer structure of ZrH2, U, and W could shield against both neutron and gamma rays, showing potential as a new shielding material to replace Pb and concrete.
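
As a much-simplified illustration of Monte Carlo radiation transport (not MCNP itself, which also models scattering, secondary particles, and energy deposition), the Python sketch below tracks uncollided photons through a three-layer shield using exponentially distributed free paths. The layer thicknesses and linear attenuation coefficients are assumed placeholder values, not measured data for ZrH2, U, or W.

```python
import numpy as np

rng = np.random.default_rng(3)

# (name, thickness in cm, linear attenuation coefficient mu in 1/cm)
# -- all values assumed for illustration
layers = [("ZrH2", 2.0, 0.10), ("U", 1.0, 1.40), ("W", 1.0, 1.00)]

n_photons = 100_000
transmitted = np.ones(n_photons, dtype=bool)
for name, thickness, mu in layers:
    # Distance to first interaction is exponential with mean 1/mu;
    # a photon passes the layer uncollided if that distance exceeds
    # the layer thickness
    path = rng.exponential(1.0 / mu, n_photons)
    transmitted &= path > thickness

print(f"uncollided transmission (Monte Carlo): {transmitted.mean():.4f}")
# Analytic check: exp(-sum(mu_i * t_i))
mu_t = sum(mu * t for _, t, mu in layers)
print(f"uncollided transmission (analytic):    {np.exp(-mu_t):.4f}")
```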

