Development of Statistical Tolerancing for Optical Products (2nd Report): Realization of Complex Tolerance Sensitivity Analysis System

1999 ◽  
Vol 65 (2) ◽  
pp. 279-284
Author(s):  
Toyoharu SASAKI ◽  
Masahiko SHINKAI ◽  
Kohichiro HIGASHIYAMA ◽  
Fumiki TANAKA ◽  
Takeshi KISHINAMI


2004 ◽  
Vol 17 (5) ◽  
pp. 1-8 ◽  
Author(s):  
Narcyz Ghinea ◽  
James M. van Gelder

Object The goal in this study was to develop an interactive, probabilistic decision-analysis system for clinical use in the decision to treat or observe unruptured intracranial aneurysms. Further goals were to enable users of the system to adapt decision-analysis methods to individual patients and to provide a tool for interactive sensitivity analysis. Methods A computer program was designed to model the outcomes of treatment and observation of unruptured aneurysms. The user supplies probabilistic estimates of key parameters relating to a specific patient and nominates discount rate and quality of life adjustments. The program uses Monte Carlo discrete-event simulation methods to derive probability estimates of the outcomes of treatment and observation. Results are expressed as summary statistics and graphs. Discounted quality-adjusted life years are graphed using survival methods. Hierarchical simulations are used to enable investigators to perform probabilistic sensitivity analysis for one or multiple parameters simultaneously. The results of sensitivity analysis are expressed in graphs and as the expected value of perfect information. The system can be distributed and updated using the Internet. Conclusions Further research is required into the benefits of clinical application of this system. Further research is also required into the optimum level of complexity of the model, into the user interface, and into how clinicians and patients are likely to interpret results. The system is easily adaptable to a range of medical decision analyses.
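The core of such a simulation can be sketched in a few lines. Everything below is a toy stand-in, not the authors' model: the beta distributions, horizon, and discount rate are invented placeholders for the patient-specific estimates a user of the system would supply.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000  # Monte Carlo samples

# Hypothetical probabilistic inputs (invented for illustration):
p_rupture = rng.beta(3, 97, n)     # chance of rupture over the horizon if observed
p_treat_harm = rng.beta(5, 95, n)  # chance of serious treatment complication
horizon, discount = 30.0, 0.03

def discounted_qalys(years, rate):
    # Present value of one quality-adjusted life year per year for `years` years.
    return (1 - (1 + rate) ** -years) / rate

base = discounted_qalys(horizon, discount)
qaly_observe = (1 - p_rupture) * base
qaly_treat = (1 - p_treat_harm) * base

# Expected value of perfect information: the mean gain from resolving
# uncertainty before choosing, versus choosing under current uncertainty.
evpi = np.maximum(qaly_observe, qaly_treat).mean() - max(
    qaly_observe.mean(), qaly_treat.mean())
```

The same structure extends to hierarchical (two-level) simulation by nesting an inner loop over the remaining uncertain parameters for each sampled value of the parameter under study.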


Agriculture ◽  
2019 ◽  
Vol 9 (7) ◽  
pp. 159
Author(s):  
Braelyn Moltz ◽  
Mark Yu ◽  
Edward Osei ◽  
W. Brandon Smith ◽  
Brant Poe

Placing cattle on feed to maximize the amount of meat produced before slaughter has become a major agricultural industry. The optimization of input quantities, especially corn, is crucial to maximizing production efficiency and ultimately profit. The objective of this research is to determine the optimal corn grain production rate for cattle on feed in Texas and to estimate profit maximization under various price ratios for corn grain and live cattle. Input production levels and prices were collected from various United States Department of Agriculture (USDA) sources. Statistical Analysis System (SAS) procedures were used to estimate the different production functions. Sensitivity analyses were performed for the optimal corn grain production rate, and the consequent profit, under various combinations of corn and live cattle prices for the four different functions. Additionally, a continuous-form curve for optimal corn grain production rates under various price ratios was developed. Results indicated that the cubic model was the most accurate based upon the R2 value. However, the continuous-form model created for the sensitivity analysis indicated that the quadratic was the most accurate model under the different price ratios. The results of the study can be a useful tool in the decision-making process for producers and policymakers.
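The quadratic case lends itself to a closed-form check: setting the marginal product equal to the corn-to-cattle price ratio gives the optimal rate directly. The coefficients below are invented for illustration, not the paper's estimates.

```python
# Hypothetical quadratic response: gain = a + b*corn + c*corn^2,
# with diminishing returns (c < 0). Coefficients are made up.
a, b, c = 50.0, 1.8, -0.002

def optimal_corn(price_ratio):
    """Profit-maximizing corn rate where marginal product equals the
    corn-to-cattle price ratio: b + 2c*x = r  =>  x = (r - b) / (2c)."""
    return (price_ratio - b) / (2 * c)

# Sensitivity of the optimum across corn/cattle price ratios:
# a higher corn price relative to cattle lowers the optimal feeding rate.
optima = {r: optimal_corn(r) for r in (0.04, 0.06, 0.08)}
```

Sweeping `price_ratio` over a grid of this kind is what produces the continuous-form curve described in the abstract.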


2004 ◽  
Vol 19 (1) ◽  
pp. 5-12
Author(s):  
Ervin G. Schuster ◽  
Michael A. Krebs

Abstract A sensitivity analysis was conducted of the National Fire Management Analysis System (NFMAS) to better understand the relationship between data input and model outcomes, as reflected by changes in C+NVC and MEL program options. Five input variables were selected for sensitization: Unit Mission Costs, Average Acre Costs, Net Value Change, Production Rates, and Escaped Fire Limits. A stratified random sample of 32 national forests was selected, according to the distribution of national forests within Forest Service regions and fire frequency classes, on the basis of historical fire data. NFMAS database tables were obtained and manipulated, with each variable increased and decreased at six levels (±25, ±50, and ±100%). Results indicated that Production Rates was always the most influential variable, Unit Mission Costs was always least influential, and the influence of the other variables depends on the choice of model outcome. In general, larger input changes resulted in greater changes in model outcome, but no consistent pattern of influence could be found regarding program option. West. J. Appl. For. 19(1):5–12.
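The perturbation scheme is easy to sketch: each input variable is scaled up and down at six levels and the outcome recomputed, holding the others at their base values. The `model` function, variable names, and weights below are invented stand-ins, not NFMAS.

```python
# Six perturbation levels: ±25, ±50, and ±100 percent.
levels = (-1.0, -0.5, -0.25, 0.25, 0.5, 1.0)

def model(inputs):
    # Stand-in outcome (a simple weighted sum), not the NFMAS model.
    return (3.0 * inputs["production_rate"]
            + inputs["unit_mission_cost"]
            + inputs["avg_acre_cost"])

base = {"production_rate": 4.0, "unit_mission_cost": 2.0, "avg_acre_cost": 1.5}

influence = {}
for var in base:
    deltas = []
    for lvl in levels:
        perturbed = dict(base, **{var: base[var] * (1 + lvl)})
        deltas.append(abs(model(perturbed) - model(base)))
    influence[var] = max(deltas)  # largest outcome swing for this variable alone

# The variable with the largest swing is ranked most influential.
most_influential = max(influence, key=influence.get)
```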


2002 ◽  
Vol 124 (2) ◽  
pp. 286-295 ◽  
Author(s):  
Shih-Ming Wang ◽  
Kornel F. Ehmann

Precision machining operations necessitate highly accurate, rigid, and stable machine-tool structures. In response to this need, parallel-architecture machines based on the concepts of the Stewart Platform are emerging. In this paper, considering major inaccuracy factors related to the manufacture, geometry, and kinematics of such machines, first- and second-order error models are presented, followed by a comparative assessment of these models in conjunction with illustrative examples. Furthermore, in order to understand the character and propagation of errors in 6-DOF Stewart Platform-based machine tools, sensitivity analysis is adopted to describe the contribution of each error component to the total position and orientation error of the mechanism. An automated error analysis system that computes and graphically depicts the error distributions throughout the workspace, along with the results of sensitivity analysis, is developed and demonstrated.
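To first order, a sensitivity analysis of this kind reduces to examining the Jacobian that maps individual error components to pose error: pose error ≈ J·e. The matrix below is random, standing in for a real kinematic model; the column norms rank the components by influence.

```python
import numpy as np

# Hypothetical first-order error model: pose_error ≈ J @ e, where e stacks
# the individual strut/joint error components. J here is random, a stand-in
# for the Jacobian derived from an actual Stewart Platform's kinematics.
rng = np.random.default_rng(1)
J = rng.normal(size=(6, 6))

# Sensitivity of each error component = norm of its Jacobian column:
# how strongly a unit error in that component moves the tool pose.
sensitivity = np.linalg.norm(J, axis=0)
ranking = np.argsort(sensitivity)[::-1]  # most influential component first
```

Evaluating J (and hence the ranking) at sampled poses across the workspace is what produces the kind of workspace-wide error-distribution maps the abstract describes.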


Author(s):  
Adel W. Sadek ◽  
Bernard Baah

To help transportation planners, the Federal Highway Administration (FHWA) has recently sponsored the development of a sketch planning analysis tool for estimating the benefits and costs of intelligent transportation system (ITS) deployment. Called the ITS Deployment Analysis System (IDAS), the tool was developed by Cambridge Systematics for FHWA and operates as a postprocessor to traditional planning models that are based on the four-step planning process. IDAS was used to predict the likely benefits of deploying ITS in Chittenden County, Vermont, a medium-sized region in northwestern Vermont, to gain insight into IDAS’ applicability in evaluating ITS benefits. A sensitivity analysis was also conducted to assess the sensitivity of IDAS’ results to the choice of values of some of the model’s parameters. The sensitivity analysis demonstrates that some of the model’s parameters tend to have a more significant impact on the results than others. Finally, the results indicate that, for medium-sized areas similar to Chittenden County, coordination of signals along major arterials and transit projects for automatic vehicle location and scheduling appear to be more cost-effective than freeway and incident management systems.
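A one-way sensitivity check on a benefit-cost ratio, in the spirit of varying IDAS parameter values, can be sketched as follows. All figures and the parameter range are invented, not IDAS defaults.

```python
def benefit_cost(value_of_time, annual_hours_saved, annual_cost):
    # Ratio of monetized travel-time savings to annualized deployment cost.
    return value_of_time * annual_hours_saved / annual_cost

# One-way sensitivity over the value-of-time parameter ($/hour, invented range)
# for a hypothetical deployment saving 50,000 vehicle-hours per year.
ratios = {vot: benefit_cost(vot, 50_000, 600_000) for vot in (10.0, 15.0, 20.0)}
```

A parameter whose plausible range moves the ratio across 1.0, or reorders competing projects, is one whose value materially affects the deployment decision.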


Author(s):  
S.F. Stinson ◽  
J.C. Lilga ◽  
M.B. Sporn

Increased nuclear size, resulting in an increase in the relative proportion of nuclear to cytoplasmic sizes, is an important morphologic criterion for the evaluation of neoplastic and pre-neoplastic cells. This paper describes investigations into the suitability of automated image analysis for quantitating changes in nuclear and cytoplasmic cross-sectional areas in exfoliated cells from tracheas treated with carcinogen. Neoplastic and pre-neoplastic lesions were induced in the tracheas of Syrian hamsters with the carcinogen N-methyl-N-nitrosourea. Cytology samples were collected intra-tracheally with a specially designed catheter (1) and stained by a modified Papanicolaou technique. Three cytology specimens were selected from animals with normal tracheas, 3 from animals with dysplastic changes, and 3 from animals with epidermoid carcinoma. One hundred randomly selected cells on each slide were analyzed with a Bausch and Lomb Pattern Analysis System automated image analyzer.
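The measured criterion reduces to a per-cell nuclear-to-cytoplasmic area ratio computed from the segmented cross-sectional areas. The areas below are invented examples, not the study's data.

```python
# Per-cell segmented cross-sectional areas (arbitrary units, invented values).
cells = [
    {"nuclear_area": 48.0, "cyto_area": 310.0},  # normal-looking cell
    {"nuclear_area": 95.0, "cyto_area": 240.0},  # enlarged nucleus
]

def nc_ratio(cell):
    # Nuclear-to-cytoplasmic area ratio, the morphologic criterion above:
    # an enlarged nucleus in a smaller cytoplasm drives the ratio up.
    return cell["nuclear_area"] / cell["cyto_area"]

nc_ratios = [nc_ratio(c) for c in cells]
```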


Author(s):  
A. V. Crewe ◽  
M. Ohtsuki

We have assembled an image processing system for use with our high resolution STEM for the particular purpose of working with low dose images of biological specimens. The system is quite flexible, however, and can be used for a wide variety of images. The original images are stored on magnetic tape at the microscope using the digitized signals from the detectors. For low dose imaging, these are “first scan” exposures using an automatic montage system. One Nova minicomputer and one tape drive are dedicated to this task. The principal component of the image analysis system is a Lexidata 3400 frame store memory. This memory is arranged in a 640 x 512 x 16 bit configuration. Images are displayed simultaneously on two high resolution monitors, one color and one black and white. Interaction with the memory is obtained using a Nova 4 (32K) computer and a trackball and switch unit provided by Lexidata. The language used is BASIC and uses a variety of assembly-language calls, some provided by Lexidata, but the majority written by students (D. Kopf and N. Townes).


Author(s):  
D.S. DeMiglio

Much progress has been made in recent years towards the development of closed-loop foundry sand reclamation systems. However, virtually all work to date has determined the effectiveness of these systems to remove surface clay and metal oxide scales by a qualitative inspection of a representative sampling of sand particles. In this investigation, particles from a series of foundry sands were sized and chemically classified by a Lemont image analysis system (which was interfaced with an SEM and an X-ray energy dispersive spectrometer) in order to statistically document the effectiveness of a reclamation system developed by The Pangborn Company, a subsidiary of SOHIO. The following samples were submitted: unreclaimed sand; calcined sand; calcined and mechanically scrubbed sand; and unused sand. Prior to analysis, each sample was sprinkled onto a carbon mount and coated with an evaporated film of carbon. A backscattered electron photomicrograph of a field of scale-covered particles is shown in Figure 1. Due to a large atomic number difference between sand particles and the carbon mount, the backscattered electron signal was used for image analysis since it had a uniform contrast over the shape of each particle.


Author(s):  
W. O. Saxton

Recent commercial microscopes with internal microprocessor control of all major functions have already demonstrated some of the benefits anticipated from such systems, such as continuous magnification, rotation-free diffraction and magnification, automatic recording of mutually registered focal series, and fewer control knobs. Complete automation of the focusing, stigmating and alignment of a high resolution microscope, allowing focal series to be recorded at preselected focus values as well, is still imminent rather than accomplished, however; some kind of image pick-up and analysis system, fed with the electron image via a TV camera, is clearly essential for this, but several alternative systems and algorithms are still being explored. This paper reviews the options critically in turn, and stresses the need to consider alignment and focusing at an early stage, and not merely as an optional extension to a basic proposal.

