Standardization of CMM Fitting Algorithms and Development of Inspection Maps for Use in Statistical Process Control

Author(s):  
Neelakantan Mani
Jami J. Shah
Joseph K. Davidson

The choice of fitting algorithm in CMM metrology has often been based on mathematical convenience rather than on the fundamental GD&T principles dictated by the ASME Y14.5 standard. Algorithms based on the least-squares technique are most often used for GD&T inspection, and this inappropriate choice of fitting algorithm introduces errors that are often overlooked and leads to deficiencies in the inspection process. Efforts by organizations such as NIST and NPL, and by many other researchers, to evaluate commercial CMM software have been concerned with the mathematical correctness of the algorithms and with developing efficient and intelligent methods to overcome the inherent difficulties associated with their mathematics. None of these works evaluates the ramifications of choosing a particular fitting algorithm for a particular tolerance type. To illustrate the errors that can arise from a wrong choice of fitting algorithm, a case study was conducted on a simple prismatic part with intentional variations, and the algorithms employed in the software were reverse engineered. Based on the results of the experiments, a standardization of fitting algorithms is proposed in light of the definitions provided in the standard and an interpretation of manual inspection methods. The standardized fitting algorithms developed for substitute feature fitting are then used to develop inspection maps (i-Maps) for the size, orientation, and form tolerances that apply to planar feature types. A methodology for statistical process control (SPC) using these i-Maps is developed by fitting the i-Maps for a batch of parts into the parent Tolerance Maps (T-Maps). Different methods of computing the i-Map for a batch are explored, such as using the mean and standard deviations, computing the convex hull, and performing a principal component analysis of the distribution of the individual parts. The control limits for the process, together with SPC and process capability metrics, are computed from inspection samples and the resulting i-Maps. A framework for statistical control of the manufacturing process is thus developed.
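As a minimal illustration of why the choice of fitting algorithm matters, the sketch below fits a plane to simulated CMM points both by least squares and by a minimum-zone criterion and compares the resulting flatness zones. The data and the optimization approach are assumptions for illustration only; they do not reproduce the paper's standardized algorithms or its i-Map construction.

```python
# Sketch: least-squares vs. minimum-zone plane fitting for a planar CMM feature.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
# Hypothetical CMM points on a nominally flat face (small form error + probe noise)
x, y = np.meshgrid(np.linspace(0, 50, 8), np.linspace(0, 30, 6))
pts = np.column_stack([x.ravel(), y.ravel(),
                       0.002 * np.sin(x.ravel() / 10) + rng.normal(0, 0.001, x.size)])

def residuals(params, pts):
    """Signed orthogonal distances from points to the plane z = a*x + b*y + c."""
    a, b, c = params
    return (pts[:, 2] - (a * pts[:, 0] + b * pts[:, 1] + c)) / np.sqrt(a**2 + b**2 + 1)

# 1) Least-squares fit (minimizes the sum of squared residuals), common in CMM software
A = np.column_stack([pts[:, 0], pts[:, 1], np.ones(len(pts))])
ls_params, *_ = np.linalg.lstsq(A, pts[:, 2], rcond=None)
ls_zone = np.ptp(residuals(ls_params, pts))      # peak-to-valley zone width

# 2) Minimum-zone fit (minimizes the peak-to-valley width), closer to the Y14.5 intent for form
mz = minimize(lambda p: np.ptp(residuals(p, pts)), ls_params, method="Nelder-Mead")
mz_zone = np.ptp(residuals(mz.x, pts))

print(f"least-squares flatness zone: {ls_zone:.6f}")
print(f"minimum-zone flatness zone:  {mz_zone:.6f}  (no larger than the LS zone)")
```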

Author(s):  
William E. Odinikuku
Jephtah A. Ikimi
Ikechukwu P. Onwuamaeze

In many countries, manpower problems in the field of health care are regular items on the agenda of policy makers. To avoid mismatches between the demand for care and the supply of care at national and regional levels, manpower planning models and methods are used to determine adequate numbers of medical specialists to meet the future demand for care. Inadequate or inefficient allocation of manpower to the various departments of an organization or workplace can lead to undesired outcomes, including downtime, reduced productivity, worker fatigue, and increased production costs. As a result of this problem, there is a need to devise a statistical model that ensures optimal allocation of manpower. In this study, the optimum allocation of two hundred and fifty-two general nurses to fifteen wards at a hospital code-named WCH, located in the South-South geopolitical zone of Nigeria, was achieved using statistical process control. The study involved the analysis of data obtained from the hospital over a period of two months. A c-chart was used to check whether the allocation process was in control. The results showed that the manpower allocation process was out of statistical control, as the allocation for the children's emergency ward fell outside the upper control limit of the c-chart.
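A minimal sketch of the c-chart computation is given below, using made-up allocation counts rather than the WCH data; the centre line is the mean count and the limits lie three Poisson standard deviations away from it.

```python
# c-chart sketch with hypothetical nurse-allocation counts for 15 wards.
import numpy as np

counts = np.array([14, 17, 15, 16, 18, 13, 16, 15, 17, 14, 16, 15, 30, 16, 15])

c_bar = counts.mean()                      # centre line
ucl = c_bar + 3 * np.sqrt(c_bar)           # upper control limit
lcl = max(c_bar - 3 * np.sqrt(c_bar), 0)   # lower control limit, floored at zero

out_of_control = np.where((counts > ucl) | (counts < lcl))[0]
print(f"CL={c_bar:.2f}, UCL={ucl:.2f}, LCL={lcl:.2f}")
print("wards out of control (0-based index):", out_of_control)
```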


2008, Vol. 3 (1)
Author(s):  
Mauricio L. Maestri
Miryan C. Cassanello
Gabriel I. Horowitz

The outputs of statistical process control (SPC) tools developed for fault detection are comparatively examined when applied to actual data collected in an industrial plant. The influence of added information gathered from plant operation under different strategies is analyzed. In particular, standard principal component analysis (PCA), kernel PCA, and Hotelling's T² charts are examined for a reported problem. The effect of training the tools either with an extended historic databank obtained under standard operation, or with one that also includes non-conventional conditions, is studied. The ability of the tools to provide a specific alarm and to identify the responsible variable is examined by analyzing the per-variable contributions to the SPE and T² statistics. In addition, the capacity of the tested tools to adapt to a new operation strategy is compared.
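The sketch below illustrates, on synthetic data, how a PCA model trained on normal operation yields Hotelling's T² and SPE statistics together with per-variable contributions; the variables, number of retained components, and injected fault are assumptions, not the plant data examined in the study.

```python
# Sketch of PCA-based monitoring with T^2, SPE, and per-variable contributions.
import numpy as np

rng = np.random.default_rng(1)
X_train = rng.normal(size=(500, 6))                 # hypothetical "normal operation" data
mu, sigma = X_train.mean(0), X_train.std(0)
Z = (X_train - mu) / sigma

# PCA via SVD of the standardized training data
U, S, Vt = np.linalg.svd(Z, full_matrices=False)
k = 3                                               # retained principal components
P = Vt[:k].T                                        # loadings (6 x k)
lam = (S[:k] ** 2) / (len(Z) - 1)                   # variances of the retained scores

def monitor(x):
    """Return T^2, SPE, and per-variable SPE contributions for one observation."""
    z = (x - mu) / sigma
    t = P.T @ z                                     # scores in the PCA subspace
    T2 = np.sum(t**2 / lam)                         # Hotelling's T^2
    resid = z - P @ t                               # residual (off-subspace) part
    SPE = np.sum(resid**2)                          # squared prediction error (Q statistic)
    return T2, SPE, resid**2                        # contributions per variable

# Hypothetical fault: a step change in variable 4
x_fault = rng.normal(size=6)
x_fault[4] += 5.0
T2, SPE, contrib = monitor(x_fault)
print(f"T2={T2:.1f}, SPE={SPE:.1f}, largest SPE contribution from variable {contrib.argmax()}")
```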


2016, Vol. 15 (3), pp. 582
Author(s):  
ANTONIO TASSIO SANTANA ORMOND
MURILO APARECIDO VOLTARELLI
CARLA SEGATTO STRINI PAIXÃO
ALINE SPAGGIARI ALCÂNTARA
ELIZABETH HARUNA KAZAMA
...  

QUALITY IN MECHANIZED HARVEST OF CORN SOWN AT DIFFERENT SPEEDS

ABSTRACT - Harvest losses may be associated with the harvester as well as with factors related to the crop, such as poor soil preparation, plant density, and an unsuitable sowing time. This study aimed to determine the influence of sowing speed on the mechanized corn harvest by means of quality control of the process. The experiment was conducted on a clayey Oxisol (Latossolo Vermelho) with gently undulating relief. The design was based on Statistical Process Control (SPC), with data collected at random points as a function of time. The quality indicators evaluated were divided into sowing parameters (plant population and longitudinal seedling distribution) and harvest parameters (grain losses and straw distribution), as a function of six travel speeds (approximately 2.0, 4.0, 6.0, 9.0, 10.0 and 12.0 km/h). The data were submitted to descriptive analysis of their behavior. Run charts (sequential plots) and control charts of individual values were used as SPC tools to analyze process quality. The highest speed (V6) showed the greatest variability in the data for all variables. The mechanized corn harvest operation was influenced by factors both extrinsic and intrinsic to it.

Keywords: statistical process control, normal spacings, losses, plant population.
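A minimal sketch of the individual-values control chart used as an SPC tool here is given below, computed on made-up grain-loss readings with moving-range limits; none of the experimental data are reproduced.

```python
# Individual-values (I) chart sketch with moving-range limits on hypothetical grain losses.
import numpy as np

losses = np.array([1.2, 1.4, 1.1, 1.5, 1.3, 1.6, 1.2, 4.5, 1.4, 1.3])  # hypothetical kg/ha

x_bar = losses.mean()                  # centre line
mr = np.abs(np.diff(losses))           # moving ranges of consecutive readings
mr_bar = mr.mean()
d2 = 1.128                             # control-chart constant for subgroups of size 2
ucl = x_bar + 3 * mr_bar / d2
lcl = x_bar - 3 * mr_bar / d2

print(f"CL={x_bar:.2f}, UCL={ucl:.2f}, LCL={lcl:.2f}")
print("points beyond the limits:", np.where((losses > ucl) | (losses < lcl))[0])
```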


2014, Vol. 615, pp. 118-123
Author(s):  
Joaquín Sancho
Jorge Pastor
Javier Martínez
Miguel Angel García

Functional data appear in a multitude of industrial applications and processes. In many cases, however, such data are still studied from the conventional standpoint based on statistical process control (SPC), losing the capacity to analyze how different aspects evolve over time. This study presents a statistical process control approach based on functional data analysis to identify outliers, or special causes of variability, in the harmonics appearing in power systems, which can negatively affect the quality of the electricity supply. The results obtained with the functional approach are compared with those obtained beforehand with conventional statistical process control.
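As a hedged illustration of treating each harmonic profile as a curve rather than as isolated points, the sketch below screens synthetic curves for outliers with a simple Fraiman-Muniz-style depth; it is not the specific functional method applied in the study.

```python
# Sketch of a functional outlier screen for harmonic curves via pointwise depth.
import numpy as np

rng = np.random.default_rng(2)
t = np.linspace(0, 24, 96)                                  # hypothetical hourly grid (h)
curves = 1 + 0.2 * np.sin(2 * np.pi * t / 24) + rng.normal(0, 0.05, (40, t.size))
curves[7] += 0.6                                            # injected anomalous curve (e.g., THD %)

n = len(curves)
ranks = curves.argsort(axis=0).argsort(axis=0) + 1          # pointwise ranks (1..n)
pointwise_depth = 1 - np.abs(0.5 - ranks / n)               # deepest when the curve is central
depth = pointwise_depth.mean(axis=1)                        # integrate depth over the grid

threshold = np.quantile(depth, 0.05)                        # flag the shallowest 5% of curves
print("flagged curves:", np.where(depth <= threshold)[0])
```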


2020, Vol. 2020, pp. 1-11
Author(s):  
Yichen Wang
Hong Zheng
Xinyue Lu

Metro construction is normally carried out in complex engineering and geological environments and can therefore generate various risk events. In the process of metro construction, a scientific dynamic risk analysis is indispensable for reducing and controlling risks. To analyze the risk in metro construction more scientifically and reasonably, this study proposes a new dynamic risk analysis method for metro construction based on statistical process control. The method evaluates the risk level from the process capability index and identifies the characteristics of risk variation from the statistical control chart. The risk level and the characteristics of the risks may vary as the monitoring data are dynamically updated, so a risk evaluation can be drawn for each time interval and corresponding safety measures can be ascertained. Because the method introduces statistical process control, the random factors in risk evolution can be considered fully. The method is then applied to the risk analysis of shield construction passing under the Beijing-Tianjin intercity railway on Beijing Metro Line 8, a typical risk problem in transportation construction. The variation of the risk level and of the risk characteristics can be evaluated reasonably because the dynamic randomness is taken into account. Moreover, whether risk control measures should be taken, and which measures are effective, can be ascertained explicitly.
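The sketch below illustrates the general idea of grading a risk level from a process capability index computed on monitoring data; the readings, control standard, and grading thresholds are assumed for illustration and are not taken from the Beijing Metro Line 8 case.

```python
# Sketch: risk level graded from a capability index on hypothetical settlement readings.
import numpy as np

settlement = np.array([8.1, 8.4, 7.9, 8.6, 8.2, 8.8, 8.3, 8.5])  # hypothetical readings (mm)
usl, lsl = 10.0, 0.0                                              # assumed control standard (mm)

mu, sigma = settlement.mean(), settlement.std(ddof=1)
cpk = min(usl - mu, mu - lsl) / (3 * sigma)                       # worst-side capability index

# Assumed illustrative grading of the risk level from the capability index
if cpk >= 1.33:
    level = "low risk"
elif cpk >= 1.0:
    level = "moderate risk - strengthen monitoring"
else:
    level = "high risk - take control measures"
print(f"Cpk = {cpk:.2f} -> {level}")
```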


Author(s):  
Dereje Girma
Omprakash Sahu

Identifying the presence and understanding the causes of process variability are key requirements for well-controlled, high-quality manufacturing. This pilot study demonstrates the introduction of statistical process control (SPC) methods to the spinning department of a textile manufacturing company. The methods employed included X-bar and R control charts as well as process capability analysis. Investigation of 29 machine processes identified that none were in statistical control. Recommendations have been made for a repeat of the study using validated data, together with the practical application of SPC and control charts on the shop floor and their extension to all processes within the factory.
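A minimal sketch of the X-bar and R chart computation on subgrouped data is given below; the subgroup size, readings, and target are assumptions, not the spinning-department measurements from the study.

```python
# X-bar and R chart sketch for subgrouped measurements (e.g., yarn count).
import numpy as np

rng = np.random.default_rng(3)
subgroups = rng.normal(29.5, 0.4, size=(25, 5))   # 25 subgroups of 5 hypothetical readings

xbar = subgroups.mean(axis=1)
R = subgroups.max(axis=1) - subgroups.min(axis=1)
xbar_bar, r_bar = xbar.mean(), R.mean()

A2, D3, D4 = 0.577, 0.0, 2.114                    # Shewhart constants for subgroup size n = 5
x_ucl, x_lcl = xbar_bar + A2 * r_bar, xbar_bar - A2 * r_bar
r_ucl, r_lcl = D4 * r_bar, D3 * r_bar

print(f"X-bar chart: CL={xbar_bar:.2f}, LCL={x_lcl:.2f}, UCL={x_ucl:.2f}")
print(f"R chart:     CL={r_bar:.2f}, LCL={r_lcl:.2f}, UCL={r_ucl:.2f}")
print("subgroups out of control:", np.where((xbar < x_lcl) | (xbar > x_ucl))[0])
```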


Author(s):  
Carrison K.S. Tong
Eric T.T. Wong

The present study advocates the application of statistical process control (SPC) as a performance monitoring tool for a PACS. The objective of SPC differs significantly from the traditional QC/QA process. In the traditional process, QC/QA tests are used to generate a datum point, and this datum point is compared to a standard. If the point is out of specification, action is taken on the product and may also be taken on the process. To move from the traditional QC/QA process to SPC, a process control plan should be developed, implemented, and followed. Implementing SPC in the PACS environment need not be a complex process. However, if the maximum effect is to be achieved and sustained, PACS SPC must be implemented in a systematic manner with the active involvement of all employees, from line associates to executive management.

SPC involves the use of mathematics, graphics, and statistical techniques, such as control charts, to analyze the PACS process and its output, so that appropriate actions can be taken to achieve and maintain a state of statistical control. While SPC is extensively used in the healthcare industry, especially in patient monitoring, it is rarely applied in the PACS environment. One may refer to a recent SPC application that Mercy Hospital (Alegent Health System) initiated after it implemented a PACS in November 2003 (Stockman & Krishnan, 2006). The anticipated benefits of applying SPC to a PACS include:

• Reduced image retakes and diagnostic expenditure associated with better process control.
• Reduced operating costs through optimized maintenance and replacement of PACS equipment components.
• Increased productivity through identification and elimination of variation and out-of-control conditions in the imaging and retrieval processes.
• An enhanced level of quality through controlled applications.

SPC involves using statistical techniques to measure and analyze the variation in processes. Most often used for manufacturing processes, the intent of SPC is to monitor product quality and maintain processes at fixed targets. Hence, besides the HSSH techniques, the proposed TQM approach would include the use of SPC. Although SPC will not improve the reliability of a poorly designed PACS, it can be used to maintain the consistency of how each individual process is provided and, therefore, of the entire PACS process.

A primary tool used for SPC is the control chart, a graphical representation of certain descriptive statistics for specific quantitative measurements of the PACS process. These descriptive statistics are displayed in the control chart in comparison to their "in-control" sampling distributions. The comparison detects any unusual variation in the PACS delivery process, which could indicate a problem with the process. Several different descriptive statistics can be used in control charts, and there are several types of control chart that test for different causes, such as how quickly major versus minor shifts in process means are detected. These control charts are also used with service-level measurements to analyze process capability and for continuous process improvement efforts.
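As an illustration of a chart type geared toward detecting small, sustained shifts in a process mean, the sketch below applies an EWMA chart to hypothetical PACS image-retrieval times; the data, in-control parameters, and smoothing constant are assumed and are not drawn from the study.

```python
# EWMA chart sketch on hypothetical PACS image-retrieval times (seconds).
import numpy as np

rng = np.random.default_rng(4)
retrieval_s = np.concatenate([rng.normal(3.0, 0.3, 30),     # in-control period
                              rng.normal(3.4, 0.3, 20)])    # small sustained shift

lam = 0.2                                # EWMA smoothing constant
mu0, sigma = 3.0, 0.3                    # assumed in-control mean and standard deviation

z = np.zeros_like(retrieval_s)
z[0] = lam * retrieval_s[0] + (1 - lam) * mu0
for i in range(1, len(retrieval_s)):
    z[i] = lam * retrieval_s[i] + (1 - lam) * z[i - 1]

idx = np.arange(1, len(z) + 1)
width = 3 * sigma * np.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * idx)))
ucl, lcl = mu0 + width, mu0 - width

signals = np.where((z > ucl) | (z < lcl))[0]
print("first sample signalling a shift:", signals[0] if signals.size else "none")
```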

