Economic-statistical design of variable parameters non-central chi-square control chart

Production ◽  
2011 ◽  
Vol 21 (2) ◽  
pp. 259-270 ◽  
Author(s):  
Maysa Sacramento de Magalhães ◽  
Francisco Duarte Moura Neto

Production processes have been monitored by control charts since their introduction by Shewhart (1924). This surveillance helps improve the production process by stabilizing it and, consequently, standardizing its output. Control charts track a few key quality characteristics of the output of the production process, by means of univariate or multivariate charts. Small improvements in control chart methodology can have a significant economic impact on the production process. In this investigation, we propose monitoring a single variable by means of a variable parameter non-central chi-square control chart. The chart is designed by optimizing a cost function. We use a simulated annealing optimization tool, because classical gradient-based optimization techniques have difficulty handling the optimization of this cost function. The results show some of the drawbacks of using this model.
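
As a concrete illustration of the optimization step only, the sketch below applies simulated annealing to a generic chart-design cost over the sample size n, sampling interval h and control-limit width k. The cost function `chart_cost`, its coefficients and the search bounds are illustrative placeholders, not the non-central chi-square cost model of the paper.

```python
import math
import random

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def chart_cost(n, h, k):
    """Placeholder hourly cost of a chart design (not the paper's model)."""
    sampling_cost = (0.5 + 0.1 * n) / h              # cost of sampling per hour
    false_alarm_rate = 2.0 * (1.0 - phi(k)) / h      # false signals per hour
    delta = 1.0                                      # assumed shift size (sigma units)
    power = 1.0 - phi(k - delta * math.sqrt(n))      # one-sided detection power
    detection_delay = h / max(power, 1e-6)           # rough time to signal after a shift
    return sampling_cost + 50.0 * false_alarm_rate + 10.0 * detection_delay

def simulated_annealing(cost, x0, bounds, iters=5000, t0=10.0):
    """Minimize `cost` over (n, h, k) with a geometric cooling schedule."""
    x = list(x0)
    best, fx = list(x0), cost(*x0)
    fbest = fx
    for i in range(iters):
        temp = t0 * 0.999 ** i
        cand = [x[0] + random.choice([-1, 0, 1]),     # integer sample size
                x[1] + random.uniform(-0.2, 0.2),      # sampling interval (hours)
                x[2] + random.uniform(-0.1, 0.1)]      # control-limit width
        cand = [min(max(c, lo), hi) for c, (lo, hi) in zip(cand, bounds)]
        cand[0] = round(cand[0])
        fc = cost(*cand)
        # accept improvements always, worse moves with Metropolis probability
        if fc < fx or random.random() < math.exp((fx - fc) / max(temp, 1e-9)):
            x, fx = cand, fc
            if fc < fbest:
                best, fbest = list(cand), fc
    return best, fbest

best, val = simulated_annealing(chart_cost, x0=(5, 1.0, 3.0),
                                bounds=[(2, 30), (0.25, 8.0), (2.0, 4.0)])
print("n, h, k =", best, "cost =", round(val, 3))
```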

2018 ◽  
Vol 7 (4) ◽  
pp. 385-396
Author(s):  
Dwi Harti Pujiana ◽  
Mustafid Mustafid ◽  
Di Asih I Maruddani

Denim fabric sort number 78032 is one of the fabric types that PT Apac Inti Corpora has produced almost every month over the last four years. During the denim fabric production process, defect (non-conformity) data arise that reduce the quality of the denim fabric. To keep the quality of the products consistent with the specified specifications, it is necessary to control the quality of the production process that has been running. The multivariate attribute control charts used are the multivariate np chart, which uses the sample size and the proportion of defects for correlated variables, and the chi-square distance control chart, which uses squared distances for uncorrelated variables. The results show that on the multivariate np control chart there were 2 out-of-control observations in the phase II data when using control limits from the already in-control phase I data, with an upper control limit (BKA) of 636321.4, while on the chi-square distance control chart all observations were in control, with a BKA value of 0.06536. For the controlled production process, the multivariate process capability value for the multivariate np chart is 0.625142 < 1, which means the process is not capable, whereas the process capability value for the chi-square distance control chart is 1.1329 > 1, which means the process is capable. Keywords: denim fabric, multivariate np control chart, chi-square distance control chart, multivariate process capability
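
A minimal sketch of a chi-square-distance type statistic of the kind described above, assuming a few defect-count variables per sample and a phase-I baseline; the simulated data and the empirical control limit below are illustrative and are not the company data or the BKA values reported in the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative phase-I data: counts of 3 defect types in 30 samples of 1000 units.
n_units, p = 1000, 3
true_rates = np.array([0.010, 0.006, 0.004])
phase1 = rng.binomial(n_units, true_rates, size=(30, p))

# Baseline (center) defect counts from phase I.
center = phase1.mean(axis=0)

def chi_square_distance(counts, center):
    """Squared distance weighted by the baseline counts (assumes weak correlation)."""
    return np.sum((counts - center) ** 2 / center, axis=-1)

d_phase1 = chi_square_distance(phase1, center)

# Upper control limit taken as the 99.73% empirical quantile of phase-I distances.
ucl = np.quantile(d_phase1, 0.9973)

# Monitor new phase-II samples against the limit (simulated quality deterioration).
phase2 = rng.binomial(n_units, true_rates * 1.5, size=(10, p))
d_phase2 = chi_square_distance(phase2, center)
print("UCL =", round(float(ucl), 3))
print("out-of-control samples:", np.where(d_phase2 > ucl)[0])
```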


2014 ◽  
Vol 2014 ◽  
pp. 1-20 ◽  
Author(s):  
Santiago-Omar Caballero-Morales

Shewhart or control charts are important Statistical Process Control (SPC) techniques used for the prompt detection of failures in a manufacturing process and the minimization of production costs, which are modelled with nonlinear functions (cost functions). Heuristic methods have been used to find the chart parameters, integrated within the cost function, that best comply with economic and statistical restrictions. However, heuristic estimation depends strongly on the size of the search space, the set of initial solutions, and the exploration operators. In this paper a 3D analysis of the cost function is presented to more accurately identify the search space associated with each parameter of X̄ control charts and to improve estimation. The parameters estimated with this approach were more accurate than those estimated with Hooke and Jeeves (HJ) and Genetic Algorithms (GAs) under different failure distributions. The results presented in this work can be used as a benchmark to evaluate and improve the performance of other heuristic methods.
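
The following sketch illustrates the idea of scanning the cost surface over (n, h, k) to locate the region of the minimum before (or instead of) running a heuristic; the Duncan-style hourly cost and its coefficients are placeholders, not the cost model analysed in the paper.

```python
import math
import itertools

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def hourly_cost(n, h, k, delta=1.0, failure_rate=0.05,
                c_sample=0.5, c_unit=0.1, c_false=50.0, c_out=100.0):
    """Duncan-style hourly cost for an X-bar chart; coefficients are illustrative."""
    alpha = 2.0 * (1.0 - phi(k))                        # false-alarm probability
    beta = phi(k - delta * math.sqrt(n)) - phi(-k - delta * math.sqrt(n))
    power = 1.0 - beta
    time_to_signal = h / max(power, 1e-9)               # expected time to detect a shift
    return ((c_sample + c_unit * n) / h                 # sampling cost per hour
            + c_false * alpha / h                       # false-alarm cost per hour
            + c_out * failure_rate * time_to_signal)    # out-of-control operation cost

# Coarse 3D scan of the search space associated with each chart parameter.
grid_n = range(2, 21)
grid_h = [0.25 * i for i in range(1, 33)]       # 0.25 .. 8 hours
grid_k = [2.0 + 0.05 * i for i in range(41)]    # 2.0 .. 4.0
best = min(itertools.product(grid_n, grid_h, grid_k),
           key=lambda x: hourly_cost(*x))
print("approximate optimum (n, h, k):", best, "cost:", round(hourly_cost(*best), 3))
```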


2020 ◽  
Vol 30 (6) ◽  
pp. 1645-1663
Author(s):  
Ömer Deniz Akyildiz ◽  
Dan Crisan ◽  
Joaquín Míguez

Abstract We introduce and analyze a parallel sequential Monte Carlo methodology for the numerical solution of optimization problems that involve the minimization of a cost function that consists of the sum of many individual components. The proposed scheme is a stochastic zeroth-order optimization algorithm which demands only the capability to evaluate small subsets of components of the cost function. It can be depicted as a bank of samplers that generate particle approximations of several sequences of probability measures. These measures are constructed in such a way that they have associated probability density functions whose global maxima coincide with the global minima of the original cost function. The algorithm selects the best performing sampler and uses it to approximate a global minimum of the cost function. We prove analytically that the resulting estimator converges to a global minimum of the cost function almost surely and provide explicit convergence rates in terms of the number of generated Monte Carlo samples and the dimension of the search space. We show, by way of numerical examples, that the algorithm can tackle cost functions with multiple minima or with broad “flat” regions which are hard to minimize using gradient-based techniques.
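
A much simplified, single-sampler sketch of the underlying idea (not the authors' parallel scheme or their convergence analysis): treat exp(-beta * cost) as an unnormalised density, propagate weighted particles, and take the best particle as the estimate of a global minimiser. The test cost function, particle counts and jitter size are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(1)

def cost(x):
    """Multimodal test cost with broad flat regions (sum of simple components)."""
    return np.sum(0.1 * x ** 2 + np.sin(3.0 * x) + 1.0, axis=-1)

def smc_minimize(cost, dim=2, n_particles=500, n_iters=60, step=0.3, beta=2.0):
    """Zeroth-order minimisation: weight particles by exp(-beta*cost), resample, jitter."""
    particles = rng.uniform(-5.0, 5.0, size=(n_particles, dim))
    for _ in range(n_iters):
        c = cost(particles)
        w = np.exp(-beta * (c - c.min()))                        # stabilised weights
        w /= w.sum()
        idx = rng.choice(n_particles, size=n_particles, p=w)      # multinomial resampling
        particles = particles[idx] + step * rng.normal(size=(n_particles, dim))
    c = cost(particles)
    return particles[np.argmin(c)], float(c.min())

x_star, f_star = smc_minimize(cost)
print("estimated minimiser:", np.round(x_star, 3), "cost:", round(f_star, 3))
```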


2021 ◽  
Vol 2106 (1) ◽  
pp. 012019
Author(s):  
M Qori’atunnadyah ◽  
Wibawati ◽  
W M Udiatami ◽  
M Ahsan ◽  
H Khusna

Abstract In recent years, the manufacturing industry has tended to move away from mass production and to produce in small quantities, which is called "Short Run Production". In such a situation the production run is short, usually fewer than 50 units. Therefore, a control chart for the short run production process is required. This paper discusses the comparison between a multivariate control chart for short run production (V control chart) and the T2 Hotelling control chart, applied to Sunergy glass data. Furthermore, a simulation of the Average Run Length (ARL) was carried out to determine the performance of the two control charts. The results show that the production process has not been statistically controlled using either the V control chart or the T2 Hotelling control chart. The number of out-of-control points on the V control chart using the EWMA test is larger than on the T2 Hotelling control chart. Based on the ARL values, the V control chart is more sensitive than the T2 Hotelling control chart.
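
The V control chart itself is not reproduced here; the sketch below only illustrates the comparison methodology, computing Hotelling's T2 for individual observations against phase-I estimates and estimating the ARL by Monte Carlo simulation. The in-control distribution, the shift size and the control limit are assumptions for illustration.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
p = 2                                             # number of quality characteristics
cov = np.array([[1.0, 0.5], [0.5, 1.0]])
L = np.linalg.cholesky(cov)

# Phase-I parameters estimated from a simulated in-control sample.
phase1 = rng.multivariate_normal(np.zeros(p), cov, size=50)
mu_hat = phase1.mean(axis=0)
S_inv = np.linalg.inv(np.cov(phase1, rowvar=False))

# Chi-square approximation of the upper control limit (alpha = 0.0027).
ucl = stats.chi2.ppf(0.9973, df=p)

def t2(x):
    """Hotelling's T2 statistic for an individual observation."""
    d = x - mu_hat
    return float(d @ S_inv @ d)

def simulated_arl(shift, n_runs=300, max_len=5000):
    """Monte Carlo average run length until the first T2 signal for a given mean shift."""
    lengths = []
    for _ in range(n_runs):
        for t in range(1, max_len + 1):
            x = shift + L @ rng.standard_normal(p)
            if t2(x) > ucl:
                lengths.append(t)
                break
    return float(np.mean(lengths))

print("in-control ARL ~", round(simulated_arl(np.zeros(p)), 1))
print("ARL after a 1-sigma shift ~", round(simulated_arl(np.array([1.0, 0.0])), 1))
```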


2019 ◽  
Vol 36 (4) ◽  
pp. 526-551 ◽  
Author(s):  
Mohammad Hosein Nadreri ◽  
Mohamad Bameni Moghadam ◽  
Asghar Seif

Purpose – The purpose of this paper is to develop an economic statistical design based on the concepts of adjusted average time to signal (AATS) and ANF for the X̄ control chart under a Weibull shock model with multiple assignable causes. Design/methodology/approach – The design used in this study is based on a multiple assignable causes cost model. The new proposed cost model is compared under the same cost and time parameters and optimal design parameters for uniform and non-uniform sampling schemes. Findings – Numerical results indicate that the cost model with non-uniform sampling has a lower cost than that with uniform sampling. Using sensitivity analysis, the effect of changing the fixed and variable parameters of time, cost and the Weibull distribution on the optimum values of the design parameters and the loss cost is examined and discussed. Practical implications – This research adds to the body of knowledge relating to the quality control of process monitoring systems. This paper may be of particular interest to practitioners of quality systems in factories where multiple assignable causes affect the production process. Originality/value – The cost functions for uniform and non-uniform sampling schemes are presented based on multiple assignable causes with the AATS and ANF concepts for the first time.
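
The full AATS/ANF cost model is not reproduced here; the sketch below only shows one standard ingredient of non-uniform sampling schemes under a Weibull shock model, namely sampling times chosen so that each interval carries the same integrated hazard. The first interval length and the Weibull shape value are illustrative assumptions.

```python
import numpy as np

def equal_hazard_intervals(h1, shape, n_intervals):
    """Sampling times with equal integrated Weibull hazard per interval.

    With hazard r(t) = lam * shape * t**(shape - 1), requiring equal integrated
    hazard over each interval gives t_j = j**(1/shape) * h1, independent of lam.
    """
    t = np.array([j ** (1.0 / shape) * h1 for j in range(n_intervals + 1)])
    return np.diff(t)                   # interval lengths h_1, h_2, ...

# Illustrative values: first interval 2 hours, increasing failure rate (shape > 1).
intervals = equal_hazard_intervals(h1=2.0, shape=2.0, n_intervals=8)
print("non-uniform sampling intervals (h):", np.round(intervals, 3))
# For shape > 1 the intervals shrink over time, i.e. sampling becomes more frequent
# as the hazard of an assignable cause occurring increases.
```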


2018 ◽  
Vol 11 (12) ◽  
pp. 4739-4754 ◽  
Author(s):  
Vladislav Bastrikov ◽  
Natasha MacBean ◽  
Cédric Bacour ◽  
Diego Santaren ◽  
Sylvain Kuppel ◽  
...  

Abstract. Land surface models (LSMs), which form the land component of earth system models, rely on numerous processes for describing carbon, water and energy budgets, often associated with highly uncertain parameters. Data assimilation (DA) is a useful approach for optimising the most critical parameters in order to improve model accuracy and refine future climate predictions. In this study, we compare two different DA methods for optimising the parameters of seven plant functional types (PFTs) of the ORCHIDEE LSM using daily averaged eddy-covariance observations of net ecosystem exchange and latent heat flux at 78 sites across the globe. We perform a technical investigation of two classes of minimisation methods – local gradient-based (the L-BFGS-B algorithm, limited memory Broyden–Fletcher–Goldfarb–Shanno algorithm with bound constraints) and global random search (the genetic algorithm) – by evaluating their relative performance in terms of the model–data fit and the difference in retrieved parameter values. We examine the performance of each method for two cases: when optimising parameters at each site independently ("single-site" approach) and when simultaneously optimising the model at all sites for a given PFT using a common set of parameters ("multi-site" approach). We find that for the single-site case the random search algorithm results in lower values of the cost function (i.e. lower model–data root mean square differences) than the gradient-based method; the difference between the two methods is smaller for the multi-site optimisation due to a smoothing of the cost function shape with a greater number of observations. The spread of the cost function, when performing the same tests with 16 random first-guess parameters, is much larger with the gradient-based method, due to the higher likelihood of being trapped in local minima. When using pseudo-observation tests, the genetic algorithm results in a closer approximation of the true posterior parameter value than the L-BFGS-B algorithm. We demonstrate the advantages and challenges of the different DA techniques and provide some advice on using them for LSM parameter optimisation.
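
A toy comparison of the two classes of minimisers on a deliberately multimodal misfit (this is not the ORCHIDEE cost function or its parameters), using scipy's L-BFGS-B from a single first guess against a very small genetic-style random search.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(3)

# Toy "model-data misfit": multimodal in the parameter, unlike nothing in ORCHIDEE.
obs = 1.5
def misfit(theta):
    model = theta[0] ** 2 + 0.5 * np.sin(5.0 * theta[0])
    return (model - obs) ** 2

bounds = [(-3.0, 3.0)]

# Local gradient-based optimisation from one first guess (may stop in a local minimum).
res = minimize(misfit, x0=np.array([2.5]), method="L-BFGS-B", bounds=bounds)

# Minimal genetic-style global search: keep the best half, mutate, repeat.
pop = rng.uniform(-3.0, 3.0, size=(40, 1))
for _ in range(50):
    order = np.argsort([misfit(ind) for ind in pop])
    parents = pop[order[:20]]
    children = parents + 0.2 * rng.normal(size=parents.shape)
    pop = np.clip(np.vstack([parents, children]), -3.0, 3.0)
best = min(pop, key=misfit)

print("L-BFGS-B:     theta =", res.x, "cost =", round(float(res.fun), 4))
print("random search: theta =", best, "cost =", round(float(misfit(best)), 4))
```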


2013 ◽  
Vol 2013 ◽  
pp. 1-7 ◽  
Author(s):  
Young-Seok Choi

This paper presents a new approach to the normalized subband adaptive filter (NSAF) which directly exploits the sparsity condition of an underlying system for sparse system identification. The proposed NSAF integrates a weighted l1-norm constraint into the cost function of the NSAF algorithm. To obtain the optimum solution of the weighted l1-norm regularized cost function, a subgradient calculus is employed, resulting in a stochastic gradient-based update recursion of the weighted l1-norm regularized NSAF. The choice of a distinct weighted l1-norm regularization leads to two versions of the l1-norm regularized NSAF. Numerical results clearly indicate the superior convergence of the l1-norm regularized NSAFs over the classical NSAF, especially when identifying a sparse system.
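
A simplified sketch of the sparsity-regularised update, reduced to a fullband NLMS-style recursion rather than the subband (NSAF) structure of the paper; it shows the zero-attracting subgradient term contributed by a weighted l1-norm penalty. Step sizes and the regularisation weight are illustrative.

```python
import numpy as np

rng = np.random.default_rng(4)

# Sparse unknown system to identify (most taps are zero).
M = 64
w_true = np.zeros(M)
w_true[[5, 20, 47]] = [0.8, -0.5, 0.3]

mu, rho, eps, delta = 0.5, 5e-5, 1e-2, 1e-6
w = np.zeros(M)
x_buf = np.zeros(M)

for n in range(20000):
    x = rng.normal()
    x_buf = np.roll(x_buf, 1)
    x_buf[0] = x                                    # input regressor
    d = w_true @ x_buf + 0.01 * rng.normal()        # noisy system output
    e = d - w @ x_buf                               # a priori error
    # Normalized LMS term plus the weighted l1 (zero-attracting) subgradient term:
    # the weight 1/(eps + |w|) penalizes small taps more, pushing them toward zero.
    w += mu * e * x_buf / (x_buf @ x_buf + delta)
    w -= rho * np.sign(w) / (eps + np.abs(w))

misalignment_db = 10 * np.log10(np.sum((w - w_true) ** 2) / np.sum(w_true ** 2))
print("misalignment (dB):", round(float(misalignment_db), 2))
```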


Sensors ◽  
2020 ◽  
Vol 20 (17) ◽  
pp. 4984
Author(s):  
Yajing Zou ◽  
Amr Eldemiry ◽  
Yaxin Li ◽  
Wu Chen

Three-dimensional (3D) reconstruction using an RGB-D camera, with simultaneous color image and depth information, is attractive because it can significantly reduce equipment cost and data-collection time. Point features are commonly used for aligning two RGB-D frames. Owing to the lack of reliable point features, RGB-D simultaneous localization and mapping (SLAM) is prone to failure in low-textured scenes. To overcome this problem, this paper proposes a robust RGB-D SLAM system fusing both points and lines, because lines can provide robust geometric constraints when points are insufficient. To comprehensively fuse line constraints, we combine 2D and 3D line reprojection error with point reprojection error in a novel cost function. To solve the cost function and filter out wrong feature matches, we build a robust pose solver using the Gauss–Newton method and a Chi-Square test. To correct the drift of camera poses, we maintain a sliding-window framework to update the keyframe poses and related features. We evaluate the proposed system on both public datasets and real-world experiments. The results demonstrate that it is comparable to or better than state-of-the-art methods in terms of both accuracy and robustness.
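
The full point-and-line Gauss-Newton solver is not reproduced here; the sketch below only illustrates the Chi-Square test step used to reject wrong matches, gating each squared (Mahalanobis) reprojection residual against a chi-square quantile. The residual covariance and the quantile are assumptions for illustration.

```python
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(5)

# Simulated 2D point reprojection residuals (pixels): mostly inliers with 1-pixel
# noise, plus a few gross outliers standing in for wrong feature matches.
sigma = 1.0
residuals = rng.normal(0.0, sigma, size=(100, 2))
residuals[:5] += rng.uniform(8.0, 15.0, size=(5, 2))

# Chi-square gating: a 2-DoF residual with isotropic covariance sigma^2 * I is kept
# only if its squared Mahalanobis norm stays below the chosen quantile (95% here).
threshold = chi2.ppf(0.95, df=2)
mahalanobis_sq = np.sum(residuals ** 2, axis=1) / sigma ** 2
inliers = mahalanobis_sq < threshold

print("chi-square threshold:", round(float(threshold), 2))
print("rejected matches:", int((~inliers).sum()), "of", len(residuals))
# In a Gauss-Newton pose solver, only the inlier residuals would be kept for the
# next iteration and the pose update re-estimated from them.
```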


2017 ◽  
Vol 32 (1) ◽  
Author(s):  
Azamsadat Iziy ◽  
Bahram Sadeghpour Gildeh ◽  
Ehsan Monabbati

Abstract Control charts have been established as major tools for quality control and improvement in industry. Therefore, it is always necessary to consider an appropriate design of a control chart from an economic point of view before using the chart. The economic design of a control chart refers to the determination of three optimal control chart parameters: the sample size, the sampling interval, and the control limit coefficient. In this article, the double sampling (DS)


2014 ◽  
Vol 31 (9) ◽  
pp. 966-982 ◽  
Author(s):  
Shu Qing Liu ◽  
Qin Su ◽  
Ping Li

Purpose – In order to meet the requirements of 6σ management and to overcome the deficiencies of the theory for using the pre-control chart to evaluate and monitor quality stability, the purpose of this paper is to investigate the quality stability evaluation and monitoring guidelines of the small batch production process based on the pre-control chart, under the conditions that the distribution center and the specification center do not coincide (0<ɛ≤1.5σ), the process capability index Cp ≥ 2 and the false alarm probability α=0.27 percent. Design/methodology/approach – First, the range of the quality stability evaluation sample size in the initial production process is determined using probability and statistics methods, the sample size for quality stability evaluation in the initial production process is adjusted and determined according to error judgment probability theory, and the guideline for quality stability evaluation in the initial production process is proposed based on the theory of small probability events. Second, alternative guidelines for quality stability monitoring and control in the formal production process are proposed using combination theory, the alternative guidelines are initially selected based on the theory of small probability events, a comparative analysis of the guidelines is made according to average run length values, and the monitoring and control guidelines for quality stability in the formal production process are determined. Findings – The results indicate that when the false alarm probability α=0.27 percent, the shift ɛ is in the range 0<ɛ≤1.5σ and the process capability index Cp ≥ 2, the quality stability evaluation sample size for the initial production process is 11, with the condition that at most one sample falls into the yellow zone. The quality stability evaluation sample size for the formal production process is 5: when at most one of the samples falls into the yellow zone, the process is stable, while when two of the five samples fall into the yellow zone, one more sample needs to be taken, and only if this sample falls into the green zone is the process stable. Originality/value – The research results overcome the unsatisfied 6σ management assumptions and requirements and the oversized false alarm probability α of past pre-control charts, as well as their limitation of being applicable only when the shift ɛ=0. At the same time, the difficulty of applying conventional control charts to process control when only a few samples are available is resolved.
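
The exact decision rules derived in the paper are not reproduced here; the sketch below only estimates, by Monte Carlo simulation, the zone probabilities such rules are built from, assuming the usual pre-control zones (green = middle half of the tolerance band, yellow = remaining in-spec band, red = out of spec), Cp = 2 and a mean shift of 1.5σ.

```python
import numpy as np

rng = np.random.default_rng(6)

# Tolerance band [-T, T] with Cp = T / (3*sigma) = 2, i.e. sigma = T/6.
T = 6.0
sigma = T / (3.0 * 2.0)            # Cp = 2
shift = 1.5 * sigma                # mean shift at the upper end of the range 0 < eps <= 1.5 sigma

x = rng.normal(shift, sigma, size=1_000_000)
green = np.abs(x) <= T / 2.0                       # middle half of the tolerance band
red = np.abs(x) > T                                # outside the specification limits
yellow = ~green & ~red                             # in spec but outside the green zone

p_yellow = yellow.mean()
print("P(green) =", round(float(green.mean()), 4),
      "P(yellow) =", round(float(p_yellow), 4),
      "P(red) =", round(float(red.mean()), 6))

# Probability that at most one of 11 consecutive samples falls in the yellow zone,
# the kind of event the evaluation rule in the abstract is based on.
n = 11
p_at_most_one = (1 - p_yellow) ** n + n * p_yellow * (1 - p_yellow) ** (n - 1)
print("P(at most 1 yellow in 11) =", round(float(p_at_most_one), 4))
```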

