Demerit-Fuzzy Rating Mechanism and Monitoring Chart

Author(s):
Ming-Hung Shu
Jan-Yee Kung
Bi-Min Hsu

The relative magnitude of the weights assigned to defects has a substantial impact on the performance of attribute control charts. The current demerit-chart approach is apparently superior to the c-chart scheme because it imposes different precise weights on distinct types of nonconformities, enabling more severe defects to disclose the problems existing in manufacturing or service processes. However, this crisp-weighting defect assignment, which assumes that defects classified into the same class are of equal severity, may be so subjective that it leaves the chart somewhat restricted in widespread applications. Since in many cases the severity of each defect is evaluated from practitioners' visual inspection of the key quality characteristics of products or services, when each defect is classified into one of several mutually exclusive linguistic classes, a fuzzy-weighting defect assignment representing the degree of seriousness of the defects should be allotted accordingly. Therefore, in this paper a demerit-fuzzy rating mechanism and monitoring chart is proposed. We first incorporate a fuzzy-linguistic weight in response to the degree of severity of defects. Then, we apply the resolution-identity property to construct fuzzy control limits, and further develop a new fuzzy ranking method to differentiate the underlying process condition. Finally, the proposed fuzzy-demerit chart is illustrated by an application to TFT-LCD manufacturing processes for monitoring their LCD Mura-nonconformity conditions.
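For contrast with the fuzzy-weighting scheme, the crisp-weight demerit chart that the paper improves on can be sketched as follows. The defect classes and weights below are illustrative assumptions, not the paper's values; the 3-sigma limit construction is the standard demerit-chart form.

```python
import math

# Hypothetical defect classes with crisp demerit weights (assumed values)
WEIGHTS = {"critical": 10, "major": 5, "minor": 1}

def demerit_chart_limits(defects_per_unit, n_units):
    """Center line and 3-sigma limits for the demerit statistic D.

    defects_per_unit: dict mapping class -> average defects per unit (u-bar)
    n_units: number of units inspected per sample
    """
    # Center line: weighted sum of per-unit defect rates
    d_bar = sum(WEIGHTS[c] * u for c, u in defects_per_unit.items())
    # Variance of D uses squared weights (Poisson counts per class)
    var = sum(WEIGHTS[c] ** 2 * u for c, u in defects_per_unit.items()) / n_units
    sigma = math.sqrt(var)
    return d_bar - 3 * sigma, d_bar, d_bar + 3 * sigma

lcl, cl, ucl = demerit_chart_limits(
    {"critical": 0.05, "major": 0.2, "minor": 1.5}, n_units=25)
```

The fuzzy chart replaces the crisp `WEIGHTS` values with linguistic fuzzy weights, which this sketch does not attempt to reproduce.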

Complexity
2017
Vol 2017
pp. 1-17
Author(s):
Ming-Hung Shu
Dinh-Chien Dang
Thanh-Lam Nguyen
Bi-Min Hsu
Ngoc-Son Phan

For sequentially monitoring and controlling the average and variability of an online manufacturing process, the x¯ and s control charts are widely utilized tools whose construction requires the data to be real (precise) numbers. However, many quality characteristics in practice, such as the surface roughness of optical lenses, have long been recorded as fuzzy data, to which the traditional x¯ and s charts are not directly applicable. Therefore, to accommodate this fuzzy-data domain, this paper integrates fuzzy set theories to establish fuzzy charts under a general variable-sample-size condition. First, the resolution-identity principle is applied to construct the sample-statistics' and control-limits' fuzzy numbers (SSFNs and CLFNs), where the sample fuzzy data are unified and aggregated through statistical and nonlinear-programming manipulations. Then, a fuzzy-number ranking approach based on the left and right integral index is used to differentiate the magnitudes of fuzzy numbers and compare SSFNs and CLFNs pairwise. Thirdly, fuzzy-logic-like reasoning is applied to categorize process conditions, with intermediate classifications between in control and out of control. Finally, a realistic example of controlling surface roughness in the turning process for producing optical lenses is presented to demonstrate the data adaptability and interpretability of the integrated methodologies under fuzzy-data environments.
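The left-and-right integral ranking idea can be sketched for the simplest case. This assumes triangular fuzzy numbers and an equal-weight index (a Liou-Wang-style total integral value with lambda = 0.5); the paper's SSFNs and CLFNs are built from alpha-cuts and nonlinear programming, which this minimal sketch does not reproduce.

```python
# Minimal sketch: rank triangular fuzzy numbers by a left/right integral
# index. The triangular shape and lambda = 0.5 are illustrative assumptions.
def total_integral_value(tfn, lam=0.5):
    """tfn = (a, b, c): support [a, c], mode b.
    Left integral  = area-based score (a + b) / 2
    Right integral = area-based score (b + c) / 2
    """
    a, b, c = tfn
    left = (a + b) / 2.0
    right = (b + c) / 2.0
    return lam * right + (1.0 - lam) * left

sample_stat = (4.8, 5.0, 5.3)   # fuzzy sample statistic (assumed data)
upper_limit = (5.5, 5.8, 6.0)   # fuzzy upper control limit (assumed data)
# Pairwise comparison: the sample is in control if it ranks below the UCL
in_control = total_integral_value(sample_stat) < total_integral_value(upper_limit)
```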


2020
Vol 2020
pp. 1-9
Author(s):
Johnson A. Adewara
Kayode S. Adekeye
Olubisi L. Aako

In this paper, two control-chart methods are proposed to monitor a process based on the two-parameter Gompertz distribution: the Gompertz Shewhart approach and the Gompertz skewness-correction method. A simulation study was conducted to compare the performance of the proposed charts with that of the skewness-correction approach for various sample sizes. Furthermore, real-life data on the thickness of paint on refrigerators, nonnormal data with the attributes of a Gompertz distribution, were used to illustrate the proposed control charts. The coverage probability (CP), control limit interval (CLI), and average run length (ARL) were used to measure the performance of the two methods. It was found that the Gompertz exact method, where the control limits are calculated through the percentiles of the underlying distribution, has the highest coverage probability, while the Gompertz Shewhart approach and the Gompertz skewness-correction method have the smallest CLI and ARL. Hence, the two-parameter Gompertz-based methods would detect out-of-control conditions faster for Gompertz-based X¯ charts.
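The percentile-based ("exact") limit construction can be sketched from the Gompertz quantile function. The parameterization below (CDF F(x) = 1 - exp(-eta(e^(bx) - 1))) and the parameter values are assumptions for illustration; the paper's fitted parameters are not reproduced here.

```python
import math

def gompertz_quantile(p, eta, b):
    """Quantile of a Gompertz distribution with CDF
    F(x) = 1 - exp(-eta * (exp(b*x) - 1)).  Parameterization is an assumption."""
    return math.log(1.0 - math.log(1.0 - p) / eta) / b

def gompertz_exact_limits(eta, b, alpha=0.0027):
    """Control limits placed at the alpha/2 and 1 - alpha/2 percentiles,
    matching the usual 0.27% Shewhart false-alarm rate."""
    lcl = gompertz_quantile(alpha / 2, eta, b)
    cl = gompertz_quantile(0.5, eta, b)       # median as center line
    ucl = gompertz_quantile(1 - alpha / 2, eta, b)
    return lcl, cl, ucl
```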


2020
Author(s):
Alexis Oliva
Matías Llabrés

Different control charts, in combination with the process capability indices Cp, Cpm, and Cpk, were evaluated as part of the control strategy, since both are key elements in determining whether a method or process is reliable for its purpose. All these aspects were analyzed using real data from unitary processes and analytical methods. The traditional x-chart and moving-range chart confirmed that both the analytical method and the process are in control and stable; therefore, the process capability indices can be computed. We applied different criteria to establish the specification limits (i.e., analyst/customer requirements) for fixed method or process performance (i.e., process or method requirements). The unitary process does not satisfy the minimum capability requirements for the Cp and Cpk indices when the specification limits and control limits are equal in breadth. Therefore, the process needs to be revised; in particular, tighter control of the process variation is necessary. For the analytical method, the Cpm and Cpk indices were computed, and the results were similar in both cases. For example, if the specification limits are set at ±3% of the target value, the method is considered "satisfactory" (1.22 < Cpm < 1.50) and no more stringent precision control is required.
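The three capability indices compared above follow standard textbook definitions and can be computed as follows; the sample data, specification limits, and target in the demo call are illustrative, not the paper's data.

```python
import statistics

def capability_indices(data, lsl, usl, target):
    """Standard process capability indices.

    Cp  = (USL - LSL) / 6*sigma          (potential capability)
    Cpk = min(USL - mu, mu - LSL) / 3*sigma  (penalizes off-centering)
    Cpm = (USL - LSL) / 6*tau, tau^2 = sigma^2 + (mu - target)^2
          (penalizes deviation from the target)
    """
    mu = statistics.mean(data)
    sigma = statistics.stdev(data)
    cp = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)
    tau = (sigma ** 2 + (mu - target) ** 2) ** 0.5
    cpm = (usl - lsl) / (6 * tau)
    return cp, cpk, cpm
```

When the process is perfectly centered on the target, the three indices coincide; they diverge as the mean drifts, which is why Cpm and Cpk were examined together for the analytical method.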


2020
Vol 1 (1)
pp. 9-16
Author(s):
O. L. Aako
J. A. Adewara
K. S. Adekeye
E. B. Nkemnole

The fundamental assumption of variable control charts is that the data are normally distributed and spread randomly about the mean. Process data are not always normally distributed, hence there is a need to set up appropriate control charts that give accurate control limits for monitoring skewed processes. In this study, Shewhart-type control charts for monitoring positively skewed data assumed to follow the Marshall-Olkin Inverse Loglogistic Distribution (MOILLD) were developed. The average run length (ARL) and control limits interval (CLI) were adopted to assess the stability and performance of the MOILLD control chart. The results obtained were compared with the Classical Shewhart (CS) and Skewness Correction (SC) control charts using the ARL and CLI. It was discovered that the control charts based on MOILLD performed better and are more stable compared with the CS and SC control charts. It is therefore recommended that, for positively skewed data, a Marshall-Olkin Inverse Loglogistic Distribution based control chart would be more appropriate.
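The general idea of replacing symmetric 3-sigma limits with distribution-aware limits for skewed data can be sketched generically. The empirical-percentile approach below is a stand-in illustration only; it is not the paper's MOILLD-specific construction, and the false-alarm rate is the conventional 0.27%.

```python
# Generic sketch: empirical-percentile control limits from Phase-I data,
# which track a skewed distribution instead of assuming normality.
# This illustrates the idea, not the MOILLD-based limits of the paper.
def percentile_limits(phase1_data, alpha=0.0027):
    xs = sorted(phase1_data)
    n = len(xs)
    lo = xs[max(0, int((alpha / 2) * n))]
    hi = xs[min(n - 1, int((1 - alpha / 2) * n))]
    cli = hi - lo   # control limits interval, one of the comparison metrics
    return lo, hi, cli
```

For skewed data these limits are asymmetric about the median, unlike classical Shewhart limits, which is the property the MOILLD chart exploits.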


2018
Vol 17 (1)
Author(s):
Darmanto Darmanto

A current trend in manufacturing production is the short run, as in job-shop and just-in-time processes. This leaves the process parameters unknown, because data are unavailable and product quantities are generally small. The control chart considered here is designed for the short run. Its procedure follows the concept of successive differences, under the assumption of a multivariate normal distribution. The sensitivity of a control chart is evaluated by its average run length (ARL). In this study, the ARL value was calculated by simulating shifts of the mean vector and recording the first point to fall outside the control limits. The shift simulation from the target mean vector μ0 is performed with a positive shift (μ1 = μ0 + δ). The data sizes and numbers of variables in this study were m = 20, 50 and p = 2, 4, 8, respectively. Each scheme (a combination of δ, m, and p) was iterated 250,000 times. The simulation results show that for all schemes, when both parameters are known, ARL0 ≈ 370; but when the parameters are unknown, ARL1 becomes smaller. This also implies that increasing p and n reduces the sensitivity of the control chart.
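The run-length recording loop behind an ARL simulation can be sketched in the simplest univariate case. The paper's setting is multivariate with successive differences and far more iterations; this sketch only shows how a run length is recorded and averaged, using an in-control N(0, 1) process with 3-sigma limits.

```python
import random

# Minimal univariate sketch of Monte-Carlo ARL estimation (illustrative;
# not the paper's multivariate successive-difference chart).
def simulate_arl(shift=0.0, n_runs=2000, limit=3.0, seed=1):
    rng = random.Random(seed)
    total = 0
    for _ in range(n_runs):
        t = 0
        while True:
            t += 1
            x = rng.gauss(shift, 1.0)   # observation under mean shift `shift`
            if abs(x) > limit:          # first point outside the limits
                break
        total += t                      # record this run length
    return total / n_runs               # average run length
```

With shift = 0 the estimate approaches the nominal ARL0 of about 370; any positive shift drives the ARL down, which is the sensitivity being measured.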


Mathematics
2020
Vol 8 (5)
pp. 857
Author(s):
Ishaq Adeyanju Raji
Muhammad Hisyam Lee
Muhammad Riaz
Mu’azu Ramat Abujiya
Nasir Abbas

Shewhart control charts with estimated control limits are widely used in practice. However, the estimated control limits are often affected by phase-I estimation errors. These errors arise from variation in the practitioner's choice of sample size as well as from the presence of outliers in phase-I. The unnecessary variation due to outliers disturbs the control limits, implying a less efficient control chart in phase-II. In this study, we propose models based on the Tukey and median-absolute-deviation outlier detectors for screening phase-I errors. These two outlier detection models are efficient and robust, as they are distribution free. Using the Monte-Carlo simulation method, we study the estimation effect of the proposed outlier detection models on the Shewhart chart in normal as well as non-normal environments. The performance evaluation is done by studying the run-length properties, namely the average run length and the standard deviation of the run length. The findings show that the proposed design structures are more stable in the presence of outlier detectors and require fewer phase-I observations to stabilize the run-length properties. Finally, we implement the findings in the semiconductor manufacturing industry, where a real dataset is extracted from a photolithography process.
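Both detectors named above are standard and distribution free; a minimal sketch of each, applied to a phase-I sample, could look like this. The fence constant k = 1.5 and the MAD cutoff k = 3 are conventional defaults, assumed rather than taken from the paper.

```python
import statistics

def tukey_outliers(xs, k=1.5):
    """Flag points outside Tukey's fences [Q1 - k*IQR, Q3 + k*IQR]."""
    q = statistics.quantiles(xs, n=4)   # q[0] = Q1, q[2] = Q3
    q1, q3 = q[0], q[2]
    iqr = q3 - q1
    return [x for x in xs if x < q1 - k * iqr or x > q3 + k * iqr]

def mad_outliers(xs, k=3.0):
    """Flag points with |x - median| > k * (1.4826 * MAD).
    The 1.4826 factor makes MAD consistent with sigma under normality."""
    med = statistics.median(xs)
    mad = statistics.median(abs(x - med) for x in xs)
    scale = 1.4826 * mad
    return [x for x in xs if abs(x - med) > k * scale]
```

Screening phase-I data with either detector before estimating the limits is what stabilizes the phase-II run-length properties reported in the study.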


2011
Vol 112 (3)
pp. 736-737
Author(s):
Karthik Raghunathan
Hani Al-Najjar
Adam Snavely

1986
Vol 108 (3)
pp. 219-226
Author(s):
B. D. Notohardjono
D. S. Ermer

This paper discusses the development of control charts for correlated and contaminated data. For illustration, the charts were applied to a set of maximum principal-stress data at two locations on a blast furnace shell. The Dynamic Data System (DDS) approach was used to model the correlated data, which contained several types of discrepancies. After the standard DDS models were found, control charts for the averages and variances of the model residuals were constructed for two data sets. For more effective analysis, two methods for calculating the control limits of both charts are given. With this approach, dynamic process changes, such as an increase in the production rate or the wearing out of the sacrificial lining, can be detected and separated from data-collection errors caused by instrument malfunctions. Furthermore, the tap-hole opening timing is identified from the DDS model parameters, helping to verify the time series model.
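The core idea of charting model residuals rather than raw correlated data can be sketched with the simplest time-series model. This assumes an AR(1) process and a least-squares fit; the DDS methodology fits richer models, which this sketch does not reproduce.

```python
import random
import statistics

def ar1_residual_chart(xs):
    """Fit an AR(1) model and return (phi, residuals, 3-sigma limits).
    If the model is adequate, the residuals are roughly uncorrelated
    and can be monitored with standard Shewhart limits."""
    mean = statistics.mean(xs)
    d = [x - mean for x in xs]
    # Least-squares AR(1) coefficient: phi = sum(d_t*d_{t-1}) / sum(d_{t-1}^2)
    num = sum(d[t] * d[t - 1] for t in range(1, len(d)))
    den = sum(d[t - 1] ** 2 for t in range(1, len(d)))
    phi = num / den
    residuals = [d[t] - phi * d[t - 1] for t in range(1, len(d))]
    s = statistics.stdev(residuals)
    return phi, residuals, (-3 * s, 3 * s)

# Demo on simulated AR(1) data with phi = 0.8 (illustrative, not stress data)
rng = random.Random(42)
series, x = [], 0.0
for _ in range(500):
    x = 0.8 * x + rng.gauss(0.0, 1.0)
    series.append(x)
phi, residuals, limits = ar1_residual_chart(series)
```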

