Real-time pattern recognition in statistical process control: A hybrid neural network/decision tree-based approach

Author(s):  
Ruey-Shiang Guh

Pattern recognition is an important issue in statistical process control because unnatural patterns displayed by control charts can be associated with specific causes that adversely impact the manufacturing process. A common problem of existing approaches to control chart pattern (CCP) recognition is false classification between different types of CCP that share similar features in a real-time process-monitoring scenario, in which only limited pattern points are available for recognition. This study proposes a hybrid learning-based system that integrates neural networks and decision tree learning to overcome this classification problem in a real-time CCP recognition scheme. The hybrid system consists of three sequential modules, namely feature extraction, coarse classification, and fine classification. The coarse-classification module employs a four-layer back-propagation network to detect and classify unnatural CCPs. The fine-classification module contains four decision trees used in a simple heuristic algorithm for further classifying the detected CCPs. Simulation experiments demonstrate that the false recognition problem is effectively addressed by the proposed hybrid system. Compared with conventional control chart approaches, the proposed system performs better in terms of recognition speed and can also accurately identify the type of unnatural CCP. Although a real-time CCP recognizer for the individuals (X) chart is the specific application presented here, the proposed hybrid methodology based on neural networks and decision trees can be applied to other control charts.
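To make the coarse/fine idea concrete, the following is a minimal sketch, not the author's four-layer architecture, features, or heuristic: a small MLP separates natural from unnatural synthetic patterns (coarse classification), and a decision tree then distinguishes between two illustrative unnatural types (trend vs. shift) from two simple window features (fine classification). All patterns, features, and parameters here are assumptions for illustration only.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

def make_window(kind, n=24):
    # synthetic control-chart windows: in-control noise, trend, or sustained shift
    noise = rng.normal(0, 1, n)
    if kind == "trend":
        return noise + 0.1 * np.arange(n)
    if kind == "shift":
        return noise + np.where(np.arange(n) >= n // 2, 2.0, 0.0)
    return noise  # "natural" in-control pattern

def features(w):
    # two illustrative features: least-squares slope and half-window mean difference
    slope = np.polyfit(np.arange(len(w)), w, 1)[0]
    half_diff = w[len(w) // 2:].mean() - w[:len(w) // 2].mean()
    return [slope, half_diff]

kinds = ["natural", "trend", "shift"]
X, y = [], []
for kind in kinds:
    for _ in range(200):
        X.append(features(make_window(kind)))
        y.append(kind)
X, y = np.array(X), np.array(y)

# coarse classification: natural vs. unnatural
coarse = MLPClassifier(hidden_layer_sizes=(8, 8), max_iter=2000, random_state=0)
coarse.fit(X, np.where(y == "natural", "natural", "unnatural"))

# fine classification: which unnatural type
mask = y != "natural"
fine = DecisionTreeClassifier(max_depth=3, random_state=0)
fine.fit(X[mask], y[mask])

w = features(make_window("trend"))
if coarse.predict([w])[0] == "unnatural":
    print("Unnatural pattern detected, type:", fine.predict([w])[0])
```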

Author(s):  
Arya Nugraha ◽  
Gatot Yudoko

As the frequency, severity, and costs of safety risks continue to challenge the mining industry, the company recognized that its existing safety analytics did not provide adequate information, as it had relied predominantly on collecting and evaluating aggregated lagging-indicator data about past accidents. This approach drove the organization into a repetitive cycle of accident analysis and problem solving, and therefore into reactive responses. This paper investigated how statistical process control, in particular control charts, can be applied to hazard data, as a leading indicator of accidents, to statistically detect trends in the safety process and safety behavior, with the aim of controlling the safety process in real time before accidents occur. The results showed that the latest iteration of control-limit development, in Phase 3, is suitable as the control chart for the safety process at the case-study mine operation site. Furthermore, applying control charts to hazard data not only helps the organization transition its safety analytics to leading-indicator analysis, it also enables the organization to control the safety process in real time and to carry out timely safety interventions long before the potential occurrence of severe accidents; in this case, the first early-warning signal was triggered 49 days before the fatality accident occurred.
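The abstract does not describe the mathematics behind its "Phase 3" control limits, so the following is only a hedged sketch of one common approach for event-count data: a c-chart over hazard reports per period, with points outside the limits treated as early-warning signals. The chart choice and the counts below are assumptions for illustration, not the paper's data or procedure.

```python
import math

# hypothetical hazard reports per week (illustrative only)
hazard_counts = [12, 15, 11, 14, 13, 16, 12, 25, 14, 13]

c_bar = sum(hazard_counts) / len(hazard_counts)       # center line
ucl = c_bar + 3 * math.sqrt(c_bar)                     # upper control limit
lcl = max(c_bar - 3 * math.sqrt(c_bar), 0.0)           # lower limit, floored at zero

# periods whose hazard count falls outside the limits act as early-warning signals
signals = [i for i, c in enumerate(hazard_counts) if c > ucl or c < lcl]
print(f"CL={c_bar:.1f}, LCL={lcl:.1f}, UCL={ucl:.1f}, signals at periods {signals}")
```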


Author(s):  
Mario Lesina ◽  
Lovorka Gotal Dmitrovic

The paper examines the relationship among the numbers of small, medium and large companies in the leather and footwear industry in Croatia, as well as the relationship among the numbers of their employees, by means of the Spearman and Pearson correlation coefficients. The data were collected over 21 years. The warning zone and the risk zone were determined by means of Statistical Process Control (SPC) for given numbers of small, medium and large companies in the leather and footwear industry in Croatia. Growth models based on externalities, models based on research and development, and AK models were applied to analyze the research results. Using the correlation coefficients, the paper shows that the relationship between the number of large companies and their number of employees is the strongest, i.e. large companies have the best-structured workplaces. The relationship between the number of medium companies and the number of their employees is somewhat weaker, while there is no relationship in small companies. This is best described by growth models based on externalities, in which growth generates an increase in human capital, i.e. growth in the level of knowledge and skills in the economy as a whole, and hence also in companies at the microeconomic level. These models also recognize a limit of accumulated knowledge beyond which growth may be expected. The absence of growth in small companies results from an insufficient level of human capital and a failure to reach the limit level that could generate growth. According to Statistical Process Control (SPC) control charts, as well as regression models, it is clear that the most cost-effective investment is investment in medium companies. The paper demonstrates the disadvantages of small, medium and large companies in the leather and footwear industry in Croatia. Small companies often emerge too quickly and disappear too easily owing to the employment of administrative staff instead of professional production staff. As the models emphasize, companies need to invest in their employees and employ good production staff. Investment in and support for medium companies not only strengthens companies that have a well-arranged technological process and good systematization of workplaces, but also helps large companies, as there is a strong correlation between the numbers of medium and large companies.
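As an illustration of the correlation step described above, the following sketch computes Pearson and Spearman coefficients with scipy. The two annual series are hypothetical stand-ins for the 21 yearly observations, not the paper's data.

```python
from scipy.stats import pearsonr, spearmanr

# hypothetical 21-year series: number of medium companies and their employees
medium_companies = [30, 32, 31, 35, 36, 38, 37, 40, 41, 43,
                    44, 45, 44, 46, 48, 47, 49, 50, 52, 53, 55]
medium_employees = [2100, 2150, 2120, 2300, 2350, 2500, 2450, 2600, 2650, 2750,
                    2800, 2850, 2820, 2900, 3000, 2950, 3100, 3150, 3250, 3300, 3400]

r, p_r = pearsonr(medium_companies, medium_employees)        # linear association
rho, p_rho = spearmanr(medium_companies, medium_employees)   # rank (monotonic) association
print(f"Pearson r = {r:.2f} (p = {p_r:.3f}), Spearman rho = {rho:.2f} (p = {p_rho:.3f})")
```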


2020 ◽  
Vol 79 (Suppl 1) ◽  
pp. 1855.2-1855
Author(s):  
M. Stevens ◽  
N. Proudlove ◽  
J. Ball ◽  
C. Scott

Background: Pathology test turnaround times (TATs) are a limiting factor in patient flow through rheumatology services. Quality improvement (QI) methodologies such as Lean use tools including statistical process control (SPC) and process mapping to study the performance of the whole of a clinical pipeline, expose unnecessary complexity (non-value-adding activity), and streamline processes and staff roles.
Objectives: Understand the effects of changes made to the CTD testing algorithm over the last 12 years by measuring some of the effects on TATs. Model current processes and suggest changes to workflow to improve TAT.
Methods: High-level flow diagrams of the current testing algorithm, and low-level process maps of analyser and staff processes, were drawn. Activity and TATs (working days between report and booking date) for ANA, ENA, DNA and CCP tests were plotted as XmR control charts.
Results:
Finding 1: The largest referral laboratory does not currently operate a separate DNA monitoring workstream, resulting in unnecessary ANA and ENA testing (figure 1).
Figure 1. Current testing strategy (left) and suggested improvement (right).
Finding 2: Samples are handed off between 3 different lab benches, each of which may be staffed by a different staff member on a different day, and results processing involves handoff to a further 2 different staff members.
Finding 3: ANA demand is close to capacity; ENA demand exceeds current capacity (table 1).
Table 1. Demand for ANA, ENA and DNA tests, compared to capacity
Test | Median demand (tests/day) | Approx. capacity (tests/day) | Notes
ANA | 74 | 100 | Close to 80% recommended by the ILGs
ENA | 38 | 36* | Less capacity than demand!
DNA | 34 | 100 | Plenty
Finding 4: Stopping the screening of DNA requests on ANA result increased the number of DNA tests performed by about 10 samples per day (30%), but decreased turnaround time by a similar proportion (3.3 to 2.3 days, figure 2). It also reduced turnaround times of ANA and ENA tests.
Figure 2. Control chart of average TAT of dsDNA antibodies by request date.
Conclusion: Typically for a QI project, the initially simple CTD testing pipeline has accumulated many changes made without consideration of whole-system performance, and is now a struggle to run. Improvement ideas to be explored from this work include:
- Liaising with the main referral lab to develop a DNA monitoring workstream to reduce unnecessary ANA and ENA testing
- Reducing handoffs, the sample journey around lab analysers, and staff hands-on time by changing the ANA test methodology to the same as DNA, and by creating new staff roles (analyser operators to perform validation/authorisation steps)
- Creating more capacity for ENA testing by increasing the frequency of this test on the weekly rota
- Creating more capacity for service expansion by running analysers at weekends (staff consultation required)
- Reducing demand on the service by engaging and educating requestors
- Improving TAT for DNA by processing samples the day they are booked in, instead of 1 day later, and by auto-validating runs
- Using control charts to measure improvement
Disclosure of Interests: None declared
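A minimal sketch of the charting step described in the Methods, under assumptions: the booking/report dates below are hypothetical and the code is not the authors'. It derives TAT in working days and computes XmR (individuals) chart limits of the kind used to monitor dsDNA TAT by request date.

```python
import numpy as np

# hypothetical booking and report dates (illustrative only)
booked   = np.array(['2019-11-04', '2019-11-05', '2019-11-06', '2019-11-07'], dtype='datetime64[D]')
reported = np.array(['2019-11-07', '2019-11-08', '2019-11-08', '2019-11-12'], dtype='datetime64[D]')

tat = np.busday_count(booked, reported)      # working days between booking and report
mr = np.abs(np.diff(tat.astype(float)))      # moving ranges between consecutive TATs
cl = tat.mean()
ucl = cl + 2.66 * mr.mean()                  # 2.66 = 3 / d2 for a moving range of 2
lcl = max(cl - 2.66 * mr.mean(), 0.0)        # TAT cannot be negative
print(f"Mean TAT = {cl:.1f} days, limits = [{lcl:.1f}, {ucl:.1f}]")
```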


2013 ◽  
Vol 845 ◽  
pp. 696-700
Author(s):  
Razieh Haghighati ◽  
Adnan Hassan

Traditional statistical process control (SPC) charting techniques were developed to monitor process status and help identify assignable causes. Unnatural patterns in the process are recognized by means of control chart pattern recognition (CCPR) techniques. There is a broad body of work in the CCPR domain; however, given growing doubts concerning the performance of control charts in the presence of constrained data, this aspect has been overlooked in the literature. This paper reports preliminary work to develop a fault-tolerant CCPR scheme that is capable of (i) detecting constrained data that are sampled in a misaligned, uneven fashion and/or are partly lost or unavailable, and (ii) accommodating the system in order to improve the reliability of recognition.
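The abstract does not specify the fault-tolerance mechanism, so the following is only a speculative sketch of the kind of pre-processing such a scheme might perform: flagging missing observations in a recognition window and filling them by linear interpolation before pattern recognition is applied. The function name and the window values are illustrative assumptions.

```python
import numpy as np

def repair_window(window):
    """Return (repaired_window, fraction_missing) for a 1-D array containing NaNs."""
    window = np.asarray(window, dtype=float)
    missing = np.isnan(window)
    if missing.any():
        idx = np.arange(len(window))
        window = window.copy()
        # fill lost/unavailable points by linear interpolation over the observed ones
        window[missing] = np.interp(idx[missing], idx[~missing], window[~missing])
    return window, missing.mean()

raw = [0.1, np.nan, 0.4, 0.2, np.nan, 0.9, 1.1, 1.3]   # hypothetical window with lost points
repaired, frac = repair_window(raw)
print(f"{frac:.0%} of points were missing; repaired window: {np.round(repaired, 2)}")
```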

