Simplifying the Six Sigma Toolbox through Application of Shainin DOE Techniques

2009 ◽  
Vol 34 (1) ◽  
pp. 13-30 ◽  
Author(s):  
Sunil Sharma ◽  
Anuradha R Chetiya

The success of a Six Sigma programme in an organization depends to a large extent on the success of the Six Sigma projects, which in turn depends on how the team handles the problem and whether the right combination of tools is applied to address the root cause. The Six Sigma toolbox consists of a wide range of tools, comprising, on the one hand, simple and commonly used tools such as flow charts, Pareto analysis, and cause-and-effect diagrams and, on the other, more advanced statistical tools such as design of experiments, regression analysis, and many more. While the simple tools are easy to apply, understand, and analyse, engineers perceive the more advanced tools as difficult to comprehend. Design of experiments (DOE) is one such tool. Two well-known approaches to design of experiments are classical DOE, pioneered by Sir Ronald A. Fisher, and the Taguchi approach, pioneered by Dr Genichi Taguchi. A third approach to experimental design, the Shainin DOE techniques offered by Dr Dorian Shainin, can be considered a very good alternative to the other approaches. They are much simpler than the factorial designs, response surface designs, and orthogonal arrays of the conventional approaches to DOE, but at the same time are recognized as being very powerful and effective in solving the chronic quality problems that plague most manufacturers. Shainin DOE works by eliminating suspected process variables, mostly using seven tools: Multi-Vari charts, component search, paired comparison, variable search, full factorials, B vs. C (Better vs. Current) analysis, and scatter plots (realistic tolerance parallelogram plots). Though not very well documented, these tools have proved to be key drivers in the success of many companies, e.g., Motorola. This article examines two projects of a leading automotive and general lighting lamp manufacturing company in which a combination of standard Six Sigma tools and Shainin tools has been successfully used to address the root cause of the problems. The advantages of using Shainin tools are that very small sample sizes are required to analyse the problem (often samples as small as 2 or 3 are enough to draw statistically valid conclusions); statistical software is not required to analyse the data, and Shainin DOE does not even require knowledge of complex statistical tools; and it involves employees at all levels, including workers and junior staff, in problem solving that was hitherto the domain of senior technical experts. Also, the success of the projects had a very positive effect on the morale of the employees by convincing them that Six Sigma is not all about using complex statistical tools.
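
To make the small-sample logic concrete, here is a minimal Python sketch of a Shainin-style B vs. C comparison; the lamp-life readings and function name are illustrative assumptions, not data or code from the article's projects. With three "Better" and three "Current" units, complete separation of the two groups happens by chance only 1 time in 20, which is why samples of 2 or 3 can support statistically valid conclusions.

```python
from math import comb

def b_vs_c_no_overlap_test(better, current, higher_is_better=True):
    """Shainin-style B vs. C comparison for very small samples.

    Declares a significant improvement only if every 'Better' (B) unit
    outperforms every 'Current' (C) unit.  Under the null hypothesis of
    no real difference, that complete separation occurs with probability
    1 / C(nB + nC, nB); for 3 B vs. 3 C units this is 1/20 = 0.05,
    i.e. roughly 95% confidence.
    """
    if not higher_is_better:
        better = [-x for x in better]
        current = [-x for x in current]
    separated = min(better) > max(current)
    alpha = 1 / comb(len(better) + len(current), len(better))
    return separated, alpha

# Hypothetical lamp-life readings (hours) for 3 improved and 3 current units.
improved = [1180, 1225, 1240]
current = [1050, 1095, 1110]
sig, alpha = b_vs_c_no_overlap_test(improved, current)
print(f"Improvement significant: {sig} (risk of a false call: {alpha:.3f})")
```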

Author(s):  
Carmen Huerga Castro ◽  
Julio Ignacio Abad González ◽  
Pilar Blanco Alonso

The Six Sigma methodology is a continuous quality improvement programme that, on the basis of facts and data, seeks to reduce errors and move towards high quality objectives. It offers a structured, analytical, and rational approach to establishing improvement projects in line with the stated objectives. Although the popularity of Six Sigma derives from its application to production processes in the industrial sector, its application is increasingly widespread in the services sector and hence in health care services, where "customer satisfaction" takes on vital relevance. Applying Six Sigma requires the use of a wide range of statistical tools; in fact, the term sigma denotes the standard deviation of a distribution and is the key factor for understanding its variability. Accordingly, this paper identifies the statistical tools most suitable for each phase of Six Sigma implementation (define, measure, analyse, improve, and control) and presents a proposed application in a health care service.
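
As a minimal illustration of the sigma metric mentioned above, the following Python sketch converts an observed defect rate into DPMO and a sigma level; the claim counts and the conventional 1.5-sigma shift are assumptions for illustration, not figures from the paper.

```python
from statistics import NormalDist

def sigma_level(defects, opportunities, shift=1.5):
    """Convert a defect count into DPMO and a sigma level.

    Uses the conventional 1.5-sigma long-term shift, so a 3.4 DPMO
    process reports as roughly 6 sigma.
    """
    dpmo = defects / opportunities * 1_000_000
    yield_fraction = 1 - defects / opportunities
    return dpmo, NormalDist().inv_cdf(yield_fraction) + shift

# Hypothetical example: 18 billing errors in 12,500 processed claims.
dpmo, sigma = sigma_level(defects=18, opportunities=12_500)
print(f"DPMO = {dpmo:.0f}, sigma level ≈ {sigma:.2f}")
```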


A. C. Atkinson (Imperial College, London, U.K.). A major theme of this meeting has been the necessity of considering the joint action of several factors. Experimental methods in which one factor is changed at a time have been shown to fail because of the frequent occurrence of important interactions between factors. It is therefore particularly disturbing that Dr Ballard’s paper on reliability in nuclear plants is confined to consideration of the failure of components in isolation. Since the failure of one component can drastically change the environment in which the other components work, one failure may trigger a chain of failures. The accident at Three Mile Island illustrates this. Of course, the compound probability of failure is still found by multiplying probabilities together, but these are conditional rather than unconditional, and they can be very different. Will Dr Ballard please comment? I now turn to direct consideration of the design of experiments. The basic statistical tools for the design of multifactor experiments with quantitative variables have been available for 30 years. One continuing development since then has been the increasing use of computers in both design and analysis.
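
The point about interactions can be made concrete with a minimal Python sketch of a 2×2 full factorial in coded units (the response values are invented for illustration): the AB interaction estimated below is exactly the quantity a one-factor-at-a-time experiment cannot see.

```python
import numpy as np

# A 2x2 full factorial in coded units (-1/+1 for each factor).
# One-factor-at-a-time experimentation never varies A and B together,
# so it cannot estimate the AB interaction computed below.
A = np.array([-1, +1, -1, +1])
B = np.array([-1, -1, +1, +1])
y = np.array([52.0, 61.0, 55.0, 80.0])   # hypothetical responses

main_A = (y @ A) / 2          # average effect of moving A from -1 to +1
main_B = (y @ B) / 2
interaction_AB = (y @ (A * B)) / 2

print(f"Main effect A:  {main_A:.1f}")
print(f"Main effect B:  {main_B:.1f}")
print(f"AB interaction: {interaction_AB:.1f}")
```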


2019 ◽  
pp. 40-46 ◽  
Author(s):  
V.V. Savchenko ◽  
A.V. Savchenko

We consider the task of automated quality control of sound recordings containing voice samples of individuals. It is shown that the most acute problem in this task is the small sample size. In order to overcome this problem, we propose a novel method of acoustic measurement based on the relative stability of the pitch frequency within a voice sample of short duration. An example of its practical implementation using an inter-periodic accumulation of a speech signal is considered. An experimental study with specially developed software provides statistical estimates of the effectiveness of the proposed method in noisy environments. It is shown that this method rejects the audio recording as unsuitable for voice biometric identification with a probability of 0.95 or more for a signal-to-noise ratio below 15 dB. The obtained results are intended for use in developing new, and modifying existing, systems for the collection and automated quality control of biometric personal data. The article is intended for a wide range of specialists in the field of acoustic measurements and digital processing of speech signals, as well as for practitioners who organize the work of authorized organizations in preparing samples of biometric personal data for registration.
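
As an illustration only, the following Python sketch shows one way a pitch-stability quality check of this kind could look; the autocorrelation pitch tracker, the 40 ms frames, and the 5% relative-spread threshold are assumptions of the sketch, not the authors' published method.

```python
import numpy as np

def frame_pitch(frame, fs, fmin=60.0, fmax=400.0):
    """Crude autocorrelation pitch estimate for one frame, in Hz."""
    frame = frame - frame.mean()
    ac = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
    lo, hi = int(fs / fmax), int(fs / fmin)
    lag = lo + np.argmax(ac[lo:hi])
    return fs / lag

def pitch_is_stable(signal, fs, frame_ms=40, max_rel_spread=0.05):
    """Accept the recording only if pitch varies little across frames."""
    n = int(fs * frame_ms / 1000)
    frames = [signal[i:i + n] for i in range(0, len(signal) - n, n)]
    pitches = np.array([frame_pitch(f, fs) for f in frames])
    return np.std(pitches) / np.mean(pitches) <= max_rel_spread

# Hypothetical check on a synthetic 150 Hz tone with mild noise.
fs = 16000
t = np.arange(0, 1.0, 1 / fs)
voice = np.sin(2 * np.pi * 150 * t) + 0.05 * np.random.randn(len(t))
print("Sample accepted:", pitch_is_stable(voice, fs))
```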


2020 ◽  
pp. 1192-1198
Author(s):  
M.S. Mohammad ◽  
Tibebe Tesfaye ◽  
Kim Ki-Seong

Ultrasonic thickness gauges are easy to operate and reliable, and can be used to measure a wide range of thicknesses and to inspect all engineering materials. Supplementing the simple ultrasonic thickness gauges that present results either as a digital readout or as an A-scan with systems that correlate the measured values with their positions on the inspected surface to produce a two-dimensional (2D) thickness representation can extend their benefits and provide a cost-effective alternative to expensive advanced C-scan machines. In previous work, the authors introduced a system for positioning and mapping the values measured by ultrasonic thickness gauges and flaw detectors (Tesfaye et al. 2019). The system is an alternative to systems that use mechanical scanners, encoders, and sophisticated UT machines. It used a camera to record the probe’s movement and a laser grid, projected by a laser pattern generator, to locate the probe on the inspected surface. In this paper, a novel system is proposed that can be applied to flat surfaces while overcoming the other limitations posed by the use of laser projection. The proposed system uses two video cameras, one to monitor the probe’s movement on the inspected surface and the other to capture the corresponding digital readout of the thickness gauge. The acquired images of the probe’s position and of the thickness gauge readout are processed to plot the measured data in a 2D color-coded map. The system is meant to be simpler and more effective than the previous development.
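
The final mapping step can be sketched as follows in Python with NumPy and Matplotlib; the grid resolution, the synthetic scan data, and the function name thickness_map are assumptions for illustration, not the authors' image-processing pipeline. Probe positions and the corresponding gauge readouts are binned into a grid of mean thickness and rendered as a 2D color-coded map.

```python
import numpy as np
import matplotlib.pyplot as plt

def thickness_map(xs, ys, readings, grid=(50, 50)):
    """Bin (x, y, thickness) samples into a 2D grid of mean thickness."""
    sums, xe, ye = np.histogram2d(xs, ys, bins=grid, weights=readings)
    counts, _, _ = np.histogram2d(xs, ys, bins=[xe, ye])
    with np.errstate(invalid="ignore"):
        return sums / counts, xe, ye   # NaN where no measurement fell

# Hypothetical scan: positions in mm from the probe-tracking camera,
# thickness in mm read off the gauge display by the second camera.
rng = np.random.default_rng(0)
xs, ys = rng.uniform(0, 200, 3000), rng.uniform(0, 100, 3000)
thk = 6.0 - 1.5 * np.exp(-((xs - 120) ** 2 + (ys - 40) ** 2) / 400)  # thinned spot

grid_mean, xe, ye = thickness_map(xs, ys, thk)
plt.pcolormesh(xe, ye, grid_mean.T, shading="auto")
plt.colorbar(label="thickness (mm)")
plt.xlabel("x (mm)"); plt.ylabel("y (mm)")
plt.title("2D color-coded thickness map")
plt.show()
```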


2020 ◽  
Vol 24 ◽  
Author(s):  
Bubun Banerjee ◽  
Gurpreet Kaur ◽  
Navdeep Kaur

Metal-free organocatalysts are becoming an important tool for the sustainable development of various bioactive heterocycles. On the other hand, during the last two decades, calix[n]arenes have been gaining considerable attention due to their wide range of applicability in the field of supramolecular chemistry. Recently, sulfonic acid functionalized calix[n]arenes have been employed as efficient alternative catalysts for the synthesis of various bioactive scaffolds. In this review we summarize the catalytic efficiency of p-sulfonic acid calix[n]arenes for the synthesis of diverse biologically promising scaffolds under various reaction conditions. No review available in the literature covers the catalytic applicability of p-sulfonic acid calix[n]arenes; we therefore strongly believe that this review will attract researchers who are interested in this fascinating class of organocatalysts.


Author(s):  
Rosalia Gonzales ◽  
Travis Mathewson ◽  
Jefferson Chin ◽  
Holly McKeith ◽  
Lane Milde ◽  
...  

Since the advent of modern-day screening collections in the early 2000s, various aspects of our knowledge of good handling practices have continued to evolve. Some early practices, however, continue to prevail due to the absence of defining data that would bust the myths of tradition. The lack of defining data leads to a gap between plate-based screeners, on the one hand, and compound sample handling groups, on the other, with the latter being the default party to blame when an assay goes awry. In this paper, we highlight recommended practices that ensure sample integrity and present myth-busting data that can help determine the root cause of an assay gone bad. We show how a strong and collaborative relationship between screening and sample handling groups is the better state, one that leads to the accomplishment of the common goal of finding breakthrough medicines.


2021 ◽  
Vol 15 (5) ◽  
pp. 1-32
Author(s):  
Quang-huy Duong ◽  
Heri Ramampiaro ◽  
Kjetil Nørvåg ◽  
Thu-lan Dam

Dense subregion (subgraph and subtensor) detection is a well-studied area with a wide range of applications, and numerous efficient approaches and algorithms have been proposed. Approximation approaches are commonly used for detecting dense subregions due to the complexity of the exact methods. Existing algorithms are generally efficient for dense subtensor and subgraph detection and can perform well in many applications. However, most existing works rely on the state-of-the-art greedy 2-approximation algorithm, which provides solutions with only a loose theoretical density guarantee. The main drawback of most of these algorithms is that they can estimate only one subtensor, or subgraph, at a time, with a low guarantee on its density. While some methods can, on the other hand, estimate multiple subtensors, they can give a guarantee on the density with respect to the input tensor only for the first estimated subtensor. We address these drawbacks by providing both a theoretical and a practical solution for estimating multiple dense subtensors in tensor data and by giving a higher lower bound on the density. In particular, we guarantee and prove a higher bound on the lower-bound density of the estimated subgraphs and subtensors. We also propose a novel approach showing that there are multiple dense subtensors with a density guarantee greater than the lower bound used in the state-of-the-art algorithms. We evaluate our approach with extensive experiments on several real-world datasets, which demonstrate its efficiency and feasibility.
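
For reference, the greedy 2-approximation baseline that the paper improves upon can be sketched for the graph case as a standard peeling procedure in Python (the tensor generalization discussed in the paper is not shown): the densest intermediate subgraph encountered while repeatedly removing a minimum-degree vertex has average-degree density at least half the optimum.

```python
from collections import defaultdict

def densest_subgraph_peeling(edges):
    """Greedy 2-approximation for the average-degree densest subgraph.

    Repeatedly removes a minimum-degree vertex and remembers the densest
    intermediate subgraph; its density is at least half the optimum.
    """
    adj = defaultdict(set)
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    nodes, m = set(adj), len(edges)
    best_density, best_nodes = 0.0, set(nodes)

    while nodes:
        density = m / len(nodes)
        if density > best_density:
            best_density, best_nodes = density, set(nodes)
        u = min(nodes, key=lambda x: len(adj[x]))   # min-degree vertex
        m -= len(adj[u])
        for v in adj[u]:
            adj[v].discard(u)
        del adj[u]
        nodes.remove(u)
    return best_nodes, best_density

# Toy example: a 4-clique plus a pendant path; the clique is returned.
edges = [(1, 2), (1, 3), (1, 4), (2, 3), (2, 4), (3, 4), (4, 5), (5, 6)]
print(densest_subgraph_peeling(edges))
```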


Sensors ◽  
2021 ◽  
Vol 21 (4) ◽  
pp. 1461
Author(s):  
Shun-Hsin Yu ◽  
Jen-Shuo Chang ◽  
Chia-Hung Dylan Tsai

This paper proposes an object classification method using a flexion glove and machine learning. The classification is performed based on the information obtained from a single grasp of a target object. The flexion glove is developed with five flex sensors mounted on five finger sleeves, and is used to measure the flexion of individual fingers while grasping an object. Flexion signals are divided into three phases: picking, holding, and releasing. Grasping features are extracted from the holding phase to train the support vector machine. Two sets of objects are prepared for the classification test. One is a printed-object set and the other is a daily-life object set. The printed-object set is for investigating grasping patterns for objects of specified shape and size, while the daily-life object set includes nine objects randomly chosen from daily life to demonstrate that the proposed method can identify a wide range of objects. According to the results, classification accuracies of 95.56% and 88.89% are achieved for the printed-object and daily-life object sets, respectively. A flexion glove that can perform object classification is successfully developed in this work, aimed at potential grasp-to-see applications such as visual impairment aids and recognition in dark spaces.
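
A minimal Python/scikit-learn sketch of the holding-phase feature extraction and SVM training described above follows; the per-finger mean-flexion features, the synthetic grasp data, and the RBF kernel settings are assumptions of the sketch, not the authors' exact configuration.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

def holding_phase_features(flexion_frames):
    """Average flexion of each of the 5 fingers over the holding phase."""
    return np.mean(flexion_frames, axis=0)

# Hypothetical dataset: 60 grasps, each reduced to a 5-value feature vector,
# with labels 0/1/2 for three object classes.
rng = np.random.default_rng(1)
X = np.vstack([
    rng.normal(loc=center, scale=0.05, size=(20, 5))
    for center in ([0.2, 0.3, 0.3, 0.3, 0.2],   # e.g. sphere-like grasp
                   [0.6, 0.7, 0.7, 0.6, 0.5],   # e.g. cylinder-like grasp
                   [0.4, 0.9, 0.9, 0.4, 0.3])   # e.g. thin-box grasp
])
y = np.repeat([0, 1, 2], 20)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=0)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
clf.fit(X_train, y_train)
print(f"Test accuracy: {clf.score(X_test, y_test):.2%}")
```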


Religions ◽  
2019 ◽  
Vol 10 (6) ◽  
pp. 389
Author(s):  
James Robert Brown

Religious notions have long played a role in epistemology. Theological thought experiments, in particular, have been effective in a wide range of situations in the sciences. Some of these are merely picturesque, others have been heuristically important, and still others, as I will argue, have played a role that could be called essential. I will illustrate the difference between heuristic and essential with two examples. One of these stems from the Newton–Leibniz debate over the nature of space and time; the other is a thought experiment of my own constructed with the aim of making a case for a more liberal view of evidence in mathematics.

