Statistics Decomposition and Monitoring in Original Variable Space

Author(s):  
Jing Wang ◽  
Jinglin Zhou ◽  
Xiaolu Chen

Abstract The traditional process monitoring method first projects the measured process data into the principal component subspace (PCS) and the residual subspace (RS), and then calculates the $$\mathrm{T}^2$$ and $$\mathrm{SPE}$$ statistics to detect abnormality. However, the abnormality detected by these two statistics is expressed in terms of the principal components of the process. Principal components have no specific physical meaning and do not contribute directly to identifying the fault variable and its root cause. Researchers have proposed many methods to identify the fault variable accurately based on the projection space; the most popular is the contribution plot, which measures the contribution of each process variable to the principal components (Wang et al. 2017; Luo et al. 2017; Liu and Chen 2014). Moreover, in order to determine the control limits of the two statistics, their probability distributions must be estimated or assumed to follow a specific form. Fault identification by these statistics is not intuitive enough to directly reflect the role and trend of each variable when the process changes.
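For reference, the projection-based monitoring that this abstract criticizes can be summarized in a minimal sketch: PCA is fitted on normal operating data, and the $$\mathrm{T}^2$$ and $$\mathrm{SPE}$$ statistics are computed for new observations. The variable names and the eigendecomposition-based PCA are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def pca_monitoring_stats(X_train, X_test, n_components):
    """Conventional T^2 and SPE statistics from a PCA model.

    X_train : (n, m) normal operating data used to fit the model
    X_test  : (k, m) new observations to be monitored
    """
    # Standardize with the training mean and standard deviation
    mu, sigma = X_train.mean(axis=0), X_train.std(axis=0)
    Xs = (X_train - mu) / sigma

    # Principal directions from the covariance eigendecomposition
    cov = np.cov(Xs, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)
    order = np.argsort(eigvals)[::-1]
    P = eigvecs[:, order[:n_components]]      # loadings spanning the PCS
    lam = eigvals[order[:n_components]]       # retained eigenvalues

    Xt = (X_test - mu) / sigma
    scores = Xt @ P                           # projection into the PCS
    residual = Xt - scores @ P.T              # part left in the RS

    T2 = np.sum(scores**2 / lam, axis=1)      # Hotelling T^2
    SPE = np.sum(residual**2, axis=1)         # squared prediction error (Q)
    return T2, SPE
```

Both statistics are aggregates over the latent projection, which is exactly why they do not point directly at the faulty physical variable.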

Author(s):  
Robert Andrews ◽  
Fahame Emamjome ◽  
Arthur H.M. ter Hofstede ◽  
Hajo A. Reijers

2020 ◽  
Vol 1 (1) ◽  
pp. 9-16
Author(s):  
O. L. Aako ◽  
J. A. Adewara ◽  
K. S Adekeye ◽  
E. B. Nkemnole

The fundamental assumption of variable control charts is that the data are normally distributed and spread randomly about the mean. Process data are not always normally distributed, hence there is a need to set up appropriate control charts that give accurate control limits for monitoring skewed processes. In this study, Shewhart-type control charts were developed for monitoring positively skewed data assumed to follow the Marshall-Olkin Inverse Loglogistic Distribution (MOILLD). The Average Run Length (ARL) and Control Limits Interval (CLI) were adopted to assess the stability and performance of the MOILLD control chart. The results obtained were compared with Classical Shewhart (CS) and Skewness Correction (SC) control charts using the ARL and CLI. It was discovered that the control charts based on the MOILLD performed better and were more stable compared to the CS and SC control charts. It is therefore recommended that, for positively skewed data, a control chart based on the Marshall-Olkin Inverse Loglogistic Distribution will be more appropriate.
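To make the ARL comparison concrete, the following simulation sketch estimates the in-control ARL of an individuals chart applied to positively skewed data. It uses generic 3-sigma limits and a lognormal sample distribution purely as stand-ins; the MOILLD quantile-based limits developed in the study are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

def shewhart_arl(sample_dist, lcl, ucl, n_runs=5000, max_len=10000):
    """Estimate the average run length of an individuals chart by simulation."""
    run_lengths = []
    for _ in range(n_runs):
        for t in range(1, max_len + 1):
            x = sample_dist()
            if x < lcl or x > ucl:      # out-of-control signal
                run_lengths.append(t)
                break
        else:                           # no signal within max_len samples
            run_lengths.append(max_len)
    return np.mean(run_lengths)

# Classical 3-sigma limits applied to positively skewed (lognormal) data:
# the in-control ARL drops far below the nominal value for normal data.
mu = np.exp(0.5)                                   # mean of lognormal(0, 1)
sigma = np.sqrt((np.exp(1) - 1) * np.exp(1))       # std of lognormal(0, 1)
arl = shewhart_arl(lambda: rng.lognormal(0.0, 1.0),
                   lcl=mu - 3 * sigma, ucl=mu + 3 * sigma)
print(f"estimated in-control ARL: {arl:.0f}")
```

The inflated false-alarm rate seen in such a simulation is the motivation for distribution-specific limits like those derived from the MOILLD.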


2020 ◽  
Vol 59 (23) ◽  
pp. 10987-10999 ◽  
Author(s):  
Pallavi Kumari ◽  
Dongheon Lee ◽  
Qingsheng Wang ◽  
M. Nazmul Karim ◽  
Joseph Sang-Il Kwon

Author(s):  
Fan Yang ◽  
Sirish Shah ◽  
Deyun Xiao

Signed directed graph based modeling and its validation from process knowledge and process data

This paper is concerned with the fusion of information from process data and process connectivity and its subsequent use in fault diagnosis and process hazard assessment. The Signed Directed Graph (SDG) is a graphical model that captures process topology and connectivity and shows the causal relationships between process variables through material and information paths; it has been widely used in root cause and hazard propagation analysis. An SDG is usually built from process knowledge as described by piping and instrumentation diagrams. This is a complex and experience-dependent task, and therefore the resulting SDG should be validated against process data before being used for analysis. This paper introduces two validation methods: one based on cross-correlation analysis of process data with assumed time delays, and the other based on transfer entropy, in which the correlation coefficient between two variables or the information transfer from one variable to another is computed to validate the corresponding paths in the SDG. In addition, the relationships captured by data-based methods should also be validated against process knowledge to confirm their causality; this can be done by checking the reachability, or influence, of one variable on another in the corresponding SDG, which is the basis of causality. A case study of an industrial process is presented to illustrate the application of the proposed methods.
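A minimal sketch of the cross-correlation validation step is shown below. It scans candidate time delays and reports the strongest lagged correlation for a directed path x → y; the delay search and the synthetic example data are illustrative assumptions, and the paper's transfer-entropy estimator is not reproduced.

```python
import numpy as np

def lagged_correlation(x, y, max_delay):
    """Correlate x(t) with y(t + d) for candidate delays d >= 0.

    A strong peak at some delay d supports the SDG path x -> y;
    the sign of the peak corresponds to the sign of the arc.
    """
    best_delay, best_corr = 0, 0.0
    for d in range(max_delay + 1):
        if d == 0:
            r = np.corrcoef(x, y)[0, 1]
        else:
            r = np.corrcoef(x[:-d], y[d:])[0, 1]
        if abs(r) > abs(best_corr):
            best_delay, best_corr = d, r
    return best_delay, best_corr

# Synthetic example: y lags x by 5 samples, so the path x -> y is supported.
t = np.arange(2000)
x = np.sin(0.02 * t) + 0.1 * np.random.default_rng(1).standard_normal(t.size)
y = np.roll(x, 5) + 0.1 * np.random.default_rng(2).standard_normal(t.size)
print(lagged_correlation(x, y, max_delay=20))
```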


Author(s):  
Tarek Iraki ◽  
Norbert Link

Abstract Variations of dedicated process conditions (such as workpiece and tool properties) yield different process state evolutions, which are reflected in different time series of the observable quantities (process curves). A novel method is presented which, first, extracts the statistical influence of these conditions on the process curves and represents it via generative models and, second, represents their influence on the ensemble of curves through transformations of the representation space. A latent variable space is derived from sampled process data, which represents the curves with only a few features. Generative models are formed based on conditional probability functions estimated in this space. Furthermore, the influence of conditions on the ensemble of process curves is represented by estimated transformations of the feature space, which map the process curve densities under different conditions onto each other. The latent space is formed via multi-task learning of an auto-encoder and condition detectors; the latter classify the latent space representations of the process curves into the considered conditions. The Bayes framework and the multi-task learning models are used to obtain the process curve probability densities from the latent space densities. The methods are shown to reveal and represent the influence of combinations of workpiece and tool properties on resistance spot welding process curves.
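A minimal sketch of such a multi-task setup is given below: an auto-encoder learns a low-dimensional latent representation of the process curves while a condition-detector head classifies that representation, and both losses shape the latent space jointly. Layer sizes, curve length, the number of conditions and the loss weighting are illustrative assumptions, not the authors' architecture.

```python
import torch
import torch.nn as nn

class MultiTaskAutoencoder(nn.Module):
    """Auto-encoder whose latent space is shaped jointly by curve
    reconstruction and by classification of the process conditions."""

    def __init__(self, curve_len=200, latent_dim=4, n_conditions=6):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(curve_len, 64), nn.ReLU(),
            nn.Linear(64, latent_dim),
        )
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 64), nn.ReLU(),
            nn.Linear(64, curve_len),
        )
        self.condition_head = nn.Linear(latent_dim, n_conditions)

    def forward(self, x):
        z = self.encoder(x)
        return self.decoder(z), self.condition_head(z), z

def multitask_loss(x, cond_labels, model, alpha=1.0):
    """Weighted sum of reconstruction error and condition-detection loss."""
    recon, logits, _ = model(x)
    return (nn.functional.mse_loss(recon, x)
            + alpha * nn.functional.cross_entropy(logits, cond_labels))
```

Densities estimated over the resulting latent features can then serve as the generative models of the curves under each condition.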


Author(s):  
Tekin Uyan ◽  
Kalle Jalava ◽  
Juhani Orkas ◽  
Kevin Otto

Abstract Statistical quality control is applied in factories and foundries to identify special-cause defects and to identify root causes through statistical correlation of process input variations with defects. A difficulty arises in associating the process data collected with individual cast parts as they are worked through the foundry and out into the supply chain. Typically, alphanumeric labels for marking castings and manual identification of the castings with route-paper-based tracing approaches have been used. Such manual systems make root cause analysis of quality defect issues tedious. Here we develop a semi-automated approach using 3D-printed sand mold inserts shaped as 2D matrix codes, which permit the identification code to be cast directly into the parts. This enables automated part tracking from the very beginning of the casting process, including mold making. Automated scan-based tracking of parts through a foundry and the subsequent supply chain allows the statistical process data collected to be associated with each part processed with a unique identification.


Author(s):  
Peter S. Jackson ◽  
Andreas Fabricius ◽  
Alexandria Wholey

Abstract The root cause of a series of similar failures in SA-213 T91 superheater tubes of a Heat Recovery Steam Generator (HRSG) is investigated using a combination of engineering analysis and review of process data. The HRSG at the Combined Cycle Gas Turbine (CCGT) power plant in question had suffered from frequent tube-to-header fatigue failures over the past 10 years. Metallurgical analyses had never identified any sign of creep damage in, or near, any of the failure locations. Recently, the Gas Turbine (GT) exhaust gas flow pattern upstream of the superheater tubes changed slightly. Subsequently, there were a large number of HPSH tube-to-header failures (> 10) on one side of the gas duct. Metallurgical analysis showed that the tube-to-header welds failed by creep-fatigue damage; analyses of tubes from the left-hand side of the boiler did not show any signs of similar damage. Further investigation confirmed that the root cause was higher temperatures resulting from small changes in the GT outlet flow pattern.


2021 ◽  
Author(s):  
Catherine, Ye Tang ◽  
Kok Liang Tan ◽  
Latief Riyanto ◽  
Fuziana Tusimin ◽  
Nik Fazril Sapian ◽  
...  

Abstract Well#1 was completed as a horizontal oil producer with Openhole Stand-Alone Sandscreens (OHSAS) across a thin reservoir with an average thickness of 20 ft in Field B. The first Autonomous Inflow Control Device (AICD) in PETRONAS was installed to ensure balanced contribution across horizontal zones with permeability contrasts and to prevent early water and gas breakthrough. Integrated real-time reservoir mapping-while-drilling technology for well placement optimization, combined with an industry-leading inflow control simulator for AICD placement, was adopted. The early well tests post-drilling showed promising results, with a production rate double the expected rate, no sand production, low water cut and a lower Gas to Oil Ratio (GOR). The Reservoir Management Plan (RMP) for this oil rim requires continuous gas injection into the gas cap and water injection into the aquifer. However, due to low gas injection uptime caused by prolonged injection facilities constraints, the well's water cut continued to increase steadily from 0% to 80% within a year of production despite prudent surveillance and control of production during the injector's downtime. After the gas injection performance had improved, the well was beaned up as part of oil rim management for withdrawal balancing. Unfortunately, a month later, the production rate showed a sudden spike with significantly low wellhead pressure, followed by a hairline leak on its choke valve and a leak at the Crude Oil Transfer Pump (COTP) recycle line. Sand analysis by particle size distribution (PSD) confirmed OHSAS failure, while the high gas rate from well test results confirmed AICD failure. A multidisciplinary investigation team was immediately formed to determine the root cause of the failure event. The Root Cause Failure Analysis (RCFA) method was adopted to determine the causes of the failures, including reanalysis of the OHSAS and AICD completion design. The well operating strategy was also reviewed thoroughly by utilizing the well parameter trends provided in the Exceptional Based Surveillance (EBS) Process Information (PI) ProcessBook. The thorough RCFA concluded that frequent platform interruptions and improper well start-up practices created abrupt pressure changes in the wellbore, which likely destabilized the natural sand pack around the OHSAS and created frequent bursts of sand influx across the AICDs. Operating a high gas-oil ratio (GOR), high-water-cut, sand-prone well without a pre-determined AICD sand erosion tolerance envelope also likely contributed to the failure of the AICDs. The delay in detecting the OHSAS failure in Well#1, due to an ineffective sand monitoring method, resulted in severe sand production that caused severe leaks at its choke valve and the COTP recycle line.


Entropy ◽  
2020 ◽  
Vol 22 (3) ◽  
pp. 357 ◽  
Author(s):  
Nicholas Carrara ◽  
Kevin Vanslette

Using first principles from inference, we design a set of functionals for the purpose of ranking joint probability distributions with respect to their correlations. Starting with a general functional, we impose its desired behavior through the Principle of Constant Correlations (PCC), which constrains the correlation functional to behave in a consistent way under statistically independent inferential transformations. The PCC guides us in choosing the appropriate design criteria for constructing the desired functionals. Since the derivations depend on a choice of partitioning the variable space into n disjoint subspaces, the general functional we design is the n-partite information (NPI), of which the total correlation and mutual information are special cases. Thus, these functionals are found to be uniquely capable of determining whether a certain class of inferential transformations, $$\rho \xrightarrow{*} \rho'$$, preserve, destroy or create correlations. This provides conceptual clarity by ruling out other possible global correlation quantifiers. Finally, the derivation and results allow us to quantify non-binary notions of statistical sufficiency. Our results express what percentage of the correlations is preserved under a given inferential transformation or variable mapping.
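As a worked illustration of the special cases mentioned above, the short sketch below computes the total correlation of a discrete joint distribution given as a NumPy array; for two variables this quantity reduces to the ordinary mutual information. The example distribution is an assumption chosen only to make the result easy to verify.

```python
import numpy as np

def total_correlation(p_joint):
    """Total correlation: sum of marginal entropies minus the joint entropy.
    For a bivariate distribution this equals the mutual information."""
    p_joint = p_joint / p_joint.sum()
    nz = p_joint[p_joint > 0]
    h_joint = -np.sum(nz * np.log2(nz))
    h_marginals = 0.0
    for axis in range(p_joint.ndim):
        other = tuple(a for a in range(p_joint.ndim) if a != axis)
        p_m = p_joint.sum(axis=other)          # marginal of this variable
        p_m = p_m[p_m > 0]
        h_marginals += -np.sum(p_m * np.log2(p_m))
    return h_marginals - h_joint

# Two perfectly correlated binary variables share exactly one bit.
p_xy = np.array([[0.5, 0.0],
                 [0.0, 0.5]])
print(total_correlation(p_xy))   # 1.0
```

Comparing such a functional before and after a transformation of the distribution is one way to express, in percentage terms, how much of the correlation the transformation preserves.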

