Diagnostic Classification Based on Nonlinear Representation and Filtering of Process Measurement Data

2015, Vol 16 (5), pp. 3000-3005
Author(s):
Hyun-Woo Cho

1981, Vol 36 (11), pp. 1849-1863
Author(s):
J.A. Romagnoli,
G. Stephanopoulos

Vestnik MGSU, 2013, pp. 134-142
Author(s):
A. V. Korgin,
V. A. Emel'yanov,
M. V. Ermakov,
K. L. Z. Zeid,
A. G. Krasochkin,
...

With the advent of digital video cameras, it has become possible to process measurement data on a computer, which has opened up a wide range of new tasks in digital signal processing. Examples of such tasks are perimeter surveillance and detection of a moving object, recognition of a moving object, and identification of an object. Research and development of algorithms for processing video data typically require time-consuming computational experiments on large volumes of model and real data. It is therefore important to organize the computational process correctly when modifying and testing such algorithms.


Author(s):  
Joseph Cohen,
Jun Ni

Abstract
Machine learning and other data-driven methods have developed at a prolific rate for industrial applications due to the advent of industrial big data. However, industrial datasets may not be especially well suited to supervised learning approaches, which require extensive domain knowledge to label datasets completely and accurately. To address these challenges, a semi-supervised learning approach is proposed that makes use of partially labeled subsets. The proposed methodology is applied to high-dimensional in-process measurement data, utilizing a convolutional autoencoder for unsupervised feature extraction. A multiclass extension for semi-supervised anomaly diagnosis is proposed that utilizes principal component analysis as the basis for anomaly scoring, and the proposed approach intersects the results of targeted one-against-all phases on partially labeled sets to classify faults. Experiments in a case study on semiconductor manufacturing measurement data explore the relationship between the extracted latent features and anomaly detection performance. The proposed algorithm achieves a true positive detection rate of over 90% with a false positive rate under 9% for both local and global anomaly types, while reducing the original input dimensionality by over 99%. In addition, the approach identifies positive samples that previously went undetected by human experts. These results are promising for the application of the proposed semi-supervised methodology in real industrial settings.
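The scoring step described in this abstract can be pictured with a minimal sketch, assuming the latent features have already been produced upstream by a convolutional autoencoder and that anomaly scoring uses a Hotelling-T²-style statistic on PCA scores with a percentile threshold; the arrays `latent_normal` and `latent_test`, the 99th-percentile cutoff, and the simple intersection of one-against-all flags are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (not the authors' code): PCA-based anomaly scoring on latent
# features, plus an intersection step over one-against-all phases. Feature
# extraction by a convolutional autoencoder is assumed to have happened
# upstream; `latent_normal` / `latent_test` are hypothetical arrays of shape
# (n_samples, n_latent_features).
import numpy as np
from sklearn.decomposition import PCA

def fit_pca_scorer(latent_normal, n_components=5):
    """Fit PCA on (assumed) normal samples; return a T^2-style scorer and cutoff."""
    pca = PCA(n_components=n_components).fit(latent_normal)
    comp_var = pca.transform(latent_normal).var(axis=0) + 1e-12
    def score(x):
        t = pca.transform(x)
        return np.sum(t ** 2 / comp_var, axis=1)          # Hotelling-T^2-like statistic
    threshold = np.percentile(score(latent_normal), 99)   # assumed 99th-percentile cutoff
    return score, threshold

def intersect_one_vs_all(flags_per_phase):
    """Combine boolean anomaly flags from several one-against-all phases by intersection."""
    return np.logical_and.reduce(flags_per_phase)

# Hypothetical usage:
# score, thr = fit_pca_scorer(latent_normal)
# flags = [score(latent_test) > thr]          # one boolean array per phase
# is_fault = intersect_one_vs_all(flags)
```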


Author(s):  
V.A. Gaisky

The accuracy of field/process reconstruction from measurement data at the nodes of a space-time lattice is the main quality criterion for the control system. Because the system is not perfectly reliable, information from lattice nodes is lost, which is equivalent to thinning the lattice. There is an analytical connection between the error in the field representation and the number of nodes operating with a known probability. Approximate formulas are proposed for estimating the increase in the representation error of processes and fields with power-law spectra as the number of nodes N of the space-time lattice decreases. Real control systems are multichannel and retain partial useful operability when a certain number of channels fail, i.e. they are redundant. Formulas are derived for calculating the reliability of redundant shipborne sounding and towed devices and of automatic buoy stations with meters at fixed horizons and with distributed thermopiles. Towed systems, coastal stations and autonomous buoy stations should operate for as long as possible. To build durable systems, results obtained earlier by the author in reliability theory are used: the ineffectiveness of a static reserve, the fundamental inappropriateness of external fault diagnostics based on observing the inputs and outputs of devices, and the possibility of ideal fault diagnostics by replacement from a dynamic reserve. Replacement from the reserve is the only known method that provides optimal and ideal fault diagnostics regardless of system structure and therefore makes it possible to build recoverable and arbitrarily durable systems. The diagnostic method based on replacement from the reserve, in its software implementation, is extended to the set of nodes of the space-time lattice used for collecting information from the environment.
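As a rough numerical illustration of the lattice-thinning effect described above (not the paper's analytical formulas), the sketch below drops nodes of a sampled 1-D signal at random with survival probability p and reports how the reconstruction error of a simple linear interpolation grows; the test signal, the interpolation rule and the node counts are assumptions made purely for illustration.

```python
# Illustrative sketch only (assumed signal and reconstruction rule): estimate
# how reconstruction error grows as lattice nodes fail independently with
# probability (1 - p). A smooth 1-D test signal and linear interpolation
# stand in for the field and its reconstruction.
import numpy as np

rng = np.random.default_rng(0)
t_full = np.linspace(0.0, 1.0, 2001)             # dense "truth" grid
signal = np.sin(2 * np.pi * 3 * t_full) + 0.3 * np.sin(2 * np.pi * 11 * t_full)

N = 200                                          # lattice nodes
t_nodes = np.linspace(0.0, 1.0, N)
x_nodes = np.interp(t_nodes, t_full, signal)

for p in (1.0, 0.9, 0.7, 0.5):                   # per-node survival probability
    alive = rng.random(N) < p
    alive[0] = alive[-1] = True                  # keep endpoints for interpolation
    recon = np.interp(t_full, t_nodes[alive], x_nodes[alive])
    rmse = np.sqrt(np.mean((recon - signal) ** 2))
    print(f"p = {p:.1f}  surviving nodes = {alive.sum():3d}  RMSE = {rmse:.4f}")
```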


Author(s):  
Natarianto Indrawan,
Rupendranath Panday,
Lawrence J. Shadle,
Umesh K. Chitnis

Abstract
Data analytics were used to detect boiler leaks in five different coal-fired boilers, including both subcritical and supercritical systems. Discriminant functions were developed that detected leaks up to two weeks before forced plant shutdowns for repairs. Using conventional process measurement data, the leaks were identified to occur in different sections of the boiler for each plant, including the waterwalls, economizer and superheater. Leaking conditions were detected with a high degree of confidence (≪ 1% misclassified observations), and the models were able to distinguish normal operation from periods with steam leaks even while the power plants operated in power-cycling mode. Multivariate statistical analyses, including Principal Component Analysis (PCA), cluster analysis, and Fisher Discriminant Analysis (FDA), were used to characterize the leak occurrences. Labels for normal operation and for operation with steam leaks were provided in the original process datasets. These datasets were split into two groups for training and validation purposes. The data were sorted chronologically, and every third observation was assigned to training the Discriminant Function Model (DFM) while the rest were reserved for validation. PCA was used to reduce the dimensionality of the original datasets. Canonical and FDA analyses were used to investigate the relationships between process variables. The outcome of the analyses revealed that nearly 35,000 observations were classified correctly; less than 0.05% of total observations were misclassified (counting both false positives and false negatives).
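A minimal sketch of the data-splitting and classification steps described above, under stated assumptions: the arrays `X` and `y` are hypothetical stand-ins for the chronologically sorted process measurements and leak labels, and scikit-learn's LinearDiscriminantAnalysis is used as a generic Fisher-style discriminant in place of the study's Discriminant Function Model.

```python
# Sketch under stated assumptions (not the plant study's code): chronological
# every-third-observation split, PCA dimensionality reduction, and a linear
# (Fisher-style) discriminant separating normal operation from leak periods.
# `X` (process measurements) and `y` (0 = normal, 1 = leak) are hypothetical.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.pipeline import make_pipeline

def chronological_every_third_split(X, y):
    """Assign every third (time-ordered) observation to training, the rest to validation."""
    idx = np.arange(len(X))
    train = idx % 3 == 0
    return X[train], y[train], X[~train], y[~train]

def fit_leak_classifier(X_train, y_train, n_components=10):
    """PCA to reduce dimensionality, then a linear discriminant as the DFM stand-in."""
    model = make_pipeline(PCA(n_components=n_components),
                          LinearDiscriminantAnalysis())
    return model.fit(X_train, y_train)

# Hypothetical usage (with X, y already sorted chronologically):
# X_tr, y_tr, X_va, y_va = chronological_every_third_split(X, y)
# clf = fit_leak_classifier(X_tr, y_tr)
# misclassification_rate = np.mean(clf.predict(X_va) != y_va)
```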

