A Digital Twin-Driven Method for Online Quality Control in Process Industry

Author(s):  
Xiaoyang Zhu ◽  
Yangjian Ji

Abstract To ensure the stability of product quality and production continuity, quality control is drawing increasing attention in the process industry. However, current methods cannot meet requirements regarding time series data, highly coupled parameters, delayed data acquisition and ambiguous operation control. A digital twin-driven (DTD) method is proposed for real-time monitoring, evaluation and optimization of the process parameters that are strongly related to product quality. Production status information and quality-related data are obtained from a process simulation model. Combined with an improved genetic algorithm (GA), a time-sequential prediction model based on a bidirectional gated recurrent unit (bi-GRU) with an attention mechanism (AM) is built to flexibly allocate parameter weights, accurately predict product quality, evaluate the technical process in a timely manner and rapidly generate optimized control plans. A typical case study and relevant field tests from the process industry are presented to demonstrate the effectiveness of the method. Results indicate that the proposed method clearly outperforms its competitors.
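The attention mechanism named above can be sketched in isolation: attention pooling assigns a weight to each time step of the bi-GRU's hidden-state sequence and returns their weighted sum. The sketch below is a minimal stand-in, not the authors' implementation; the hidden states and the scoring vector `w` are random placeholders for learned quantities.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_pool(hidden, w):
    """Score each time step's hidden state against an attention vector w,
    normalize the scores with softmax, and return the weighted sum
    (the context vector used for the final prediction)."""
    scores = hidden @ w            # (T,) alignment scores
    alpha = softmax(scores)        # attention weights, sum to 1
    return alpha, alpha @ hidden   # (H,) context vector

rng = np.random.default_rng(0)
T, H = 10, 8                              # time steps, hidden size
hidden = rng.standard_normal((T, H))      # stand-in for bi-GRU outputs
w = rng.standard_normal(H)                # stand-in for learned attention parameters
alpha, context = attention_pool(hidden, w)
```

This is how "flexibly allocating parameter weights" works mechanically: time steps with higher alignment scores contribute more to the context vector.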

Author(s):  
Weifei Hu ◽  
Yihan He ◽  
Zhenyu Liu ◽  
Jianrong Tan ◽  
Ming Yang ◽  
...  

Abstract Precise time series prediction plays an important role in constructing a Digital Twin (DT). Various internal and external interferences result in highly non-linear and stochastic time series data sampled from real situations. Although Artificial Neural Networks (ANNs) are often used to forecast time series because of their strong self-learning and nonlinear fitting capabilities, obtaining the optimal ANN architecture is a challenging and time-consuming task. This paper proposes a hybrid time series prediction model based on ensemble empirical mode decomposition (EEMD), long short-term memory (LSTM) neural networks, and Bayesian optimization (BO). To improve the predictability of stochastic and nonstationary time series, the EEMD method is used to decompose the original time series into several components, each a single-frequency, stationary signal, plus a residual signal. The decomposed signals are used to train the BO-LSTM neural networks, in which the hyper-parameters of the LSTM networks are fine-tuned by the BO algorithm. Future time series values are then predicted by summing the predictions for all the decomposed signals from the trained networks. To evaluate the performance of the proposed hybrid method (EEMD-BO-LSTM), this paper conducts a case study of wind speed time series prediction and provides a comprehensive comparison between the proposed method and other approaches, including the persistence model, ARIMA, LSTM neural networks, BO-LSTM neural networks, and EEMD-LSTM neural networks. Results show improved prediction accuracy for the EEMD-BO-LSTM method across multiple accuracy metrics.
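The decompose-predict-sum structure of such hybrid models can be illustrated with a deliberately crude sketch: a moving-average split stands in for EEMD (which would yield several intrinsic mode functions), and a persistence forecast stands in for each tuned BO-LSTM network. Only the overall scheme, not the actual algorithms, is reproduced here.

```python
import numpy as np

def decompose(signal, window=5):
    """Crude stand-in for EEMD: split the series into a smooth low-frequency
    component (moving average) and a high-frequency residual.  Real EEMD
    produces several stationary components plus a residual signal."""
    kernel = np.ones(window) / window
    low = np.convolve(signal, kernel, mode="same")
    high = signal - low
    return low, high

def persistence_forecast(component, horizon):
    """Naive per-component predictor (stand-in for a trained BO-LSTM):
    repeat the last observed value."""
    return np.full(horizon, component[-1])

rng = np.random.default_rng(1)
t = np.arange(200)
series = np.sin(0.1 * t) + 0.2 * rng.standard_normal(200)

low, high = decompose(series)
# The hybrid scheme forecasts each component separately, then sums the forecasts.
forecast = persistence_forecast(low, 10) + persistence_forecast(high, 10)
```

The point of the decomposition is that each component is easier to model than the raw series; the final prediction is recovered by summation.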


2021 ◽  
Author(s):  
Kay A. Robbins ◽  
Dung Truong ◽  
Stefan Appelhoff ◽  
Arnaud Delorme ◽  
Scott Makeig

Because of the central role that event-related data analysis plays in EEG and MEG (MEEG) experiments, choices about which events to report and how fully to annotate their nature can significantly influence the reliability, reproducibility, and value of MEEG datasets for further analysis. Current, more powerful annotation strategies combine robust event description with details of experiment design and metadata in a human-readable as well as machine-actionable form, making event annotation relevant to the full range of neuroimaging and other time series data. This paper dissects the event design and annotation process using as a case study the well-known multi-subject, multimodal dataset of Wakeman and Henson (openneuro.org, ds000117) shared by its authors using Brain Imaging Data Structure (BIDS) formatting (bids.neuroimaging.io). We propose a set of best practices and guidelines for event handling in MEEG research, examine the impact of various design decisions, and provide a working template for organizing events in MEEG and other neuroimaging data. We demonstrate how annotations using the new third-generation formulation of the Hierarchical Event Descriptors (HED-3G) framework and tools (hedtags.org) can document events occurring during neuroimaging experiments and their interrelationships, providing machine-actionable annotation that enables automated within- and across-study comparison and analysis, and point to a more complete BIDS-formatted, HED-3G-annotated edition of the MEEG portion of the Wakeman and Henson dataset (OpenNeuro ds003645).
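A machine-actionable event annotation of the kind described lives in a BIDS `events.tsv` file with a HED column. The sketch below builds such a table in memory; the tag strings are illustrative only, and any real annotation must use terms drawn from the HED schema (hedtags.org) and the dataset's own event design.

```python
import csv
import io

# Minimal BIDS-style events table with a HED column.  The HED strings are
# illustrative placeholders, not an authoritative annotation of ds000117.
events = [
    {"onset": "1.50", "duration": "0.30", "trial_type": "face",
     "HED": "Sensory-event, Visual-presentation, (Image, Face)"},
    {"onset": "3.20", "duration": "0.30", "trial_type": "scrambled",
     "HED": "Sensory-event, Visual-presentation, (Image, Disordered)"},
]

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["onset", "duration", "trial_type", "HED"],
                        delimiter="\t")
writer.writeheader()
writer.writerows(events)
events_tsv = buf.getvalue()
```

Because each HED string is parseable against a controlled vocabulary, tools can select or compare events across studies without reading free-text descriptions.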


2021 ◽  
Author(s):  
Kay Robbins ◽  
Dung Truong ◽  
Alexander Jones ◽  
Ian Callanan ◽  
Scott Makeig

Abstract Human electrophysiological and related time series data are often acquired in complex, event-rich environments. However, the resulting recorded brain or other dynamics are often interpreted in relation to more sparsely recorded or subsequently-noted events. Currently a substantial gap exists between the level of event description required by current digital data archiving standards and the level of annotation required for successful analysis of event-related data across studies, environments, and laboratories. Manifold challenges must be addressed, most prominently ontological clarity, vocabulary extensibility, annotation tool availability, and overall usability, to allow and promote sharing of data with an effective level of descriptive detail for labeled events. Motivating data authors to perform the work needed to adequately annotate their data is a key challenge. This paper describes new developments in the Hierarchical Event Descriptor (HED) system for addressing these issues. We recap the evolution of HED and its acceptance by the Brain Imaging Data Structure (BIDS) movement, describe the recent release of HED-3G, a third-generation HED tools and design framework, and discuss directions for future development. Given consistent, sufficiently detailed, tool-enabled, field-relevant annotation of the nature of recorded events, prospects are bright for large-scale analysis and modeling of aggregated time series data, both in behavioral and brain imaging sciences and beyond.


HortScience ◽  
1992 ◽  
Vol 27 (10) ◽  
pp. 1129-1131 ◽  
Author(s):  
J.E. Epperson ◽  
M.C. Chien ◽  
W.O. Mizelle

An analysis was conducted using time-series data to identify possible structural change in the farm-gate demand for South Atlantic fresh peaches [Prunus persica (L.) Batsch.]. Structural change was not found in the price-quantity relationship. However, falling per capita consumption of South Atlantic fresh peaches was found to be associated with an increase in the per capita consumption of fresh fruits in general. Thus, measures such as promotion and advertising, uniform quality control, and cultivar development may increase the demand for South Atlantic fresh peaches.
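A standard way to test for structural change in a price-quantity relationship is a Chow test: fit the regression on the pooled sample and on two sub-periods, and compare residual sums of squares. The sketch below shows the mechanics under a simple linear model; it is not the authors' exact specification or data.

```python
import numpy as np

def chow_statistic(x, y, split):
    """Chow test F-statistic for a structural break at index `split` in the
    simple regression y = a + b*x (k = 2 parameters per regime)."""
    def ssr(xs, ys):
        X = np.column_stack([np.ones_like(xs), xs])
        beta, *_ = np.linalg.lstsq(X, ys, rcond=None)
        resid = ys - X @ beta
        return resid @ resid

    s_pooled = ssr(x, y)                                  # one regime
    s1, s2 = ssr(x[:split], y[:split]), ssr(x[split:], y[split:])  # two regimes
    k, n = 2, len(x)
    return ((s_pooled - s1 - s2) / k) / ((s1 + s2) / (n - 2 * k))

rng = np.random.default_rng(2)
x = rng.uniform(0, 10, 40)
y = 2.0 - 0.5 * x + 0.1 * rng.standard_normal(40)   # stable demand relation
F = chow_statistic(x, y, split=20)
```

A small F (relative to the F(k, n-2k) critical value) fails to reject stability, matching the paper's finding of no structural change in the price-quantity relationship.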


Author(s):  
B. Faybishenko ◽  
R. Versteeg ◽  
G. Pastorello ◽  
D. Dwivedi ◽  
C. Varadharajan ◽  
...  

Abstract Representativeness and quality of collected meteorological data impact accuracy and precision of climate, hydrological, and biogeochemical analyses and predictions. We developed a comprehensive Quality Assurance (QA) and Quality Control (QC) statistical framework, consisting of three major phases: Phase I—Preliminary data exploration, i.e., processing of raw datasets, with the challenging problems of time formatting and combining datasets of different lengths and different time intervals; Phase II—QA of the datasets, including detecting and flagging of duplicates, outliers, and extreme data; and Phase III—the development of time series of a desired frequency, imputation of missing values, visualization and a final statistical summary. The paper includes two use cases based on the time series data collected at the Billy Barr meteorological station (East River Watershed, Colorado), and the Barro Colorado Island (BCI, Panama) meteorological station. The developed statistical framework is suitable for both real-time and post-data-collection QA/QC analysis of meteorological datasets.
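The flagging and imputation steps of Phases II and III can be sketched with a robust z-score (median/MAD) outlier filter followed by linear interpolation. This is a simplified stand-in under assumed thresholds, not the paper's full statistical framework.

```python
import numpy as np

def qa_qc(values, z_thresh=3.0):
    """Phase II sketch: flag outliers/extremes by robust z-score (median and
    MAD).  Phase III sketch: mask flagged points and impute all gaps by
    linear interpolation."""
    v = np.asarray(values, dtype=float)
    med = np.nanmedian(v)
    mad = np.nanmedian(np.abs(v - med)) or 1.0   # guard against zero MAD
    z = 0.6745 * (v - med) / mad                 # consistent with std for normal data
    flags = np.abs(z) > z_thresh                 # Phase II: flag outliers
    clean = v.copy()
    clean[flags] = np.nan                        # treat flagged points as missing
    idx = np.arange(len(clean))
    good = ~np.isnan(clean)
    clean[~good] = np.interp(idx[~good], idx[good], clean[good])  # Phase III
    return flags, clean

# A short air-temperature record with one spike and one missing value.
temps = [12.1, 12.3, 12.2, 99.0, 12.4, np.nan, 12.6, 12.5]
flags, cleaned = qa_qc(temps)
```

The spike at index 3 is flagged and, together with the missing value, replaced by interpolated estimates, yielding a gap-free series at the original frequency.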


Author(s):  
B M Patel

A growing emphasis on product quality and reliability over price is leading the marketing, engineering, manufacturing and quality functions of many companies to integrate work assignments in developing effective product quality control systems for engineering manufacturing operations. This paper, based on a case study, describes the approach developed for (a) planning an administrative and technical system network for producing and delivering a product of specified quality standards, (b) organizing a product quality control system, (c) integrating the multi-functional activities of the operating components that impact quality and relating the product quality control system to the manufacturing cycle, and (d) measuring the overall effectiveness of the system on the quality of the products through the evaluation of quality-related data generated by the manufacturing cycle.


Sensors ◽  
2021 ◽  
Vol 22 (1) ◽  
pp. 29
Author(s):  
Manas Bazarbaev ◽  
Tserenpurev Chuluunsaikhan ◽  
Hyoseok Oh ◽  
Ga-Ae Ryu ◽  
Aziz Nasridinov ◽  
...  

Product quality is a major concern in manufacturing. In the metal processing industry, low-quality products must be remanufactured, which requires additional labor, money, and time. Therefore, user-controllable variables for machines and raw material compositions are key factors for ensuring product quality. In this study, we propose a method for generating the time-series working patterns of the control variables for metal-melting induction furnaces and continuous casting machines, thus improving product quality by aiding machine operators. We used an auxiliary classifier generative adversarial network (AC-GAN) model to generate time-series working patterns of two processes depending on product type and additional material data. To check accuracy, the difference between the generated time-series data of the model and the ground truth data was calculated. Specifically, the proposed model results were compared with those of other deep learning models: multilayer perceptron (MLP), convolutional neural network (CNN), long short-term memory (LSTM), and gated recurrent unit (GRU). It was demonstrated that the proposed model outperformed the other deep learning models. Moreover, the proposed method generated different time-series data for different inputs, whereas the other deep learning models generated the same time-series data.
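The accuracy check described, computing the difference between generated time-series patterns and ground-truth data, can be illustrated with a plain RMSE comparison. The metric and the stand-in series below are illustrative; the paper's exact evaluation may differ.

```python
import numpy as np

def rmse(generated, ground_truth):
    """Root-mean-square difference between a generated working pattern and
    the recorded ground-truth pattern (lower is better)."""
    g, t = np.asarray(generated), np.asarray(ground_truth)
    return float(np.sqrt(np.mean((g - t) ** 2)))

# Stand-in series: a "generated" control-variable pattern vs. the recorded one.
truth = np.sin(np.linspace(0.0, 6.28, 100))
generated = truth + 0.05 * np.random.default_rng(3).standard_normal(100)
score = rmse(generated, truth)
```

Comparing such scores across candidate generators (AC-GAN, MLP, CNN, LSTM, GRU) is the kind of head-to-head evaluation the study reports.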


2018 ◽  
Vol 7 (5) ◽  
pp. 19
Author(s):  
Masudul Islam ◽  
Afroza Akhtar ◽  
Sirajum Munira ◽  
Md. Salauddin Khan ◽  
Md Monzur Murshed

Removing nonstationarity is vital when studying the behaviour of time series data, since eliminating long-term components exposes any regular short-term patterns. We examine various unit root tests, such as the Dickey-Fuller (DF) test, the Augmented Dickey-Fuller test and the DF-GLS test, and find that almost all of them, under the estimated model, suffer from sign and boundary problems in the parameters when handling nonstationarity. In this paper, we use the Dickey-Fuller test and impose some limits on the parameter. Our proposed optimized DF test is based on the error sum of squares (ESS). The Monte Carlo simulation method is used to generate simulated critical values for different sample sizes. The optimized DF test gives better results than the ordinary DF test in terms of effectiveness, uniformity and power properties. It also alleviates the sign and boundary problems by imposing limits on the error sum of squares, and captures more of the nonstationarity of time-related data.
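Generating simulated critical values for a DF-type test by Monte Carlo follows a standard recipe: simulate many random walks under the unit-root null, compute the test statistic on each, and take the desired quantile. The sketch below does this for the no-constant DF model; it illustrates the simulation step only, not the paper's optimized ESS-constrained test.

```python
import numpy as np

def df_tstat(y):
    """Dickey-Fuller t-statistic for the no-constant model
    dy_t = rho * y_{t-1} + e_t, testing H0: rho = 0 (unit root)."""
    dy, lag = np.diff(y), y[:-1]
    rho = (lag @ dy) / (lag @ lag)
    resid = dy - rho * lag
    s2 = resid @ resid / (len(dy) - 1)
    se = np.sqrt(s2 / (lag @ lag))
    return rho / se

def simulated_critical_value(n, reps=2000, alpha=0.05, seed=4):
    """Monte Carlo critical value for sample size n: simulate random walks
    under H0 and take the alpha-quantile of the DF statistic."""
    rng = np.random.default_rng(seed)
    stats = [df_tstat(np.cumsum(rng.standard_normal(n))) for _ in range(reps)]
    return float(np.quantile(stats, alpha))

cv = simulated_critical_value(n=100)
```

A test statistic below `cv` rejects the unit-root null at the chosen level; repeating the simulation for each sample size yields the size-specific critical-value tables the abstract describes.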

