A machine learning-based surrogate model for the rapid control of piping flow: Application to a natural gas flowmeter calibration system

Author(s):
Xiong Yin
Kai Wen
Yan Wu
Xu Han
Yasir Mukhtar
Lei Zhou
Jing Gong

Abstract In recent years, China has ramped up imports of natural gas to satisfy growing demand, which has increased the number of trade meters. Natural gas flowmeters must be calibrated regularly at calibration stations to ensure their accuracy. At present, the flow metrological calibration process in China is performed manually by operators, so results are easily affected by personnel experience and proficiency. China is vigorously developing Industry 4.0 and AI (artificial intelligence) technologies. To improve calibration efficiency, this paper first proposes a design scheme for an intelligent controller for a flow metrological calibration system. The intelligent controller can replace the operator for process switching and flow adjustment. First, the controller selects the standard flowmeter according to the type of the calibrated flowmeter and switches the calibration process. To accurately control the calibration flow for 180 seconds, the controller continuously adjusts the regulating valve with a sequence of commands to the actuator. These commands are generated by an intelligent algorithm predefined in the controller. Process switching is performed automatically according to flowmeter calibration specifications. To reach the required flow point quickly, the flow adjustment is divided into two steps: preliminary adjustment and precise adjustment. For preliminary adjustment, a BP neural network is first built using field historical data and simulation results. This neural network describes the relationship between the valve-opening scheme and the calibration flow, so it can give a calibration flow as close as possible to the expected value during calibration. For precise adjustment, an adaptive PID controller is used, which adjusts the valve opening automatically to ensure the flow deviation meets the calibration requirements.
Because the PID controller is self-adaptive, the adjustment process is very quick, which greatly reduces the calibration time. After each calibration, both the original neural network and the adaptive function of the controller are updated to achieve self-growth. With the information of the calibrated flowmeter, the entire calibration system can run automatically. An experiment at a calibration station shows that the intelligent controller can keep the flow deviation within 5% in 4 to 5 minutes.
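The two-step adjustment described in this abstract can be sketched as follows. This is a minimal, self-contained illustration, not the paper's controller: the valve characteristic `plant_flow`, the PI gains, and the least-squares fit standing in for the BP neural network are all assumptions for demonstration.

```python
# Two-step flow adjustment sketch: a data-driven preliminary opening
# (here a linear fit over historical (opening, flow) pairs, standing in
# for the paper's BP network) followed by a precise PI correction loop.
# The plant model and gains below are hypothetical.

def preliminary_opening(target_flow, history):
    """Least-squares linear fit flow -> opening from historical pairs."""
    n = len(history)
    mean_flow = sum(f for _, f in history) / n
    mean_open = sum(o for o, _ in history) / n
    num = sum((f - mean_flow) * (o - mean_open) for o, f in history)
    den = sum((f - mean_flow) ** 2 for _, f in history)
    slope = num / den
    return mean_open + slope * (target_flow - mean_flow)

def plant_flow(opening):
    """Toy valve characteristic: flow roughly proportional to opening."""
    return 40.0 * opening  # m3/h per unit opening (assumed)

def pid_adjust(target_flow, opening, kp=0.01, ki=0.002, steps=200, tol_pct=5.0):
    """Precise adjustment: PI loop driving flow deviation under tol_pct %."""
    integral = 0.0
    for _ in range(steps):
        flow = plant_flow(opening)
        err = target_flow - flow
        if abs(err) / target_flow * 100.0 <= tol_pct:
            return opening, flow
        integral += err
        opening += kp * err + ki * integral
    return opening, plant_flow(opening)

history = [(0.25, 10.2), (0.5, 19.8), (0.75, 30.1)]  # (opening, flow) pairs
o0 = preliminary_opening(25.0, history)   # preliminary adjustment
o, f = pid_adjust(25.0, o0)               # precise adjustment
```

Because the preliminary step already lands near the target, the PI loop typically needs only a few corrections, which is the mechanism the abstract credits for the short calibration time.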


2021
Author(s):  
Celestine Udim Monday ◽  
Toyin Olabisi Odutola

Abstract Natural gas production and transportation are at risk of gas hydrate plugging, especially in offshore environments where temperature is low and pressure is high. Hydrate plugs can block the pipeline, increase back pressure, stop production, and ultimately rupture gas pipelines. This study develops machine learning models, trained on data from kinetic hydrate inhibitor experiments, to predict gas hydrate formation and pressure changes within a natural gas flow line. Green hydrate inhibitors A, B, and C were obtained as plant extracts and applied at low dosages (0.01 wt.% to 0.1 wt.%) on a 12-meter skid-mounted closed hydrate flow loop. From the data generated, the optimal dosages of inhibitors A, B, and C were observed to be 0.02 wt.%, 0.06 wt.%, and 0.1 wt.%, respectively. The data associated with these optimal dosages were fed to a set of supervised machine learning algorithms (extreme gradient boosting, gradient boosting regressor, and linear regressor) and a deep learning algorithm (artificial neural network). The outputs of the supervised learning and deep learning algorithms were compared in terms of their accuracy in predicting hydrate formation and the pressure within the natural gas flow line. All models had accuracies greater than 90%. These results show that it is viable to apply machine learning algorithms to flow assurance problems, analyzing data and producing reports that can improve the accuracy and speed of on-site decision-making.
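The regression comparison in this abstract can be illustrated with a minimal gradient-boosting sketch. This is a from-scratch stand-in for the paper's XGBoost/gradient boosting regressor, fitted on synthetic data, not the flow-loop measurements; the pressure-decay curve below is invented for demonstration.

```python
# Minimal gradient boosting for regression: fit decision stumps to the
# residuals of the running prediction, as a simplified stand-in for the
# XGBoost / gradient boosting regressors named in the abstract.

def fit_stump(xs, residuals):
    """Best single-split stump minimizing squared error on residuals."""
    best = None
    for split in sorted(set(xs)):
        left = [r for x, r in zip(xs, residuals) if x <= split]
        right = [r for x, r in zip(xs, residuals) if x > split]
        if not left or not right:
            continue
        lmean, rmean = sum(left) / len(left), sum(right) / len(right)
        sse = (sum((r - lmean) ** 2 for r in left)
               + sum((r - rmean) ** 2 for r in right))
        if best is None or sse < best[0]:
            best = (sse, split, lmean, rmean)
    _, split, lmean, rmean = best
    return lambda x: lmean if x <= split else rmean

def boost(xs, ys, rounds=50, lr=0.3):
    """Additive model: mean + lr * sum of residual-fitted stumps."""
    base = sum(ys) / len(ys)
    stumps, preds = [], [base] * len(xs)
    for _ in range(rounds):
        residuals = [y - p for y, p in zip(ys, preds)]
        stump = fit_stump(xs, residuals)
        stumps.append(stump)
        preds = [p + lr * stump(x) for p, x in zip(preds, xs)]
    return lambda x: base + lr * sum(s(x) for s in stumps)

# Synthetic "time vs pressure" decay during a hydrate run (assumed shape).
xs = [t / 10 for t in range(40)]
ys = [9.0 - 0.8 * x + 0.05 * x * x for x in xs]
model = boost(xs, ys)
mape = sum(abs(model(x) - y) / abs(y) for x, y in zip(xs, ys)) / len(xs) * 100
```

The mean absolute percentage error gives an accuracy figure comparable in spirit to the abstract's "greater than 90%" criterion; a library implementation would of course be used in practice.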


SIMULATION
2018
Vol 95 (8)
pp. 673-691
Author(s):  
Bong Gu Kang ◽  
Kyung-Min Seo ◽  
Tag Gon Kim

Command and control (C2) and communication are at the heart of successful military operations in network-centric warfare. Interoperable simulation of a C2 system model and a communication (C) system model can be employed to interactively analyze their detailed behaviors. However, such simulation is inefficient in execution time when analyzing the combat effectiveness of the C2 model against many possible input combinations while accounting for the communication effect in combat operations. This study proposes a discrete event dynamic surrogate model (DEDSM) for the C model, which is integrated with the C2 model and simulated. The proposed integrated simulation markedly reduces execution time in combat effectiveness analysis without sacrificing the accuracy with which the communication effect is reflected. We hypothesize the DEDSM as a probabilistic priority queuing model whose semantics are expressed in a discrete event systems specification model with some characteristic functions unknown. The unknown functions are identified by machine learning from a data set generated by interoperable simulation of the C2 and C models. The case study with a command, control, and communication system of systems first validates the proposed approach through an equivalence test between the interoperable simulation and the proposed one. It then compares the simulation execution times and the number of events exchanged between the two simulations.
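The surrogate idea in this abstract can be sketched as a priority queue whose service delay is drawn from a learned distribution. This is a heavily simplified illustration: the empirical delay samples and message list below are invented, standing in for characteristic functions that the paper identifies by machine learning from interoperable-simulation logs.

```python
# Sketch of a probabilistic priority-queuing surrogate for the
# communication model: each C2 message is delivered after a stochastic
# delay sampled from an (assumed) empirical distribution, replacing a
# detailed network simulation with a cheap event queue.
import heapq
import random

random.seed(0)
empirical_delays = [0.8, 1.1, 1.3, 0.9, 1.0]  # learned delay samples (assumed)

def simulate(messages):
    """messages: list of (send_time, priority, payload) -> delivery log."""
    events = []
    for send_time, priority, payload in messages:
        delay = random.choice(empirical_delays)  # probabilistic service time
        heapq.heappush(events, (send_time + delay, priority, payload))
    log = []
    while events:  # pop in delivery-time order, as a discrete event loop would
        t, priority, payload = heapq.heappop(events)
        log.append((round(t, 2), payload))
    return log

log = simulate([(0.0, 1, "order-A"), (0.5, 0, "order-B"), (1.0, 2, "order-C")])
```

Replacing per-packet network events with one sampled delay per message is what makes the surrogate exchange far fewer events than the full interoperable simulation.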


Author(s):  
Xianping Du ◽  
Onur Bilgen ◽  
Hongyi Xu

Abstract Machine learning for classification has been widely used in engineering design, for example, in feasible domain recognition and hidden pattern discovery. Training an accurate machine learning model requires a large dataset; however, high computational or experimental costs make it difficult to obtain large datasets for real-world problems. One possible solution is to generate a large pseudo dataset with surrogate models established from a smaller set of real training data. However, it is not well understood whether the pseudo dataset benefits the classification model by providing more information or degrades machine learning performance through the prediction errors and uncertainties introduced by the surrogate model. This paper presents a preliminary investigation of this research question. A classification and regression tree (CART) model is employed to recognize design subspaces and support design decision-making. It is applied to the geometric design of a vehicle energy-absorbing structure based on finite element simulations. From a small set of real data obtained by simulation, a surrogate model based on Gaussian process regression is used to generate pseudo datasets for training. The results show that the tree-based method can help recognize feasible design domains efficiently. Furthermore, the additional information provided by the surrogate model enhances the accuracy of classification. One important conclusion is that the accuracy of the surrogate model determines the quality of the pseudo dataset and, hence, the improvement of the machine learning model.
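The pseudo-dataset workflow in this abstract can be sketched end to end. Everything here is a simplified stand-in: the "expensive simulation" is an invented linear response, piecewise-linear interpolation replaces the Gaussian process surrogate, and a single-threshold split replaces the CART classifier.

```python
# Pseudo-dataset sketch: fit a cheap surrogate on a few "real" runs,
# label many cheap pseudo points with it, then train a classifier on the
# pseudo labels. All functions and data are assumed for illustration.

def expensive_sim(x):          # stand-in for the finite element simulation
    return 3.0 * x - 1.0       # "peak force" response (assumed)

def feasible(y):               # design is feasible below a force limit
    return y < 0.5

def surrogate(x, pts):
    """Piecewise-linear interpolation through the real (x, y) points,
    standing in for Gaussian process regression."""
    pts = sorted(pts)
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if x0 <= x <= x1:
            return y0 + (y1 - y0) * (x - x0) / (x1 - x0)
    return pts[0][1] if x < pts[0][0] else pts[-1][1]

real_x = [0.0, 0.4, 1.0]                       # few expensive runs
real_pts = [(x, expensive_sim(x)) for x in real_x]

# Dense pseudo dataset: cheap surrogate evaluations with feasibility labels.
pseudo = [(x / 100, feasible(surrogate(x / 100, real_pts)))
          for x in range(101)]

def fit_stump(data):
    """One best-threshold split, a minimal stand-in for the CART model."""
    best = None
    for t, _ in data:
        acc = sum((x < t) == label for x, label in data) / len(data)
        if best is None or acc > best[0]:
            best = (acc, t)
    return best[1]

threshold = fit_stump(pseudo)  # recovered feasible-domain boundary
```

Because the surrogate here reproduces the true response closely, the classifier trained on pseudo labels recovers the true feasibility boundary (x = 0.5 for this invented response), mirroring the paper's conclusion that surrogate accuracy governs pseudo-dataset quality.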

