The Potential of Low-Cost Tin-Oxide Sensors Combined with Machine Learning for Estimating Atmospheric CH4 Variations around Background Concentration

Atmosphere ◽  
2021 ◽  
Vol 12 (1) ◽  
pp. 107
Author(s):  
Rodrigo Rivera Martinez ◽  
Diego Santaren ◽  
Olivier Laurent ◽  
Ford Cropley ◽  
Cécile Mallet ◽  
...  

Continued developments in instrumentation and modeling have driven progress in monitoring methane (CH4) emissions at a range of spatial scales. Sites that emit CH4, such as landfills, oil and gas extraction or storage infrastructure, and intensive livestock farms, account for a large share of global emissions and need to be monitored continuously to verify the effectiveness of reduction policies. Low-cost sensors are valuable for monitoring CH4 around such facilities because they can be deployed in large numbers to sample atmospheric plumes and retrieve emission rates using dispersion models. Here we present two tests of three different versions of Figaro® TGS tin-oxide sensors for estimating CH4 concentration variations, at levels similar to current atmospheric values, with a sought accuracy of 0.1 to 0.2 ppm. In the first test, we characterize the variation of the resistance of the tin-oxide semiconducting sensors in response to controlled levels of CH4, H2O and CO in the laboratory, to analyze cross-sensitivities. In the second test, we reconstruct observed CH4 variations in a room, which ranged between 1.9 and 2.4 ppm during a three-month experiment, from observed time series of resistances and other variables. To do so, a machine learning model is trained against reference CH4 recorded by a high-precision instrument. The machine learning model, using 30% of the data for training, reconstructs CH4 within the target accuracy of 0.1 ppm only if the training variables are representative of conditions during the testing period. The model-derived sensitivities of the sensor resistance to H2O relative to CH4 are larger than those observed under controlled conditions, which calls for further characterization of all the factors influencing the resistance of the sensors.
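
For illustration, a minimal sketch of the kind of workflow described above: a regression model trained on the first 30% of a co-located time series to reconstruct CH4 from sensor resistance and auxiliary variables. The file name, column names and random-forest choice are assumptions, not the authors' exact setup.

```python
# Hypothetical sketch: reconstructing CH4 from a TGS sensor resistance and
# co-measured variables with a regression model, using a chronological
# 30/70 train/test split as described in the abstract.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error

df = pd.read_csv("tgs_timeseries.csv")          # hypothetical file name
features = ["resistance_tgs", "h2o_ppm", "co_ppm", "temperature_c"]  # assumed columns
target = "ch4_reference_ppm"                    # high-precision reference instrument

# First 30% for training, remaining 70% for testing, so the test period
# is not interleaved with the training period.
n_train = int(0.3 * len(df))
train, test = df.iloc[:n_train], df.iloc[n_train:]

model = RandomForestRegressor(n_estimators=300, random_state=0)
model.fit(train[features], train[target])

pred = model.predict(test[features])
print(f"MAE: {mean_absolute_error(test[target], pred):.3f} ppm")  # target ~0.1 ppm
```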

2021 ◽  
Vol 77 (18) ◽  
pp. 1694
Author(s):  
Rashmi Nedadur ◽  
Zeinab Navidi Ghaziani ◽  
Maala Sooriyakanthan ◽  
Natalie Ho ◽  
Geraldine Ong ◽  
...  

2021 ◽  
Author(s):  
Junhua Huang ◽  
Bohan Zhu ◽  
Hongxi Zhou ◽  
Qiwei Zheng ◽  
Zhuo Chen ◽  
...  

With the continuous expansion of optical communication networks and the rapid growth of traffic demand, multi-domain management of optical networks has become widespread. The optical signal-to-noise ratio (OSNR) is a key indicator of communication quality, so predicting OSNR accurately in a low-cost and energy-efficient way is very important in multi-domain optical networks. In this paper, a federated learning scheme for multi-domain optical networks is proposed to improve the accuracy of OSNR prediction. The main idea is to train a hybrid machine learning model within each single domain and then use a federated learning strategy to optimize it across domains. The performance of the proposed scheme is verified by simulation experiments. The strategy alleviates the data-silo and limited-training-set problems caused by multi-domain optical networks. According to the simulation results, when the amount of data reaches 5×10³, adding this strategy reduces the mean square error of the prediction model by about 18%. It improves the performance of the machine learning model, the accuracy of OSNR prediction and the reliability of network operation.
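
A minimal sketch of the federated step described above, assuming a FedAvg-style weighted average of per-domain model parameters; the linear model and the single aggregation round are illustrative stand-ins for the paper's hybrid model and training strategy.

```python
# Illustrative sketch: each optical domain trains a local OSNR regressor on
# its own monitoring data, then a coordinator averages the model parameters
# instead of pooling raw per-domain data (which stays in its silo).
import numpy as np
from sklearn.linear_model import SGDRegressor

def train_local(X, y):
    """Train a local OSNR model inside one domain."""
    model = SGDRegressor(max_iter=1000, tol=1e-4, random_state=0)
    model.fit(X, y)
    return model

def federated_average(models, sizes):
    """Weight each domain's parameters by its local sample count (FedAvg-style)."""
    weights = np.array(sizes) / sum(sizes)
    coef = sum(w * m.coef_ for w, m in zip(weights, models))
    intercept = sum(w * m.intercept_ for w, m in zip(weights, models))
    return coef, intercept

# domains = [(X1, y1), (X2, y2), ...]            # per-domain data, never shared
# locals_ = [train_local(X, y) for X, y in domains]
# global_coef, global_intercept = federated_average(
#     locals_, [len(y) for _, y in domains])
```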


2021 ◽  
Vol 73 (02) ◽  
pp. 37-39
Author(s):  
Trent Jacobs

For all that logging while drilling has provided since its widespread adoption in the 1980s, there is one thing on the industry’s wish list that it could never offer: an accurate way to tell the difference between oil and gas. A new technology created by petrotechnical experts at Equinor, however, has made this possible. The innovation could be thought of as a pseudo-log, but Equinor is describing it as a reservoir-fluid-identification system. Using an internally developed machine-learning model, it compares a database of more than 4,000 reservoir samples against the real-time analysis of the mud gas that flows up a well as it is drilled. Crunched out of the technology’s various hardware and software components is a prediction of the gas/oil ratio (GOR) that the rock being drilled through will have once it is producing. Since this happens in real time, it boils down to an alert system for when drillers are tapping into uneconomic pay zones. “This is something people have tried to do for 30 years - using partial information to predict entire oil and gas properties,” said Tao Yang. He added that “the data acquisition is rather cheap compared with all the downhole tools, and it doesn’t cost you rig time,” highlighting that the mud-gas analyzer critical to the process sits on a rig or platform without interfering with drilling operations. Yang is a reservoir technology specialist at Equinor and one of the authors of a technical paper (SPE 201323) about the new digital technology that was presented at the SPE Annual Technical Conference and Exhibition in October. He and his colleagues spent more than 3 years building the system, which began in the Norwegian oil company’s Houston office as a project to improve pressure/volume/temperature (PVT) analysis in tight-oil wells in North America. It has since found a home in the company’s much larger offshore business unit in Stavanger. Offshore projects designed around certain oil-production targets can face harsh realities when they end up producing more associated gas than expected. It is the difference between drilling an underperforming well full of headaches and one that will pay out hundreds of millions of dollars over its lifetime. By introducing real-time fluid identification, Equinor is trying to enforce a new control on that risk by giving drillers the information they need to pull the bit back and start drilling a sidetrack deeper into the formation, where the odds of finding higher proportions of oil or condensates are better. At the conference, Yang shared details about some of the first field implementations, saying that in most cases the GOR predictions made by the fluid-identification system were confirmed by traditional PVT analysis from the trial wells. Unlike other advancements made on this front, he also said the new approach is the first of its kind to combine such a large database of PVT data with a machine-learning model “that is common to any well.” That means “we do not need to know where this well is located” to make a GOR prediction, said Yang.
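
The general shape of such a system can be sketched as a regression model trained offline on a fluid-sample database and queried with streaming mud-gas readings. The feature names, file name and gradient-boosting choice below are assumptions for illustration only; Equinor's actual database, features and model are not described in detail here.

```python
# Hedged sketch: predict GOR from mud-gas component ratios using a model
# trained on a database of reservoir-fluid samples, then score new readings
# in real time as the well is drilled.
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor

samples = pd.read_csv("pvt_database.csv")       # hypothetical training database
features = ["c1_c2_ratio", "c1_c3_ratio", "c1_c4_ratio", "c1_c5_ratio"]  # assumed

model = GradientBoostingRegressor(random_state=0)
model.fit(samples[features], samples["gor"])    # assumed target column

def predict_gor(mud_gas_reading: pd.DataFrame) -> float:
    """Score one real-time mud-gas reading; callers could flag depths where
    the predicted GOR exceeds an economic threshold."""
    return float(model.predict(mud_gas_reading[features])[0])
```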


2018 ◽  
Author(s):  
Steen Lysgaard ◽  
Paul C. Jennings ◽  
Jens Strabo Hummelshøj ◽  
Thomas Bligaard ◽  
Tejs Vegge

A machine learning model is used as a surrogate fitness evaluator in a genetic algorithm (GA) optimization of the atomic distribution of Pt-Au nanoparticles. The machine learning accelerated genetic algorithm (MLaGA) yields a 50-fold reduction of required energy calculations compared to a traditional GA.
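
A toy sketch of the surrogate-assisted idea: a cheap ML model screens many candidate Pt-Au orderings so that only the most promising candidate per generation requires an expensive energy calculation. The bit-string encoding, Gaussian-process surrogate and toy energy function are illustrative assumptions, not the MLaGA implementation.

```python
# Minimal surrogate-accelerated GA sketch: the surrogate ranks cheap mutations,
# and only the best-ranked structure is evaluated with the expensive function.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

rng = np.random.default_rng(0)
N_SITES = 20                                   # lattice sites; 1 = Pt, 0 = Au

def true_energy(x):
    """Stand-in for an expensive energy calculation (toy: count unlike neighbours)."""
    return float(np.sum(np.diff(x) != 0))

def mutate(x):
    """Swap two sites, preserving the Pt/Au composition."""
    i, j = rng.choice(N_SITES, size=2, replace=False)
    child = x.copy()
    child[i], child[j] = child[j], child[i]
    return child

# Seed the surrogate with a handful of expensive evaluations.
pop = [rng.permutation([1] * 10 + [0] * 10) for _ in range(10)]
X_known = [p.copy() for p in pop]
y_known = [true_energy(p) for p in pop]
surrogate = GaussianProcessRegressor().fit(X_known, y_known)

for gen in range(20):
    # The surrogate screens many cheap candidates...
    candidates = [mutate(pop[rng.integers(len(pop))]) for _ in range(50)]
    scores = surrogate.predict(np.array(candidates))
    best = candidates[int(np.argmin(scores))]
    # ...and only the best one costs a real energy calculation.
    X_known.append(best.copy())
    y_known.append(true_energy(best))
    surrogate.fit(X_known, y_known)
    pop.append(best)

print("best energy found:", min(y_known))
```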


Author(s):  
Dhilsath Fathima.M ◽  
S. Justin Samuel ◽  
R. Hari Haran

Aim: This work develops an improved and robust machine learning model for predicting myocardial infarction (MI), which could have substantial clinical impact. Objectives: This paper explains how to build a machine learning based computer-aided analysis system for early and accurate prediction of myocardial infarction, using the Framingham Heart Study dataset for validation and evaluation. The proposed computer-aided analysis model will support medical professionals in predicting myocardial infarction proficiently. Methods: The proposed model uses mean imputation to remove missing values from the dataset, then applies principal component analysis (PCA) to extract the optimal features and enhance the performance of the classifiers. After PCA, the reduced features are partitioned into a training set and a testing set: 70% of the data are given as input to four widely used classifiers (support vector machine, k-nearest neighbor, logistic regression and decision tree) to train them, and the remaining 30% are used to evaluate the output of the machine learning model with performance metrics such as the confusion matrix, classification accuracy, precision, sensitivity, F1-score and the AUC-ROC curve. Results: The outputs of the classifiers are evaluated using these performance measures. We observed that logistic regression provides higher accuracy than the k-NN, SVM and decision tree classifiers, and that PCA performs well as a feature extraction method for enhancing the performance of the proposed model. From these analyses, we conclude that logistic regression has a good mean accuracy and standard deviation of accuracy compared with the other three algorithms. The AUC-ROC curves of the proposed classifiers (Figures 4 and 5) show that logistic regression achieves a good AUC-ROC score, around 70%, compared to the k-NN and decision tree algorithms. Conclusion: From the result analysis, we infer that the proposed machine learning model can act as an optimal decision-making system to predict acute myocardial infarction at an earlier stage than existing machine learning based prediction models, and that it is capable of predicting the presence of acute myocardial infarction from heart disease risk factors, helping to decide when to start lifestyle modification and medical treatment to prevent heart disease.
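
A compact sketch of the described pipeline (mean imputation, PCA, a 70/30 split and the four classifiers), assuming Framingham-style tabular data; the file name, target column, PCA component count and hyperparameters are placeholders rather than the paper's exact values.

```python
# Sketch: mean imputation -> PCA -> 70/30 split -> four classifiers -> metrics.
import pandas as pd
from sklearn.impute import SimpleImputer
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score, roc_auc_score

df = pd.read_csv("framingham.csv")              # hypothetical file name
X, y = df.drop(columns="TenYearCHD"), df["TenYearCHD"]  # assumed target column

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=42)

classifiers = {
    "logistic_regression": LogisticRegression(max_iter=1000),
    "svm": SVC(probability=True),
    "knn": KNeighborsClassifier(),
    "decision_tree": DecisionTreeClassifier(random_state=42),
}

for name, clf in classifiers.items():
    pipe = make_pipeline(SimpleImputer(strategy="mean"),
                         PCA(n_components=10),   # placeholder component count
                         clf)
    pipe.fit(X_train, y_train)
    proba = pipe.predict_proba(X_test)[:, 1]
    print(f"{name}: accuracy={accuracy_score(y_test, pipe.predict(X_test)):.3f}, "
          f"AUC={roc_auc_score(y_test, proba):.3f}")
```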


Author(s):  
Dhaval Patel ◽  
Shrey Shrivastava ◽  
Wesley Gifford ◽  
Stuart Siegel ◽  
Jayant Kalagnanam ◽  
...  
