On IoT-Friendly Skewness Monitoring for Skewness-Aware Online Edge Learning

2021 ◽  
Vol 11 (16) ◽  
pp. 7461
Author(s):  
Zheng Li ◽  
Jhon Galdames-Retamal

Machine learning techniques generally require or assume balanced datasets. Highly skewed data can prevent a machine learning system from ever functioning properly, no matter how carefully its parameters are tuned. A common remedy for high skewness is therefore to pre-process the data (e.g., by log transformation) before applying machine learning to real-world problems. Nevertheless, this pre-processing strategy cannot be employed for online machine learning, especially in the context of edge computing, because it is rarely feasible to foresee and store the continuous data flow on IoT devices at the edge. It is thus crucial and valuable to enable skewness monitoring in real time. Unfortunately, there is a surprising gap between practitioners’ needs and scientific research on running statistics for monitoring real-time skewness, not to mention the lack of suitable remedies for skewed data at runtime. Inspired by Welford’s algorithm, the most efficient approach to calculating running variance, this research developed efficient calculation methods for three versions of running skewness. These methods can conveniently be implemented as skewness monitoring modules that are affordable for IoT devices in different edge learning scenarios. Such IoT-friendly skewness monitoring can eventually act as a cornerstone for developing the research field of skewness-aware online edge learning. Having initially validated the usefulness and significance of skewness awareness in edge learning implementations, we also argue that conjoint research efforts from the relevant communities are needed to advance this promising field.
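The Welford-style idea behind the abstract can be sketched as a single-pass update of the first three central moments. This is a minimal illustration of the general technique, not a reproduction of the paper's three running-skewness variants:

```python
import math

class RunningSkewness:
    """Single-pass running skewness via Welford/Pebay-style moment updates.

    Tracks the count, mean, and second and third central moments in O(1)
    memory per sample, so it fits on resource-constrained IoT devices.
    """

    def __init__(self):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0  # running sum of squared deviations
        self.m3 = 0.0  # running sum of cubed deviations

    def update(self, x):
        n1 = self.n
        self.n += 1
        delta = x - self.mean
        delta_n = delta / self.n
        term1 = delta * delta_n * n1
        # m3 must be updated before m2, since it uses the old m2
        self.m3 += term1 * delta_n * (self.n - 2) - 3.0 * delta_n * self.m2
        self.m2 += term1
        self.mean += delta_n

    def skewness(self):
        """Population (biased) skewness g1 = sqrt(n) * m3 / m2^(3/2)."""
        if self.n < 3 or self.m2 == 0.0:
            return 0.0
        return math.sqrt(self.n) * self.m3 / self.m2 ** 1.5
```

Feeding a symmetric stream yields a skewness near zero, while a stream with occasional large values drifts positive, which is exactly the signal a monitoring module would act on.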

Sensors ◽  
2021 ◽  
Vol 21 (4) ◽  
pp. 1044
Author(s):  
Yassine Bouabdallaoui ◽  
Zoubeir Lafhaj ◽  
Pascal Yim ◽  
Laure Ducoulombier ◽  
Belkacem Bennadji

The operation and maintenance of buildings have seen several advances in recent years. Multiple information and communication technology (ICT) solutions have been introduced to better manage building maintenance. However, maintenance practices in buildings remain inefficient and lead to significant energy waste. In this paper, a predictive maintenance framework based on machine learning techniques is proposed. This framework aims to provide guidelines for implementing predictive maintenance for building installations. The framework is organised into five steps: data collection, data processing, model development, fault notification and model improvement. A sport facility was selected as a case study to demonstrate the framework. Data were collected from different heating, ventilation and air conditioning (HVAC) installations using Internet of Things (IoT) devices and a building automation system (BAS). Then, a deep learning model was used to predict failures. The case study showed the potential of this framework to predict failures. However, multiple obstacles and barriers were observed related to data availability and feedback collection. The overall results of this paper can help to provide guidelines for scientists and practitioners implementing predictive maintenance approaches in buildings.
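The fault-notification step of such a framework can be sketched as a residual check between measured and predicted sensor values. This is a hedged stand-in for the deep learning model in the abstract: any predictor can supply the `predicted` series, and the threshold value is illustrative, not from the paper:

```python
def fault_notifications(measured, predicted, threshold=3.0):
    """Flag time steps whose prediction error is anomalously large.

    Residuals are scored against a robust (median/MAD) scale so that a
    single faulty reading does not distort the baseline. Returns the
    indices of samples that should trigger a fault notification.
    """
    residuals = [m - p for m, p in zip(measured, predicted)]
    med = sorted(residuals)[len(residuals) // 2]
    mad = sorted(abs(r - med) for r in residuals)[len(residuals) // 2]
    scale = 1.4826 * mad or 1.0  # fall back if residuals are all identical
    return [i for i, r in enumerate(residuals) if abs(r - med) / scale > threshold]
```

In the framework's terms, the flagged indices feed the fault-notification step, and operator feedback on false alarms feeds the model-improvement step.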


Sensors ◽  
2018 ◽  
Vol 18 (11) ◽  
pp. 3953 ◽  
Author(s):  
Bruno Abade ◽  
David Perez Abreu ◽  
Marilia Curado

Smart Environments adapt their conditions, focusing on the detection, localisation, and identification of people, to improve occupants’ comfort. It is common to use different sensors, actuators, and analytic techniques in this kind of environment to process data from the surroundings and actuate accordingly. In this research, a solution to improve the user’s experience in Smart Environments based on information obtained from indoor areas, following a non-intrusive approach, is proposed. We used Machine Learning techniques to detect occupants and estimate the number of people in a specific indoor space. The proposed solution was tested in a real scenario using a prototype system, composed of nodes and sensors, specifically designed and developed to gather the environmental data of interest. The results obtained demonstrate that with the developed system it is possible to obtain, process, and store environmental information. Additionally, the analysis performed over the gathered data using Machine Learning and pattern recognition mechanisms shows that it is possible to determine the occupancy of indoor environments.
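A minimal sketch of such occupancy estimation is a nearest-centroid classifier over environmental readings. The feature choice here (CO2 ppm, temperature, relative humidity) and all values are illustrative assumptions, not the paper's actual sensor set:

```python
def train_centroids(samples):
    """Average feature vector per occupancy label (nearest-centroid model).

    samples: list of (features, label) pairs, where features is a tuple of
    environmental readings, e.g. (CO2 ppm, temperature C, humidity %).
    """
    sums, counts = {}, {}
    for feats, label in samples:
        acc = sums.setdefault(label, [0.0] * len(feats))
        for i, f in enumerate(feats):
            acc[i] += f
        counts[label] = counts.get(label, 0) + 1
    return {lbl: [s / counts[lbl] for s in acc] for lbl, acc in sums.items()}

def predict_occupancy(centroids, feats):
    """Return the label whose centroid is closest in squared Euclidean distance."""
    def dist2(lbl):
        return sum((a - b) ** 2 for a, b in zip(centroids[lbl], feats))
    return min(centroids, key=dist2)
```

A nearest-centroid model is deliberately cheap: training is one pass over the data and inference is a handful of arithmetic operations, which suits the low-power nodes described in the abstract.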


Electronics ◽  
2021 ◽  
Vol 10 (23) ◽  
pp. 2910
Author(s):  
Andreas Andreou ◽  
Constandinos X. Mavromoustakis ◽  
George Mastorakis ◽  
Jordi Mongay Batalla ◽  
Evangelos Pallis

Various research approaches to COVID-19 are currently being developed using machine learning (ML) techniques and edge computing, either to identify virus molecules or to anticipate the risk of COVID-19 spread. These efforts draw on datasets that derive either from the WHO, through its website and research portals, or from data generated in real time by the healthcare system. Data analysis, modelling and prediction are performed through multiple algorithmic techniques. The limited accuracy of these techniques motivates this research study, which modifies an existing machine learning technique to achieve more valuable forecasts. More specifically, this study modifies the Levenberg–Marquardt algorithm, which is commonly used to approach solutions to nonlinear least squares problems, to ingest data from IoT devices and analyse these data via cloud computing, generating foresight about the progress of the outbreak in real-time environments. We thereby improve the optimization of the trend line that interprets these data. Finally, we introduce this framework in conjunction with a novel encryption process that we propose for the datasets and with the implementation of mortality predictions.
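The Levenberg–Marquardt algorithm named in the abstract can be sketched for a toy two-parameter trend model y = a·exp(b·t). This is a minimal illustration of the standard algorithm under that assumed model; the paper's specific modification is not reproduced here:

```python
import math

def lm_fit_exponential(ts, ys, a=1.0, b=0.1, lam=1e-3, iters=100):
    """Fit y = a*exp(b*t) to (ts, ys) with Levenberg-Marquardt.

    The damping factor lam is increased when a trial step fails to reduce
    the squared error and relaxed when a step is accepted, blending
    gradient descent (large lam) with Gauss-Newton (small lam).
    """
    def sse(a_, b_):
        return sum((y - a_ * math.exp(b_ * t)) ** 2 for t, y in zip(ts, ys))

    err = sse(a, b)
    for _ in range(iters):
        # Accumulate gradient g = J^T r and curvature H = J^T J.
        g0 = g1 = h00 = h01 = h11 = 0.0
        for t, y in zip(ts, ys):
            e = math.exp(b * t)
            r = y - a * e
            j0, j1 = e, a * t * e          # df/da, df/db
            g0 += j0 * r; g1 += j1 * r
            h00 += j0 * j0; h01 += j0 * j1; h11 += j1 * j1
        # Solve the damped 2x2 normal equations (H + lam*diag(H)) d = g.
        m00, m11 = h00 * (1.0 + lam), h11 * (1.0 + lam)
        det = m00 * m11 - h01 * h01
        if det == 0.0:
            break
        da = (g0 * m11 - g1 * h01) / det
        db = (g1 * m00 - g0 * h01) / det
        try:
            new_err = sse(a + da, b + db)
        except OverflowError:              # wildly overshooting trial step
            new_err = float("inf")
        if new_err < err:                  # accept step, relax damping
            a, b, err, lam = a + da, b + db, new_err, lam / 3.0
        else:                              # reject step, damp harder
            lam *= 3.0
    return a, b
```

On clean data generated from a known (a, b), the fitted parameters recover the true values, which is the property the abstract relies on when optimizing the trend line.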


2021 ◽  
Author(s):  
K. Emma Knowland ◽  
Christoph Keller ◽  
Krzysztof Wargan ◽  
Brad Weir ◽  
Pamela Wales ◽  
...  

NASA's Global Modeling and Assimilation Office (GMAO) produces high-resolution global forecasts for weather, aerosols, and air quality. The NASA Global Earth Observing System (GEOS) model has been expanded to provide global near-real-time 5-day forecasts of atmospheric composition at an unprecedented horizontal resolution of 0.25 degrees (~25 km). This composition forecast system (GEOS-CF) combines the operational GEOS weather forecasting model with the state-of-the-science GEOS-Chem chemistry module (version 12) to provide detailed analysis of a wide range of air pollutants such as ozone, carbon monoxide, nitrogen oxides, and fine particulate matter (PM2.5). Satellite observations are assimilated into the system for improved representation of weather and smoke. The assimilation system is being expanded to include chemically reactive trace gases. We discuss current capabilities of the GEOS Constituent Data Assimilation System (CoDAS) to improve atmospheric composition modeling and possible future directions, notably incorporating new observations (TROPOMI, geostationary satellites) and machine learning techniques. We show how machine learning techniques can be used to correct for sub-grid-scale variability, which further improves model estimates at a given observation site.
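The site-level correction mentioned at the end of the abstract can be sketched as a model learned from past (model value, observation) pairs. This is a deliberately simple linear stand-in for the machine learning correction, which in the operational system uses far richer models and predictors:

```python
def fit_site_correction(model_vals, obs_vals):
    """Least-squares linear correction obs ~ w*model + c at one site.

    model_vals: collocated model estimates (e.g. forecast PM2.5);
    obs_vals: observations at the same site and times. Returns (w, c).
    """
    n = len(model_vals)
    mx = sum(model_vals) / n
    my = sum(obs_vals) / n
    sxx = sum((x - mx) ** 2 for x in model_vals)
    sxy = sum((x - mx) * (y - my) for x, y in zip(model_vals, obs_vals))
    w = sxy / sxx if sxx else 1.0
    return w, my - w * mx

def apply_correction(w, c, forecast):
    """Bias-correct a new forecast series using the fitted coefficients."""
    return [w * x + c for x in forecast]
```

The design point is that sub-grid-scale effects often show up as a systematic site-dependent bias, which even a per-site correction of this kind can remove from the gridded forecast.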


2021 ◽  
Author(s):  
Aurore Lafond ◽  
Maurice Ringer ◽  
Florian Le Blay ◽  
Jiaxu Liu ◽  
Ekaterina Millan ◽  
...  

Abnormal surface pressure is typically the first indicator of a number of problematic events, including kicks, losses, washouts and stuck pipe. These events account for 60–70% of all drilling-related nonproductive time, so their early and accurate detection has the potential to save the industry billions of dollars. Detecting these events today requires an expert user watching multiple curves, which can be costly and subject to human error. The solution presented in this paper aims to augment traditional models with new machine learning techniques that detect these events automatically and help monitor the drilling well. Today’s real-time monitoring systems employ complex physical models to estimate surface standpipe pressure while drilling. These require many inputs and are difficult to calibrate. Machine learning is an alternative method for predicting pump pressure, but on its own it needs substantial labelled training data, which is often lacking in the drilling world. The new system combines these approaches: a machine learning framework enables automated learning while the physical models compensate for any gaps in the training data. The system uses only standard surface measurements, is fully automated, and is continuously retrained while drilling to ensure the most accurate pressure prediction. In addition, a stochastic (Bayesian) machine learning technique is used, which yields not only a pressure prediction but also the uncertainty and confidence of that prediction. Last, the new system includes a data quality control workflow. It discards periods of low data quality from the pressure anomaly detection and enables smarter real-time event analysis. The new system has been tested on historical wells using a new test and validation framework.
The framework runs the system automatically on large volumes of both historical and simulated data, enabling cross-referencing of the results with observations. In this paper, we show the results of the automated test framework as well as the capabilities of the new system in two specific case studies, one on land and another offshore. Moreover, large-scale statistics demonstrate the reliability and efficiency of this new detection workflow. The new system builds on the trend in our industry to better capture and utilize digital data for optimizing drilling.
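The stochastic (Bayesian) prediction idea can be sketched with Bayesian linear regression, which returns a predictive variance alongside the mean. This is a minimal illustration under assumed hyperparameters (prior precision alpha, noise precision beta), not the paper's actual model:

```python
def bayes_linear_fit(xs, ys, alpha=1e-3, beta=25.0):
    """Bayesian linear regression y ~ w0 + w1*x with predictive variance.

    alpha: prior precision on the weights; beta: assumed noise precision
    (both illustrative). Returns a predict(x) -> (mean, variance) closure.
    """
    n = len(xs)
    # Posterior precision S_inv = alpha*I + beta * Phi^T Phi, Phi = [1, x].
    s00 = alpha + beta * n
    s01 = beta * sum(xs)
    s11 = alpha + beta * sum(x * x for x in xs)
    det = s00 * s11 - s01 * s01
    # Posterior covariance S = S_inv^{-1} (explicit 2x2 inverse).
    c00, c01, c11 = s11 / det, -s01 / det, s00 / det
    # Posterior mean m = beta * S * Phi^T y.
    t0, t1 = sum(ys), sum(x * y for x, y in zip(xs, ys))
    m0 = beta * (c00 * t0 + c01 * t1)
    m1 = beta * (c01 * t0 + c11 * t1)

    def predict(x):
        mean = m0 + m1 * x
        # Predictive variance = noise variance + weight-uncertainty term.
        var = 1.0 / beta + (c00 + 2.0 * c01 * x + c11 * x * x)
        return mean, var

    return predict
```

The operational payoff is the variance term: far from the training data it grows, so an anomaly detector can distinguish "the pressure is abnormal" from "the model is merely uncertain".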


2021 ◽  
Author(s):  
Anton Gryzlov ◽  
Liliya Mironova ◽  
Sergey Safonov ◽  
Muhammad Arsalan

Modern challenges in reservoir management have recently been met with new opportunities in production control and optimization strategies. These strategies in turn rely on the availability of monitoring equipment, which is used to obtain production rates in real time with sufficient accuracy. In particular, a multiphase flow meter is a device for measuring the individual rates of oil, gas and water from a well in real time without separating the fluid phases. Several technologies are currently available on the market, but multiphase flow meters are generally incapable of handling all ranges of operating conditions with satisfactory accuracy, in addition to being expensive to maintain. Virtual Flow Metering (VFM) is a mathematical technique for the indirect estimation of the oil, gas and water flowrates produced from a well. This method uses more readily available data from conventional sensors, such as downhole pressure and temperature gauges, and calculates the multiphase rates by combining physical multiphase models, various measurement data and an optimization algorithm. In this work, a brief overview of virtual metering methods is presented, followed by the application of several advanced machine-learning techniques to a specific case of multiphase production monitoring in a highly dynamic wellbore. The predictive capabilities of different types of machine learning instruments are explored using model-simulated production data. The effect of measurement noise on the quality of the estimates is also considered. The presented results demonstrate that the data-driven methods can predict multiphase flow rates with sufficient accuracy and can be considered a back-up solution for a conventional multiphase meter.
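The physics-plus-data idea behind a VFM can be sketched with a toy single-phase model Q = C·sqrt(dP), calibrated against reference-meter data. Real VFMs combine full multiphase models with optimization; the model form and all values here are illustrative:

```python
import math

def calibrate_vfm(dp_samples, q_samples):
    """Least-squares calibration of the toy model Q = C * sqrt(dP).

    dp_samples: pressure-drop readings from conventional sensors;
    q_samples: reference flow rates (e.g. from a test separator).
    Returns the fitted coefficient C.
    """
    num = sum(math.sqrt(dp) * q for dp, q in zip(dp_samples, q_samples))
    den = sum(dp_samples)  # since (sqrt(dp))^2 = dp
    return num / den

def estimate_rate(c, dp):
    """Virtual flow-rate estimate from a new pressure-drop reading."""
    return c * math.sqrt(dp)
```

Once calibrated, the virtual meter turns cheap, always-available pressure readings into rate estimates, which is the back-up role the abstract describes.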


Sensors ◽  
2020 ◽  
Vol 20 (3) ◽  
pp. 800 ◽  
Author(s):  
Irshad Khan ◽  
Seonhwa Choi ◽  
Young-Woo Kwon

Detecting earthquakes using smartphones or IoT devices in real time is an arduous and challenging task, not only because it is constrained by hard real-time requirements but also because of the similarity between earthquake signals and non-earthquake signals (i.e., noise or other activities). Moreover, the variety of human activities makes detection even more difficult when a smartphone is used as an earthquake-detecting sensor. To that end, in this article, we leverage a machine learning technique with earthquake features rather than traditional seismic methods. First, we split the detection task into two categories: a static environment and a dynamic environment. Then, we experimentally evaluate different features and propose the most appropriate machine learning model and features for the static environment to tackle the issue of noisy components and detect earthquakes in real time with lower false-alarm rates. The proposed model shows promising results not only on the given dataset but also on unseen data, pointing to its generalization characteristics. Finally, we demonstrate that the proposed model can also be used in the dynamic environment if it is trained with a different dataset.
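The feature-based approach can be sketched by computing a few cheap waveform statistics over each window of acceleration samples before classification. The exact feature set and the sampling interval `dt` here are illustrative assumptions, not the paper's published configuration:

```python
def window_features(acc, dt=0.01):
    """Compute simple waveform features over one acceleration window.

    acc: list of acceleration samples; dt: assumed sampling interval in
    seconds. Returns (interquartile range, zero-crossing rate, cumulative
    absolute value of the signal) -- the kind of lightweight features a
    smartphone can compute in real time.
    """
    s = sorted(acc)
    n = len(s)
    iqr = s[(3 * n) // 4] - s[n // 4]
    # Fraction of consecutive sample pairs that change sign.
    zc = sum(1 for a, b in zip(acc, acc[1:]) if a * b < 0) / max(len(acc) - 1, 1)
    cav = sum(abs(a) for a in acc) * dt
    return iqr, zc, cav
```

Features like these separate sustained broadband shaking from short, sign-consistent bumps caused by handling the phone, which is the noisy-component problem the abstract targets.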


Author(s):  
Simen Eldevik ◽  
Stian Sætre ◽  
Erling Katla ◽  
Andreas B. Aardal

Operators of offshore floating drilling units have limited time to decide whether a drilling operation can continue as planned or needs to be postponed or aborted due to oncoming bad weather. With day-rates of several hundred thousand USD, small delays in the original schedule can amount to considerable costs. On the other hand, pushing the limits of the load capacity of the riser stack and wellhead may compromise the integrity of the well itself, and such a failure is not an option. Advanced simulation techniques may reduce uncertainty about how different weather scenarios influence the system’s integrity, and thus increase the acceptable weather window considerably. However, real-time simulations are often not feasible, and the stochastic behavior of wave loads makes it difficult to simulate all relevant weather scenarios prior to the operation. This paper outlines and demonstrates an approach that utilizes probabilistic machine learning techniques to effectively reduce uncertainty. More specifically, we use Gaussian process regression to enable fast approximation of the relevant structural response from complex simulations. The probabilistic nature of the method adds the benefit of an estimated uncertainty in each prediction, which can be utilized to optimize how the initial set of simulation scenarios is selected and, combined with current weather forecasts, to produce real-time estimates of the utilization and its uncertainty. This gives operators an up-to-date forecast of the system’s utilization, as well as sufficient time to trigger additional scenario-specific simulations to reduce the uncertainty of the current situation. As a result, it reduces unnecessary conservatism and gives clear decision support for critical situations.
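Gaussian process regression as a simulation surrogate can be sketched as follows. This is a minimal stand-in for the paper's approach: the training points would come from a handful of expensive simulations, and the kernel hyperparameters here are fixed rather than optimized:

```python
import numpy as np

def gp_fit(X, y, length=1.0, noise=1e-6):
    """Gaussian process regression with an RBF kernel (1D inputs).

    X, y: training inputs/outputs (e.g. sea state -> simulated structural
    response). Returns predict(x_star) -> (mean, std); the std is the
    quantity used to decide where the next simulation is most valuable.
    """
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float)

    def kernel(A, B):
        d2 = (A[:, None] - B[None, :]) ** 2
        return np.exp(-0.5 * d2 / length**2)

    K = kernel(X, X) + noise * np.eye(len(X))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))

    def predict(x_star):
        x_star = np.asarray(x_star, dtype=float)
        ks = kernel(X, x_star)              # shape (n_train, n_star)
        mean = ks.T @ alpha
        v = np.linalg.solve(L, ks)
        var = 1.0 - np.sum(v * v, axis=0)   # prior k(x,x) = 1 for RBF
        return mean, np.sqrt(np.maximum(var, 0.0))

    return predict
```

Near training points the predictive standard deviation collapses, while far from them it approaches the prior, which is exactly the signal used to trigger additional scenario-specific simulations.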

