Machine Learning Based IoT Geriatric Fall Intelligent System in Pandemic (Preprint)

2021 ◽  
Author(s):  
Gowri R ◽  
Rathipriya R

UNSTRUCTURED In the current pandemic, there is a shortage of medical caretakers and physicians in hospitals and health centers. Patients other than those infected with COVID are also affected by this scenario. Moreover, hospitals are reluctant to admit elderly people, who in turn are afraid to approach hospitals even for basic health checkups. Yet they must be cared for and monitored to mitigate risk factors such as fall incidence, which can cause fatal injury. This paper therefore focuses on a cloud-based IoT gadget for early prediction of fall incidence: a machine learning based fall-incidence prediction system for elderly patients. Approaches such as Logistic Regression, Naive Bayes, Stochastic Gradient Descent, Decision Tree, Random Forest, Support Vector Machines, K-Nearest Neighbor, and an ensemble learning boosting technique, XGBoost, are used for fall-incidence prediction. The proposed approach is first trained and tested on benchmark activity sensor data with different features. Real-time vital signs such as heart rate and blood pressure are recorded and stored in the cloud, and the machine learning approaches are applied to them. The models are then tested on real-time sensor data, i.e., the heart rate and blood pressure of geriatric patients, to predict falls early.
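The simplest of the classifiers this abstract lists can be sketched in a few lines. Purely as a hedged illustration (the vital-sign features, thresholds, and synthetic labels below are assumptions, not the authors' dataset or model), a logistic regression trained by gradient descent on invented heart-rate/blood-pressure data:

```python
import numpy as np

# Hypothetical illustration: logistic regression for fall-risk
# prediction from two vital signs. Synthetic data stands in for the
# benchmark/cloud sensor data; this is NOT the paper's dataset.
rng = np.random.default_rng(0)
n = 200
X = np.column_stack([
    rng.normal(75, 10, n),   # heart rate (bpm)
    rng.normal(120, 15, n),  # systolic blood pressure (mmHg)
])
# Assumed rule for the synthetic labels: elevated HR and BP -> fall risk.
y = ((X[:, 0] > 80) & (X[:, 1] > 125)).astype(float)

# Standardize features and add a bias column.
Xs = (X - X.mean(0)) / X.std(0)
Xb = np.column_stack([np.ones(n), Xs])

w = np.zeros(3)
for _ in range(2000):                      # batch gradient descent
    p = 1.0 / (1.0 + np.exp(-Xb @ w))      # sigmoid predictions
    w -= 0.1 * Xb.T @ (p - y) / n          # log-loss gradient step

pred = (1.0 / (1.0 + np.exp(-Xb @ w)) > 0.5).astype(float)
accuracy = (pred == y).mean()
```

In the paper's setup, each of the listed algorithms would be fitted and compared on the same features in this way.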

2021 ◽  
Author(s):  
S. H. Al Gharbi ◽  
A. A. Al-Majed ◽  
A. Abdulraheem ◽  
S. Patil ◽  
S. M. Elkatatny

Abstract Due to the high demand for energy, oil and gas companies started to drill wells in remote areas and unconventional environments. This raised the complexity of drilling operations, which were already challenging and complex. To adapt, drilling companies expanded their use of the real-time operation center (RTOC) concept, in which real-time drilling data are transmitted from remote sites to companies' headquarters. In an RTOC, groups of subject matter experts monitor the drilling live and provide real-time advice to improve operations. With the increase in drilling operations, processing the volume of generated data is beyond human capability, limiting the RTOC's impact on certain components of drilling operations. To overcome this limitation, artificial intelligence and machine learning (AI/ML) technologies were introduced to monitor and analyze the real-time drilling data, discover hidden patterns, and provide fast decision-support responses. AI/ML technologies are data-driven, and their quality relies on the quality of the input data: if the quality of the input data is good, the generated output will be good; if not, it will be bad. Unfortunately, due to the harsh environments of drilling sites and the transmission setups, not all of the drilling data is good, which negatively affects the AI/ML results. The objective of this paper is to utilize AI/ML technologies to improve the quality of real-time drilling data. The paper fed a large real-time drilling dataset, consisting of over 150,000 raw data points, into Artificial Neural Network (ANN), Support Vector Machine (SVM), and Decision Tree (DT) models. The models were trained on the valid and invalid data points. A confusion matrix was used to evaluate the different AI/ML models, including different internal architectures. Despite its slowness, the ANN achieved the best result, with an accuracy of 78%, compared to 73% and 41% for DT and SVM, respectively.
The paper concludes by presenting a process for using AI technology to improve real-time drilling data quality. To the authors' knowledge, based on the literature in the public domain, this paper is one of the first to compare multiple AI/ML techniques for quality improvement of real-time drilling data. The paper provides a guide for improving the quality of real-time drilling data.
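The confusion-matrix evaluation mentioned above reduces to a few ratios. A minimal sketch for a binary valid/not-valid data-quality classifier (the counts below are invented for illustration; they are not the paper's results):

```python
# Hypothetical confusion-matrix evaluation for a binary
# valid/not-valid data-quality classifier. The counts passed in
# below are invented; they are not the paper's results.
def confusion_metrics(tp, fp, fn, tn):
    total = tp + fp + fn + tn
    accuracy = (tp + tn) / total
    precision = tp / (tp + fp)   # of points flagged valid, fraction truly valid
    recall = tp / (tp + fn)      # of truly valid points, fraction flagged valid
    f1 = 2 * precision * recall / (precision + recall)
    return accuracy, precision, recall, f1

acc, prec, rec, f1 = confusion_metrics(tp=70, fp=10, fn=12, tn=58)
```

Comparing models by accuracy alone, as the reported 78%/73%/41% figures do, is one cell of this picture; precision and recall expose how each model trades false alarms against missed invalid points.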


2017 ◽  
Vol 10 (2) ◽  
pp. 130-144 ◽  
Author(s):  
Iwan Aang Soenandi ◽  
Taufik Djatna ◽  
Ani Suryani ◽  
Irzaman Irzaman

Purpose The production of glycerol derivatives by the esterification process is subject to many constraints related to the yield of the production target and the lack of process efficiency. Accurate monitoring and control of the process can improve production yield and efficiency. The purpose of this paper is to propose real-time optimization (RTO) using gradient adaptive selection and classification from infrared sensor measurements to cover various disturbances and uncertainties in the reactor. Design/methodology/approach The integration of esterification process optimization using self-optimization (SO) was developed by combining a classification process with the necessary condition of optimality (NCO) as gradient adaptive selection, supported by laboratory-scale medium-wavelength infrared (mid-IR) sensors, and the indicators of the proposed optimization system were measured in the batch process. Business Process Modeling and Notation (BPMN 2.0) was used to describe the tasks of the SO workflow in collaboration with NCO as an abstraction for the conceptual phase. Next, Stateflow modeling was deployed to simulate the three states of gradient-based adaptive control, combined with support vector machine (SVM) classification and an Arduino microcontroller for implementation. Findings This new method shows that the responsiveness of real-time optimization control increased product yield by up to 13 percent, lowered measurement error to 1.11 percent, and reduced the process duration by up to 22 minutes, with an effective stirrer rotation range of 300 to 400 rpm and a final temperature between 200 and 210°C, which was more efficient as it consumed less energy. Research limitations/implications In this research the authors experimented only with the esterification process using glycerol, but as a development of the RTO concept it would be possible to apply it to other chemical reactions or systems.
Practical implications This research introduces a new development of the RTO approach to optimal control and as such marks a starting point for further research into its properties. As the methodology is generic, it can be applied to different optimization problems for batch systems in the chemical industries. Originality/value The paper is original in that it presents the first application of adaptive selection based on the gradient value of mid-IR sensor data, applied to determining the real-time control state by classification with the SVM algorithm for esterification process control to increase efficiency.
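The gradient-based, three-state selection idea can be illustrated in miniature: the slope of a sensor signal decides which control state is active. The state names, thresholds, and simulated readings below are assumptions for illustration only, not the authors' Stateflow design or SVM classifier:

```python
# Hypothetical three-state gradient-adaptive selection: the slope of
# a (simulated) mid-IR absorbance signal decides the control state.
# State names and thresholds are assumptions, not the paper's design.
def select_state(gradient, rising=0.05, settled=0.01):
    if gradient > rising:
        return "heating"        # reaction progressing, keep driving it
    if abs(gradient) <= settled:
        return "steady"         # near optimum, hold current conditions
    return "adjust"             # drifting, correct stirrer/temperature

readings = [0.10, 0.20, 0.31, 0.38, 0.41, 0.415, 0.414, 0.40]
gradients = [b - a for a, b in zip(readings, readings[1:])]
states = [select_state(g) for g in gradients]
```

In the paper's approach an SVM classifier, rather than fixed thresholds like these, maps the gradient features to the control state.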


Sensors ◽  
2021 ◽  
Vol 21 (8) ◽  
pp. 2801
Author(s):  
Hasan Asy’ari Arief ◽  
Tomasz Wiktorski ◽  
Peter James Thomas

Real-time monitoring of multiphase fluid flows with distributed fibre optic sensing has the potential to play a major role in industrial flow measurement applications. One such application is the optimization of hydrocarbon production to maximize short-term income and prolong the operational lifetime of production wells and the reservoir. While the measurement technology itself is well understood and developed, a key remaining challenge is the establishment of robust data analysis tools capable of providing real-time conversion of enormous data quantities into actionable process indicators. This paper provides a comprehensive technical review of the data analysis techniques for distributed fibre optic technologies, with a particular focus on characterizing fluid flow in pipes. The review encompasses classical methods, such as speed of sound estimation and the Joule-Thomson coefficient, as well as their data-driven machine learning counterparts, such as Convolutional Neural Network (CNN), Support Vector Machine (SVM), and Ensemble Kalman Filter (EnKF) algorithms. The study aims to help end-users establish reliable, robust, and accurate solutions that can be deployed in a timely and effective way, and to pave the way for future developments in the field.
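One of the classical techniques named above, estimating a propagation speed between two sensing points, commonly rests on cross-correlation of the two signals. A hedged sketch with synthetic data (the sampling rate, sensor spacing, and delay are invented; this is not any specific system from the review):

```python
import numpy as np

# Hedged sketch: estimate the propagation delay between two sensing
# points by cross-correlation, then convert it to a speed as
# distance / delay. Signals, spacing, and sampling rate are invented.
fs = 1000.0                      # samples per second (assumed)
spacing = 2.0                    # metres between sensing points (assumed)
rng = np.random.default_rng(1)

signal = rng.normal(size=500)
delay_samples = 25               # ground-truth delay for this synthetic test
upstream = signal
downstream = np.concatenate([np.zeros(delay_samples), signal[:-delay_samples]])

# Full cross-correlation; the lag at its peak estimates the delay.
corr = np.correlate(downstream, upstream, mode="full")
lag = np.argmax(corr) - (len(signal) - 1)
speed = spacing / (lag / fs)     # metres per second
```

With real fibre data the same pattern applies, but the peak is broadened by noise and dispersion, which is one reason the review contrasts these classical estimators with learned (CNN/SVM/EnKF) alternatives.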


2020 ◽  
Vol 26 (9) ◽  
pp. 1128-1147
Author(s):  
Ranjan Behera ◽  
Sushree Das ◽  
Santanu Rath ◽  
Sanjay Misra ◽  
Robertas Damasevicius

Stock prediction is one of the emerging applications in the field of data science that helps companies devise better decision strategies. Machine learning models play a vital role in the field of prediction. In this paper, we propose various machine learning models that predict the stock price from real-time streaming data. Streaming data has been a potential source for real-time prediction, dealing with a continuous flow of data carrying information from various sources such as social networking websites, server logs, mobile phone applications, trading floors, etc. We adopted the distributed platform Spark to analyze the streaming data collected from two different sources, as represented in two case studies in this paper. The first case study is based on stock prediction from historical data collected from the Google Finance website through NodeJs, and the second is based on sentiment analysis of Twitter data collected through the Twitter API and analyzed with the Stanford NLP package. Several studies have developed models for stock prediction based on static data. In this work, an effort has been made to develop scalable, fault-tolerant models for stock prediction from real-time streaming data. The proposed model is based on a distributed architecture known as the Lambda architecture. An extensive comparison is made between actual and predicted output for the different machine learning models. Support vector regression is found to have better accuracy than the other models. The historical data is considered ground-truth data for validation.
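The Lambda architecture named above combines a batch view over all historical data with a speed view over the most recent stream, merged at query time. A toy, pure-Python sketch of that split (the prices are invented; this is not the paper's Spark/NodeJs implementation):

```python
# Toy Lambda-architecture sketch: a batch layer precomputes a view
# over historical prices, a speed layer folds in recently streamed
# ticks, and the serving layer merges both views. All prices are
# invented; this is not the paper's Spark implementation.
def batch_view(historical):
    # Expensive, periodic recomputation over all historical data.
    return {"count": len(historical), "total": sum(historical)}

def speed_view(stream):
    # Cheap incremental state over data arriving since the last batch run.
    return {"count": len(stream), "total": sum(stream)}

def serving_layer(batch, speed):
    # Merge the two views to answer queries over ALL data seen so far.
    count = batch["count"] + speed["count"]
    total = batch["total"] + speed["total"]
    return total / count         # running mean price

historical = [101.0, 102.5, 99.5, 100.0]
recent_stream = [103.0, 104.0]
mean_price = serving_layer(batch_view(historical), speed_view(recent_stream))
```

The design choice is fault tolerance: if the speed layer loses state, the next batch recomputation restores correctness, which matches the scalable, fault-tolerant goal stated in the abstract.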


2020 ◽  
Author(s):  
Nalika Ulapane ◽  
Karthick Thiyagarajan ◽  
Sarath Kodagoda

<div>Classification has become a vital task in modern machine learning and Artificial Intelligence applications, including smart sensing. Numerous machine learning techniques are available to perform classification. Similarly, numerous practices, such as feature selection (i.e., selection of a subset of descriptor variables that optimally describe the output), are available to improve classifier performance. In this paper, we consider the case of a given supervised learning classification task that has to be performed making use of continuous-valued features. It is assumed that an optimal subset of features has already been selected. Therefore, no further feature reduction, or feature addition, is to be carried out. Then, we attempt to improve the classification performance by passing the given feature set through a transformation that produces a new feature set which we have named the “Binary Spectrum”. Via a case study example done on some Pulsed Eddy Current sensor data captured from an infrastructure monitoring task, we demonstrate how the classification accuracy of a Support Vector Machine (SVM) classifier increases through the use of this Binary Spectrum feature, indicating the feature transformation’s potential for broader usage.</div>
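The "Binary Spectrum" transformation itself is defined in the paper. Purely to illustrate the general idea of expanding a continuous feature into a binary vector, here is a thermometer-style encoding; this encoding is an assumption for illustration, NOT the authors' definition:

```python
import numpy as np

# Hypothetical thermometer-style binarization of a continuous
# feature: one bit per threshold level. This illustrates the general
# idea of a binary feature expansion that a downstream SVM could
# consume; it is NOT the paper's "Binary Spectrum" definition.
def binarize(values, low, high, n_bits):
    thresholds = np.linspace(low, high, n_bits, endpoint=False)
    # Each row: 1 where the value exceeds that threshold level.
    return (np.asarray(values)[:, None] > thresholds[None, :]).astype(int)

bits = binarize([0.1, 0.5, 0.9], low=0.0, high=1.0, n_bits=4)
```

Expansions of this kind can make class boundaries that are nonlinear in the raw feature linearly separable in the expanded space, which is one plausible mechanism for the accuracy gain the abstract reports.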


Sensors ◽  
2021 ◽  
Vol 21 (14) ◽  
pp. 4655
Author(s):  
Dariusz Czerwinski ◽  
Jakub Gęca ◽  
Krzysztof Kolano

In this article, the authors propose two models for BLDC motor winding temperature estimation using machine learning methods. For the purposes of the research, measurements were made over more than 160 h of motor operation and then preprocessed. The algorithms of linear regression, ElasticNet, stochastic gradient descent regressor, support vector machines, decision trees, and AdaBoost were used for predictive modeling. The models' ability to generalize was ensured by hyperparameter tuning with cross-validation. The research led to promising winding temperature estimation accuracy. In the case of sensorless temperature prediction (model 1), the mean absolute percentage error (MAPE) was below 4.5% and the coefficient of determination R2 was above 0.909. In addition, extending the model with a temperature measurement on the casing (model 2) reduced the error to about 1% and increased R2 to 0.990. The results obtained for the first proposed model show that overheating protection of the motor can be ensured without direct temperature measurement. In addition, the introduction of a simple casing temperature measurement system allows an estimation accuracy suitable for compensating motor output torque changes related to temperature.
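The two reported metrics have simple definitions worth keeping at hand. A minimal numpy sketch (the temperature values below are invented, not the measured winding data):

```python
import numpy as np

# The two metrics reported above, applied to invented values standing
# in for measured vs. predicted winding temperatures (degrees C).
def mape(y_true, y_pred):
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    return 100.0 * np.mean(np.abs((y_true - y_pred) / y_true))

def r2(y_true, y_pred):
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    ss_res = np.sum((y_true - y_pred) ** 2)          # residual sum of squares
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)   # total sum of squares
    return 1.0 - ss_res / ss_tot

measured  = [40.0, 50.0, 60.0, 70.0]
predicted = [41.0, 49.0, 61.0, 69.5]
err = mape(measured, predicted)
fit = r2(measured, predicted)
```

Note that MAPE is scale-relative, so the same absolute error weighs more at low winding temperatures, which is worth keeping in mind when comparing the sub-4.5% and circa-1% figures.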


Author(s):  
Negin Yousefpour ◽  
Steve Downie ◽  
Steve Walker ◽  
Nathan Perkins ◽  
Hristo Dikanski

Bridge scour is a challenge throughout the U.S.A. and other countries. Despite the scale of the issue, there is still a substantial lack of robust methods for scour prediction to support reliable, risk-based management and decision making. Throughout the past decade, the use of real-time scour monitoring systems has gained increasing interest among state departments of transportation across the U.S.A. This paper introduces three distinct methodologies for scour prediction using advanced artificial intelligence (AI)/machine learning (ML) techniques based on real-time scour monitoring data. The scour monitoring data comprised riverbed and river stage elevation time series at bridge piers gathered from various sources. Deep learning algorithms showed promise in predicting bed elevation and water level variations as early as a week in advance. Ensemble neural networks proved successful in predicting the maximum upcoming scour depth, using the observed sensor data at the onset of a scour episode together with bridge pier, flow, and riverbed characteristics. In addition, two common empirical scour models were calibrated against the observed sensor data using Bayesian inference, showing significant improvement in prediction accuracy. Overall, this paper introduces a novel approach to scour risk management by integrating emerging AI/ML algorithms with real-time monitoring systems for early scour forecasting.
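Bayesian calibration of an empirical model against sensor data can be shown in miniature with a grid posterior over a single coefficient. The model form, noise level, and "observed" depths below are invented for illustration; this is not the paper's calibration of the actual empirical scour equations:

```python
import numpy as np

# Toy Bayesian calibration: grid posterior over a multiplicative
# coefficient k in a made-up scour relation depth = k * sqrt(velocity),
# given noisy "observed" depths. Data, model form, and noise level
# are invented; this is not the paper's calibration.
velocity = np.array([1.0, 2.0, 3.0, 4.0])        # flow velocities (m/s)
observed = np.array([2.1, 2.9, 3.4, 4.1])        # invented sensor depths (m)
sigma = 0.3                                      # assumed noise std (m)

k_grid = np.linspace(0.5, 4.0, 351)              # uniform prior over the grid
log_lik = np.array([
    -0.5 * np.sum(((observed - k * np.sqrt(velocity)) / sigma) ** 2)
    for k in k_grid
])
posterior = np.exp(log_lik - log_lik.max())      # subtract max for stability
posterior /= posterior.sum()                     # normalize to a distribution
k_map = k_grid[np.argmax(posterior)]             # posterior mode
```

The same pattern, with more parameters and MCMC instead of a grid, yields calibrated coefficients plus uncertainty bands rather than a single point estimate, which is the practical payoff of the Bayesian route over ordinary curve fitting.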


2021 ◽  
Author(s):  
Arturo Magana-Mora ◽  
Mohammad AlJubran ◽  
Jothibasu Ramasamy ◽  
Mohammed AlBassam ◽  
Chinthaka Gooneratne ◽  
...  

Abstract Objective/Scope. Lost circulation events (LCEs) are among the top causes of drilling nonproductive time (NPT). The presence of natural fractures and vugular formations causes loss of drilling fluid circulation. Drilling depleted zones with incorrect mud weights can also lead to drilling-induced losses. LCEs can further develop into additional drilling hazards, such as stuck pipe incidents, kicks, and blowouts. An LCE is traditionally diagnosed only when there is a reduction of mud volume in the mud pits, in the case of moderate losses, or a reduction of the mud column in the annulus, in total losses. Using machine learning (ML) to predict the presence of a loss zone and estimate fracture parameters ahead is very beneficial, as it can immediately alert the drilling crew so they can take the actions required to mitigate or cure LCEs. Methods, Procedures, Process. Although different computational methods have been proposed for the prediction of LCEs, there is a need to further improve the models and reduce the number of false alarms. Robust and generalizable ML models require a sufficiently large amount of data that captures the different parameters and scenarios representing an LCE. For this, we derived a framework that automatically searches through historical data, locates LCEs, and extracts the surface drilling and rheology parameters surrounding such events. Results, Observations, and Conclusions. We derived different ML models utilizing various algorithms and evaluated them using the data-split technique at the level of wells to find the most suitable model for the prediction of an LCE. From the model comparison, the random forest classifier achieved the best results and successfully predicted LCEs before they occurred. The developed LCE model is designed to be implemented in the real-time drilling portal as an aid to the drilling engineers and rig crew to minimize or avoid NPT. Novel/Additive Information.
The main contribution of this study is the analysis of real-time surface drilling parameters and sensor data to predict an LCE from a statistically representative number of wells. The large-scale analysis of several wells that appropriately describe the different conditions before an LCE is critical for avoiding model undertraining or lack of model generalization. Finally, we formulated the prediction of LCEs as a time-series problem and considered parameter trends to accurately determine the early signs of LCEs.
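Framing LCE prediction as a time-series problem typically means turning parameter trends into windowed features that a classifier such as random forest can consume. A minimal sketch (the parameter choice, window length, and pressure values are assumptions for illustration, not the paper's feature set):

```python
# Minimal sliding-window feature extraction for time-series event
# prediction: each window over a surface drilling parameter (here an
# invented standpipe-pressure trace) yields mean and trend features.
# Window length and parameter choice are assumptions for illustration.
def window_features(series, width):
    feats = []
    for i in range(len(series) - width + 1):
        w = series[i:i + width]
        mean = sum(w) / width
        trend = w[-1] - w[0]      # simple slope proxy over the window
        feats.append((mean, trend))
    return feats

standpipe_pressure = [3000, 3005, 3002, 2990, 2960, 2915]  # invented (psi)
features = window_features(standpipe_pressure, width=3)
```

A steepening negative trend across consecutive windows, as in this toy trace, is the kind of early signature a trained classifier would flag before the loss becomes visible in the mud pits.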


Author(s):  
Yourui Tong ◽  
Bochen Jia ◽  
Yi Wang ◽  
Si Yang

To help automated vehicles learn about surrounding environments via V2X communications, it is important to detect and transfer pedestrian situation awareness to the related vehicles. Based on the characteristics of pedestrians, a real-time algorithm was developed to detect pedestrian situation awareness. In the study, heart rate variability (HRV) and phone position were used to understand the mental state and distractions of pedestrians. HRV analysis was used to detect the fatigue and alert states of the pedestrian, and phone position was used to define the pedestrian's phone distractions. A Support Vector Machine algorithm was used to classify the pedestrian's mental state. The results indicated good performance, with 86% prediction accuracy. The developed algorithm shows high applicability for detecting a pedestrian's situation awareness in real time, which would further extend our understanding of V2X deployment and automated vehicle design.
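HRV analysis of the kind described usually starts from beat-to-beat (RR) intervals, from which standard features such as SDNN and RMSSD are computed and fed to the classifier. A minimal sketch of those two features (the interval values are invented, and the paper's fatigue/alert decision rule is not reproduced here):

```python
import math

# Two standard HRV features computed from beat-to-beat (RR) intervals
# in milliseconds. The interval values are invented; the SVM decision
# rule from the paper is not reproduced here.
def sdnn(rr):
    # Standard deviation of the RR intervals (overall variability).
    mean = sum(rr) / len(rr)
    return math.sqrt(sum((x - mean) ** 2 for x in rr) / len(rr))

def rmssd(rr):
    # Root mean square of successive differences (short-term variability).
    diffs = [b - a for a, b in zip(rr, rr[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

rr_ms = [812, 798, 825, 840, 810]        # invented RR intervals (ms)
features = (sdnn(rr_ms), rmssd(rr_ms))
```

Features like these, concatenated with a phone-position indicator, form the kind of input vector an SVM could classify into mental states in real time.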


2017 ◽  
Vol 8 (2) ◽  
pp. 88-105 ◽  
Author(s):  
Gunasekaran Manogaran ◽  
Daphne Lopez

Ambient intelligence is an emerging platform that combines advances in sensors and sensor networks, pervasive computing, and artificial intelligence to capture real-time climate data. This continuously generates several exabytes of unstructured sensor data, which is therefore often called big climate data. Nowadays, researchers are trying to use big climate data to monitor and predict climate change and possible diseases. Traditional data processing techniques and tools are not capable of handling such huge amounts of climate data. Hence, there is a need to develop an advanced big data architecture for processing real-time climate data. The purpose of this paper is to propose a big data based surveillance system that analyzes spatial climate big data and performs continuous monitoring of the correlation between climate change and dengue. The proposed disease surveillance system has been implemented with the help of Apache Hadoop MapReduce and its supporting tools.
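The MapReduce pattern behind such a surveillance pipeline can be sketched in miniature: mappers emit per-record partial sums, and a reducer combines them into the correlation statistic. All numbers below are invented, and this pure-Python sketch is not the authors' Hadoop job:

```python
import math

# Miniature MapReduce-style sketch: mappers emit partial sums per
# record, and a reducer combines them into the Pearson correlation
# between a climate variable (rainfall) and dengue case counts.
# All numbers are invented; this is not the authors' Hadoop job.
def mapper(record):
    x, y = record
    return (1, x, y, x * x, y * y, x * y)   # partial sums for one record

def reducer(partials):
    n, sx, sy, sxx, syy, sxy = [sum(col) for col in zip(*partials)]
    num = n * sxy - sx * sy
    den = math.sqrt((n * sxx - sx * sx) * (n * syy - sy * sy))
    return num / den

# Invented (rainfall mm, dengue cases) pairs for five regions.
records = [(120, 30), (150, 42), (90, 21), (200, 60), (60, 15)]
correlation = reducer(map(mapper, records))
```

Because the per-record tuples are associative partial sums, they can be combined in any order across nodes, which is exactly what makes the computation MapReduce-friendly at exabyte scale.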

