SoilMAP: An Open Source Python Library for Developing Algorithms and Specialized User Interfaces that Integrate Multiple Disparate Data Sources Including Near-Real-Time Sensor Data for Streamlined Monitoring of Experiments and Analysis.

2020 ◽  
Author(s):  
Jerry Bieszczad ◽  
Mattheus Ueckermann ◽  
Rachel Gilmore ◽  
Marek Zreda
Sensors ◽  
2021 ◽  
Vol 21 (9) ◽  
pp. 2944
Author(s):  
Benjamin James Ralph ◽  
Marcel Sorger ◽  
Benjamin Schödinger ◽  
Hans-Jörg Schmölzer ◽  
Karin Hartl ◽  
...  

Smart factories are an integral element of the manufacturing infrastructure in the context of the fourth industrial revolution. Nevertheless, academic environments frequently lack adequate training facilities for future engineering experts. For this reason, this paper describes the development and implementation of two different layer architectures for the metal processing environment. The first architecture is based on low-cost but resilient devices, allowing interested parties to work with mostly open-source interfaces and standard back-end programming environments. Additionally, one proprietary and two open-source graphical user interfaces (GUIs) were developed. These interfaces can be adapted on both the front end and the back end, ensuring a holistic understanding of their capabilities and limits. As a result, a six-layer architecture, from digitization to an interactive project management tool, was designed and implemented in the practical workflow at the academic institution. To account for the complexity of thermo-mechanical processing in the metal processing field, an alternative layer, connected with the thermo-mechanical treatment simulator Gleeble 3800, was designed. This framework is capable of transferring sensor data at high frequency, enabling data collection for the numerical simulation of complex material behavior during high-temperature processing. Finally, the possibility of connecting both systems using open-source software packages is demonstrated.
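The high-frequency transfer layer described above can be pictured as a batching buffer that sits between data acquisition and the simulation layer. This is a minimal sketch, not the paper's implementation; the class name, batch size, and the synthetic temperature trace are all illustrative assumptions:

```python
# Hypothetical sketch: batching high-frequency sensor samples so a
# downstream simulation layer receives fixed-size chunks instead of
# single readings. Names and rates are illustrative, not from the paper.
from collections import deque

class SensorBuffer:
    """Collects high-frequency samples and releases fixed-size batches."""
    def __init__(self, batch_size=100):
        self.batch_size = batch_size
        self._samples = deque()
        self.batches = []          # batches ready for the simulation layer

    def push(self, timestamp, value):
        self._samples.append((timestamp, value))
        if len(self._samples) >= self.batch_size:
            self.batches.append(list(self._samples))
            self._samples.clear()

buf = SensorBuffer(batch_size=4)
for t in range(10):
    buf.push(t, 20.0 + 0.5 * t)   # e.g. a temperature trace at a fixed rate

print(len(buf.batches))           # prints 2; two samples remain buffered
```

Batching like this trades a small latency (up to one batch) for far fewer transfer operations, which is the usual compromise when streaming high-frequency data between layers.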


2021 ◽  
Author(s):  
Ashabikash Roy Chowdhury ◽  
Matthew Forshaw ◽  
Narender Atwal ◽  
Matthias Gatzen ◽  
Salman Habib ◽  
...  

Abstract In today's increasingly complex and cost-sensitive drilling environment, data gathered by downhole and surface real-time sensor systems must work in unison with physics-based models to provide early indication of drilling hazards, allowing timely action and mitigation. Identification of opportunities to reduce invisible lost time (ILT) is similarly critical. Many comparable systems gather and analyze either surface or downhole data on a standalone basis, but lack an integrated approach to using the data in a holistic decision-making manner. Such systems can either paint an incomplete picture of prevailing drilling conditions or fail to ensure that system messages result in parameter changes at the rigsite, often producing a hit-or-miss approach to identifying and mitigating drilling problems. The automated software system architecture is described, detailing the physics-based models that are deployed in real time, consuming surface and downhole sensor data and outputting continuous, operationally relevant simulation results. Measured data, from surface for torque & drag or from downhole for ECD & ESD, is then automatically compared both for deviation of actual from plan and for infringement of boundary conditions such as the formation pressure regime. The system also models off-bottom induced pressures (swab & surge) and dynamically advises on safe but optimal tripping velocities for the operation at hand. This has dual benefits: avoidance of the costly NPT associated with swab & surge, and the ability to visually highlight running-speed ILT. All processing applications are coupled with highly intuitive user interfaces. Three successful deployments, all onshore in the Middle East, are detailed. First, in a horizontal section, real-time automatic comparison of modeled vs. actual torque & drag samples, validated with PWD data, allowed early identification of poor hole cleaning. Secondly, in a vertical section, the model-vs.-actual algorithm automatically identified inadequate hole cleaning in a case where conventional human monitoring did not. Finally, a case is exhibited where real-time modelling of swab and surge, together with intuitive visualization of trip speeds within those boundary conditions, led to a significant increase in average tripping speeds compared with offset wells, reducing the AFE for the operator. Common to all three deployments was an integrated well-services approach, with a single service company providing the majority of services for well construction, and an overarching remote operations team as the primary users of the deployed software solutions.
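The core comparison logic, measured value vs. model and vs. boundary conditions, can be sketched in a few lines. This is an illustrative stand-in, not the paper's system; the function name, the 0.2 tolerance, and the equivalent-mud-weight values are assumptions:

```python
def check_ecd(measured_ecd, modeled_ecd, pore_pressure, frac_gradient, tol=0.2):
    """Flag deviation from the physics model and breaches of the
    formation-pressure window (all values in equivalent mud weight, e.g. sg)."""
    alerts = []
    if abs(measured_ecd - modeled_ecd) > tol:
        alerts.append("model-deviation")       # e.g. poor hole cleaning
    if measured_ecd <= pore_pressure:
        alerts.append("below-pore-pressure")   # influx / kick risk
    if measured_ecd >= frac_gradient:
        alerts.append("above-frac-gradient")   # losses risk
    return alerts

# Example: measured ECD drifts above the model but stays inside the window.
print(check_ecd(1.45, 1.20, pore_pressure=1.10, frac_gradient=1.60))
# prints ['model-deviation']
```

The same pattern (deviation check plus window check, evaluated every sample) applies equally to the torque & drag comparison on surface data.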


Author(s):  
Yuandong Liu ◽  
Zhihua Zhang ◽  
Lee D. Han ◽  
Candace Brakewood

Traffic queues, especially queues caused by non-recurrent events such as incidents, catch high-speed drivers approaching the end of queue (EOQ) by surprise and are therefore a safety concern. Though the topic has been extensively studied, EOQ identification has been limited by the spatial-temporal resolution of traditional data sources. This study explores the potential of location-based crowdsourced data, specifically Waze user reports. It presents a dynamic clustering algorithm that groups location-based reports in real time and identifies the spatial-temporal extent of congestion as well as the EOQ. The algorithm is a spatial-temporal extension of the density-based spatial clustering of applications with noise (DBSCAN) algorithm for real-time streaming data, with an adaptive threshold selection procedure. The proposed method was tested on 34 traffic congestion cases in the Knoxville, Tennessee area of the United States. It is demonstrated that the algorithm can effectively detect the spatial-temporal extent of congestion from Waze report clusters and identify the EOQ in real time. The Waze report-based detections were compared with detections based on roadside sensor data. The results are promising: the EOQ identification time from Waze is similar to that from traffic sensor data, with only a 1.1 min difference on average. In addition, Waze generates 1.9 EOQ detection points per mile, compared with 1.8 detection points from traffic sensor data, suggesting the two data sources are comparable in reporting frequency. The results indicate that Waze is a valuable complementary source for EOQ detection where no traffic sensors are installed.
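The clustering idea can be illustrated with a minimal, pure-Python spatial-temporal DBSCAN variant that uses separate thresholds for the space and time dimensions. This is a sketch of the general technique only; the paper's algorithm additionally handles streaming updates and adaptive threshold selection, and the report coordinates, epsilon values, and `min_pts` below are illustrative:

```python
def st_neighbors(reports, i, eps_space, eps_time):
    """Indices of reports within both the spatial and temporal thresholds."""
    x, t = reports[i]
    return [j for j, (xj, tj) in enumerate(reports)
            if j != i and abs(xj - x) <= eps_space and abs(tj - t) <= eps_time]

def st_dbscan(reports, eps_space, eps_time, min_pts):
    """DBSCAN with separate epsilons for space and time.
    Returns one cluster label per report (-1 = noise)."""
    labels = [None] * len(reports)
    cluster = 0
    for i in range(len(reports)):
        if labels[i] is not None:
            continue
        nbrs = st_neighbors(reports, i, eps_space, eps_time)
        if len(nbrs) + 1 < min_pts:
            labels[i] = -1              # noise, may become a border point later
            continue
        labels[i] = cluster             # i is a core point: grow a cluster
        seeds = list(nbrs)
        while seeds:
            j = seeds.pop()
            if labels[j] == -1:
                labels[j] = cluster     # former noise becomes a border point
                continue
            if labels[j] is not None:
                continue
            labels[j] = cluster
            jn = st_neighbors(reports, j, eps_space, eps_time)
            if len(jn) + 1 >= min_pts:  # expand only through core points
                seeds.extend(jn)
        cluster += 1
    return labels

# Reports as (milepost, minutes): five dense reports plus one isolated report.
reports = [(0.0, 0), (0.2, 1), (0.4, 2), (0.5, 3), (0.6, 4), (5.0, 50)]
labels = st_dbscan(reports, eps_space=0.5, eps_time=2.0, min_pts=3)
# The upstream-most report of the congestion cluster approximates the EOQ.
eoq = max(x for (x, t), lab in zip(reports, labels) if lab == 0)
```

Here `labels` marks the five dense reports as one congestion cluster and the isolated report as noise, and `eoq` picks the cluster's upstream extremity as the end-of-queue estimate.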


2016 ◽  
Author(s):  
Boris Simovski ◽  
Daniel Vodak ◽  
Sveinung Gundersen ◽  
Diana Domanska ◽  
Abdulrahman Azab ◽  
...  

Abstract Genome-wide, cell-type-specific profiles are being systematically generated for numerous genomic and epigenomic features. There is, however, no universally applicable analytical methodology for such data. We present GSuite HyperBrowser, the first comprehensive solution for integrative analysis of dataset collections across the genome and epigenome. The GSuite HyperBrowser is an open-source system for streamlined acquisition and customizable statistical analysis of large collections of genome-wide datasets. The system is based on new computational and statistical methodologies that permit comparative and confirmatory analyses across multiple disparate data sources. Expert guidance and reproducibility are facilitated via a Galaxy-based web interface. The software is available at https://hyperbrowser.uio.no/gsuite


Sensors ◽  
2021 ◽  
Vol 21 (2) ◽  
pp. 405
Author(s):  
Marcos Lupión ◽  
Javier Medina-Quero ◽  
Juan F. Sanjuan ◽  
Pilar M. Ortigosa

Activity Recognition (AR) is an active research topic focused on detecting human actions and behaviours in smart environments. In this work, we present the on-line activity recognition platform DOLARS (Distributed On-line Activity Recognition System), where data from heterogeneous sensors, including binary, wearable and location sensors, are evaluated in real time. Different descriptors and metrics from the heterogeneous sensor data are integrated into a common feature vector, whose extraction is carried out by a sliding-window approach under real-time conditions. DOLARS provides a distributed architecture where: (i) stages for processing data in AR are deployed in distributed nodes; (ii) temporal cache modules compute metrics which aggregate sensor data for computing feature vectors in an efficient way; (iii) publish-subscribe models are integrated both to spread data from sensors and to orchestrate the nodes (communication and replication) for computing AR; and (iv) machine learning algorithms are used to classify and recognize the activities. A successful case study of daily activity recognition developed in the Smart Lab of the University of Almería (UAL) is presented in this paper. Results show encouraging performance in recognizing sequences of activities and demonstrate the need for distributed architectures to achieve real-time recognition.
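The sliding-window aggregation of heterogeneous sensors into one feature vector can be sketched as follows. This is a minimal illustration of the idea, not the DOLARS implementation; the event schema, sensor kinds, the 30-second window, and the chosen features are all assumptions:

```python
def window_features(events, t_now, window=30.0):
    """Aggregate heterogeneous sensor events in [t_now - window, t_now]
    into a single feature vector: [count of binary activations,
    mean wearable value, most recent location]."""
    recent = [e for e in events if t_now - window <= e["t"] <= t_now]
    binary_on = sum(1 for e in recent if e["kind"] == "binary" and e["value"])
    accel = [e["value"] for e in recent if e["kind"] == "wearable"]
    mean_accel = sum(accel) / len(accel) if accel else 0.0
    last_loc = next((e["value"] for e in reversed(recent)
                     if e["kind"] == "location"), None)
    return [binary_on, mean_accel, last_loc]

events = [
    {"t": 0.0,  "kind": "binary",   "value": 1},         # door contact fired
    {"t": 5.0,  "kind": "wearable", "value": 0.2},       # wrist acceleration
    {"t": 12.0, "kind": "wearable", "value": 0.6},
    {"t": 20.0, "kind": "location", "value": "kitchen"},
    {"t": 40.0, "kind": "binary",   "value": 1},         # outside the window
]
fv = window_features(events, t_now=30.0)
# one binary activation, mean wearable value ≈ 0.4, last location "kitchen"
```

In a distributed deployment such a function would run against a temporal cache on each node, so the window can slide forward without re-reading the full event history.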


Author(s):  
Negin Yousefpour ◽  
Steve Downie ◽  
Steve Walker ◽  
Nathan Perkins ◽  
Hristo Dikanski

Bridge scour is a challenge throughout the U.S.A. and other countries. Despite the scale of the issue, there is still a substantial lack of robust methods for scour prediction to support reliable, risk-based management and decision making. Over the past decade, real-time scour monitoring systems have gained increasing interest among state departments of transportation across the U.S.A. This paper introduces three distinct methodologies for scour prediction using advanced artificial intelligence (AI)/machine learning (ML) techniques based on real-time scour monitoring data. The monitoring data comprised riverbed and river stage elevation time series at bridge piers gathered from various sources. Deep learning algorithms showed promise in predicting bed elevation and water level variations as early as a week in advance. Ensemble neural networks proved successful in predicting the maximum upcoming scour depth, using the sensor data observed at the onset of a scour episode together with bridge pier, flow, and riverbed characteristics. In addition, two common empirical scour models were calibrated against the observed sensor data using Bayesian inference, showing significant improvement in prediction accuracy. Overall, this paper introduces a novel approach to scour risk management by integrating emerging AI/ML algorithms with real-time monitoring systems for early scour forecasting.
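The Bayesian calibration idea can be sketched as a grid posterior over a single multiplicative correction factor for an empirical scour formula. The paper's actual models and inference setup are more elaborate; the Gaussian error model, `sigma`, the factor grid, and the toy depths below are illustrative assumptions:

```python
import math

def posterior_correction(observed, predicted, factors, sigma=0.1):
    """Grid Bayesian update of a multiplicative correction factor k
    (uniform prior), assuming observed ~ Normal(k * predicted, sigma)."""
    log_post = []
    for k in factors:
        log_lik = sum(-((d - k * p) ** 2) / (2.0 * sigma ** 2)
                      for d, p in zip(observed, predicted))
        log_post.append(log_lik)            # flat prior: posterior ∝ likelihood
    m = max(log_post)                       # subtract max for numerical stability
    weights = [math.exp(lp - m) for lp in log_post]
    total = sum(weights)
    return [w / total for w in weights]     # normalized posterior over factors

# Empirical-model predictions vs. monitored scour depths (illustrative, metres).
predicted = [1.0, 2.0, 3.0]
observed = [0.8, 1.6, 2.4]                  # the model consistently over-predicts
factors = [0.6, 0.7, 0.8, 0.9, 1.0]
post = posterior_correction(observed, predicted, factors)
best = factors[post.index(max(post))]       # → 0.8
```

The posterior concentrates on the factor that best reconciles the empirical formula with the monitoring data, which is the essence of calibrating a physics-based model against sensor observations.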

