Innovative real-time sensing of flow dynamics in groundwater and sediments to map contaminant spreading

Author(s):  
Goedele Verreydt ◽  
Niels Van Putte ◽  
Timothy De Kleyn ◽  
Joris Cool ◽  
Bino Maiheu

<p>Groundwater dynamics play a crucial role in the spreading of soil and groundwater contamination. However, there is still a significant gap in the understanding of groundwater flow dynamics. Heterogeneities and dynamics are often underestimated and therefore not taken into account, even though they are crucial input for successful management and remediation measures. The bulk of the contaminant mass is often transported through only a small layer or section within the aquifer and, in cases of seepage into surface water, is highly dependent on rainfall and tidal effects.</p><p> </p><p>This study presents the use of novel real-time iFLUX sensors to map groundwater flow dynamics over time. The sensors provide real-time data on groundwater flow rate and flow direction. The sensor probes consist of multiple superimposed bidirectional flow sensors. The probes can be installed directly in the subsoil, riverbed or monitoring well. The measurement setup is unique in that it can perform measurements every second, ideal for mapping rapidly changing flow conditions. The measurement range is between 0.5 and 500 cm per day.</p><p> </p><p>We will present the measurement principles and technical aspects of the sensor, together with two case studies.</p><p> </p><p>The first case study comprises the installation of iFLUX sensors in 4 different monitoring wells in a chlorinated solvent plume to map, on the one hand, the flow patterns in the plume and, on the other hand, the flow dynamics influenced by the nearby poplar trees. The foreseen remediation concept here is phytoremediation. The sensors were installed for a total period of 4 weeks, with a measurement frequency of 5 minutes. The flow profiles and time series will be presented together with the determined mass fluxes.</p><p> </p><p>A second case study was performed in the context of the remediation of a canal riverbed. Due to past industrial production of tar and carbon black, the soil and groundwater next to the small canal ‘De Lieve’ in Ghent, Belgium, became contaminated with aliphatic and (poly)aromatic hydrocarbons. The groundwater contaminants migrate to the canal, impact the surface water quality and pose an ecological risk. The seepage flow and mass fluxes of contaminants into the surface water were measured with the novel iFLUX streambed sensors, installed directly in the river sediment. A site conceptual model was drawn up and dimensioned based on the sensor data. The remediation concept to tackle the inflowing pollution is a hydraulically conductive reactive mat on the riverbed that makes use of the natural draining function of the waterbody, the adsorption capacity of a natural or secondary adsorbent, and a future habitat for micro-organisms that biodegrade the contaminants. The reactive mats were successfully installed and, based on the mass flux calculations, a lifespan of at least 10 years is expected for the adsorption material.</p>
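A contaminant mass flux of the kind reported above follows from the measured seepage (Darcy) flux and the pore-water concentration. The sketch below illustrates only this unit conversion and multiplication; the numbers are hypothetical and not taken from the case studies.

```python
# Illustrative contaminant mass flux from a seepage (Darcy) flux and a
# dissolved concentration. Inputs and values are hypothetical examples.

def mass_flux(darcy_flux_cm_per_day: float, concentration_mg_per_l: float) -> float:
    """Mass flux through a unit cross-section, in mg per m^2 per day."""
    q_m_per_day = darcy_flux_cm_per_day / 100.0    # cm/day -> m/day
    c_mg_per_m3 = concentration_mg_per_l * 1000.0  # mg/L  -> mg/m^3
    return q_m_per_day * c_mg_per_m3

# A seepage flux of 10 cm/day carrying 5 mg/L of dissolved contaminant:
print(mass_flux(10.0, 5.0))  # ~500 mg/m^2/day
```

Integrating such fluxes over the treated riverbed area and over time is the kind of calculation that underpins a lifespan estimate for an adsorption material.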

2021 ◽  
Author(s):  
Joy Monteiro ◽  
Bhalchandra Pujari ◽  
Sarika Maitra Bhattacharrya ◽  
Anu Raghunathan ◽  
Ashwini Keskar ◽  
...  

With more than 140 million people infected globally and 3 million deaths, the COVID-19 pandemic has left a lasting impact. A modern response to a pandemic of such proportions needs to focus on exploiting all available data to inform the response in real time and allow evidence-based decision-making. The intermittent lockdowns of the last 13 months, imposed to prevent anticipated large-scale mortality, have created economic adversity; relaxing them has been an attempt to recover and to balance economic needs with public health realities. This article is a comprehensive case study of the outbreak within the city limits of Pune, Maharashtra, India, aimed at understanding the evolution of the disease and its transmission dynamics starting from the first case on March 9, 2020. A unique collaborative effort between the Pune Municipal Corporation (PMC), a government entity, and the Pune Knowledge Cluster (PKC) allowed us to lay out a context for outbreak response and intervention. We report here how access to granular data for a metropolitan city with pockets of very high-density populations can help analyze, in real time, the dynamics of the pandemic and produce forecasts for better management and control of SARS-CoV-2. Outbreak data analytics resulted in a real-time data visualization dashboard for accurate information dissemination on the epidemic's progress for public access. As government agencies craft testing and vaccination policies and implement intervention strategies to mitigate a second wave, our case study underscores the criticality of data quality and analytics for decoding community transmission of COVID-19.


Sensors ◽  
2020 ◽  
Vol 21 (1) ◽  
pp. 50
Author(s):  
Steve H. L. Liang ◽  
Sara Saeedi ◽  
Soroush Ojagh ◽  
Sepehr Honarparvar ◽  
Sina Kiaei ◽  
...  

To safely protect workplaces and the workforce during and after the COVID-19 pandemic, a scalable integrated sensing solution is required in order to offer real-time situational awareness and early warnings for decision-makers. However, an information-based solution for industry reopening is ineffective when the necessary operational information is locked up in disparate real-time data silos. There are many ongoing efforts to combat the COVID-19 pandemic using different combinations of low-cost, location-based contact tracing and sensing technologies. These ad hoc Internet of Things (IoT) solutions for COVID-19 were developed using different data models and protocols, without an interoperable way to interconnect these heterogeneous systems and exchange data on people and place interactions. This research aims to design and develop an interoperable Internet of COVID-19 Things (IoCT) architecture that is able to exchange, aggregate, and reuse disparate IoT sensor data sources so that informed decisions can be made after understanding the real-time risks in workplaces based on person-to-place interactions. The IoCT architecture is based on the Sensor Web paradigm, which connects various Things, Sensors, and Datastreams with an indoor geospatial data model. This paper presents a study of what is, to the best of our knowledge, the first real-world integrated implementation of the Open Geospatial Consortium (OGC) Sensor Web Enablement (SWE) and IndoorGML standards to calculate the risk of COVID-19 online, using a workplace reopening case study. The proposed IoCT offers a new open-standard-based information model, architecture, methodologies, and software tools that enable the interoperability of disparate COVID-19 monitoring systems with finer spatiotemporal granularity. A workplace cleaning use case was developed in order to demonstrate the capabilities of this proposed IoCT architecture. The implemented architecture included proximity-based contact tracing, people density sensors, a COVID-19 risky-behavior monitoring system, and contextual building geospatial data.
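The Sensor Web pattern described above chains Things to Datastreams to timestamped Observations. As a rough illustration, the sketch below builds that hierarchy for a people-density sensor in plain Python dictionaries; the entity and property names echo the OGC SensorThings data model, but the payload itself is simplified and hypothetical.

```python
# Simplified sketch of the Sensor Web entity hierarchy used in the IoCT
# architecture: Thing -> Datastream -> Observations. Entity names follow
# the OGC SensorThings data model; the payload values are hypothetical.

thing = {
    "name": "Meeting Room 101",
    "description": "Workplace zone monitored for reopening risk",
    "Datastreams": [],
}

people_density = {
    "name": "people-density",
    "unitOfMeasurement": {"name": "persons", "symbol": "p"},
    "Observations": [],
}
thing["Datastreams"].append(people_density)

def record_observation(datastream, timestamp, result):
    """Append a timestamped observation to a datastream."""
    datastream["Observations"].append(
        {"phenomenonTime": timestamp, "result": result}
    )

record_observation(people_density, "2020-12-01T09:00:00Z", 7)
print(len(people_density["Observations"]))  # 1
```

In a real deployment these entities would be exchanged over a standards-compliant service endpoint rather than held in memory, which is what makes the disparate monitoring systems interoperable.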


Sensors ◽  
2021 ◽  
Vol 21 (2) ◽  
pp. 405
Author(s):  
Marcos Lupión ◽  
Javier Medina-Quero ◽  
Juan F. Sanjuan ◽  
Pilar M. Ortigosa

Activity Recognition (AR) is an active research topic focused on detecting human actions and behaviours in smart environments. In this work, we present the on-line activity recognition platform DOLARS (Distributed On-line Activity Recognition System), in which data from heterogeneous sensors, including binary, wearable and location sensors, are evaluated in real time. Different descriptors and metrics from the heterogeneous sensor data are integrated into a common feature vector, whose extraction is carried out by a sliding-window approach under real-time conditions. DOLARS provides a distributed architecture in which: (i) the stages for processing data in AR are deployed on distributed nodes; (ii) temporal cache modules compute metrics that aggregate sensor data for computing feature vectors in an efficient way; (iii) publish-subscribe models are integrated both to spread data from sensors and to orchestrate the nodes (communication and replication) for computing AR; and (iv) machine learning algorithms are used to classify and recognize the activities. A successful case study of daily activity recognition developed in the Smart Lab of the University of Almería (UAL) is presented in this paper. The results show encouraging performance in the recognition of sequences of activities and demonstrate the need for distributed architectures to achieve real-time recognition.
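The sliding-window feature extraction mentioned above can be sketched in a few lines: readings that fall inside the current time window are aggregated into a fixed-length feature vector. The window length and the particular features below are illustrative assumptions, not the paper's exact configuration.

```python
# Minimal sketch of sliding-window feature extraction over a stream of
# (timestamp, value) sensor readings. Window length and the chosen
# aggregates (mean/min/max) are illustrative assumptions.
from statistics import mean

def window_features(readings, window_s, now):
    """Aggregate readings in [now - window_s, now] into a feature vector."""
    in_window = [v for (t, v) in readings if now - window_s <= t <= now]
    if not in_window:
        return [0.0, 0.0, 0.0]
    return [mean(in_window), min(in_window), max(in_window)]

stream = [(0, 1.0), (2, 3.0), (5, 2.0), (9, 4.0)]
print(window_features(stream, window_s=5, now=9))  # uses readings at t=5, t=9
```

In a distributed setting, the temporal cache modules would hold per-sensor aggregates so that each new window reuses prior work instead of rescanning the raw stream.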


Author(s):  
Huijun Wu ◽  
Xiaoyao Qian ◽  
Aleks Shulman ◽  
Kanishk Karanawat ◽  
Tushar Singh ◽  
...  

Author(s):  
Xiangxue Zhao ◽  
Shapour Azarm ◽  
Balakumar Balachandran

Online prediction of dynamical system behavior based on a combination of simulation data and sensor measurement data has numerous applications. Examples include predicting safe flight configurations, forecasting storms and wildfire spread, and estimating railway track and pipeline health conditions. In such applications, high-fidelity simulations may be used to accurately predict a system’s dynamical behavior offline (“non-real time”). However, due to their computational expense, these simulations have limited usage for online (“real-time”) prediction of a system’s behavior. To remedy this, one possible approach is to allocate a significant portion of the computational effort to obtaining data through offline simulations. The obtained offline data can then be combined with online sensor measurements for online estimation of the system’s behavior with accuracy comparable to that of the offline, high-fidelity simulation. The main contribution of this paper is the construction of a fast data-driven spatiotemporal prediction framework that can be used to estimate general parametric dynamical system behavior. This is achieved through three steps. First, high-order singular value decomposition is applied to map high-dimensional offline simulation datasets into a subspace. Second, Gaussian processes are constructed to approximate model parameters in the subspace. Finally, reduced-order particle filtering is used to assimilate sparsely located sensor data to further improve the prediction. The effectiveness of the proposed approach is demonstrated through a case study in which aeroelastic response data obtained for an aircraft through simulations are integrated with measurement data obtained from a few sparsely located sensors. Through this case study, the authors show that, along with dynamic enhancement of the state estimates, one can also realize a reduction in the uncertainty of the estimates.
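The offline/online split in the first step can be illustrated with an ordinary SVD: offline snapshots define a low-dimensional basis, and an online state is then approximated by its projection onto that basis. This is only a toy sketch of the subspace-projection idea with synthetic data; the paper itself uses higher-order SVD together with Gaussian processes and particle filtering.

```python
# Toy illustration of the offline/online split: an SVD of offline snapshots
# yields a reduced basis, and an online state is approximated by projecting
# it onto that basis. Synthetic data; not the paper's HOSVD/GP/PF pipeline.
import numpy as np

rng = np.random.default_rng(0)

# Offline: 200 simulation snapshots of a 50-dimensional state that lie
# (up to small noise) in a 3-dimensional subspace.
basis_true = rng.standard_normal((50, 3))
snapshots = basis_true @ rng.standard_normal((3, 200))
snapshots += 0.01 * rng.standard_normal(snapshots.shape)

U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
U_r = U[:, :3]                       # reduced basis learned offline

# Online: approximate a new state by its projection onto the reduced basis.
x_new = basis_true @ rng.standard_normal(3)
x_hat = U_r @ (U_r.T @ x_new)
rel_err = float(np.linalg.norm(x_new - x_hat) / np.linalg.norm(x_new))
print(rel_err)  # small, since the learned subspace captures the dynamics
```

The online cost is only two small matrix-vector products, which is what makes subspace methods attractive for real-time estimation.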


2020 ◽  
Vol 12 (23) ◽  
pp. 10175
Author(s):  
Fatima Abdullah ◽  
Limei Peng ◽  
Byungchul Tak

The volume of streaming sensor data from various environmental sensors continues to increase rapidly due to wider deployments of IoT devices at much greater scales than ever before. This, in turn, causes a massive increase in fog and cloud network traffic, which leads to heavily delayed network operations. In streaming data analytics, the ability to obtain real-time data insights is crucial for computational sustainability in many IoT-enabled applications, such as environmental monitoring, pollution and climate surveillance, traffic control, and even e-commerce applications. However, such network delays prevent us from achieving high-quality real-time data analytics of environmental information. In order to address this challenge, we propose the Fog Sampling Node Selector (Fossel) technique, which can significantly reduce IoT network and processing delays by algorithmically selecting an optimal subset of fog nodes to perform the sensor data sampling. In addition, our technique performs a simple type of query execution within the fog nodes in order to further reduce network delays by processing the data near the data-producing devices. Our extensive evaluations show that the Fossel technique outperforms the state of the art in terms of latency reduction as well as bandwidth consumption, network usage and energy consumption.
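The core idea of selecting a subset of fog nodes for sampling can be sketched as a coverage problem: choose nodes so every sensor region is sampled while keeping latency low. The greedy heuristic below is an assumption for illustration only, not Fossel's actual selection algorithm.

```python
# Hypothetical sketch of fog-node subset selection for sensor sampling:
# greedily pick nodes that cover the most uncovered sensor regions per
# millisecond of latency. Illustrative only; not Fossel's algorithm.

def select_fog_nodes(nodes, regions):
    """nodes: {name: (latency_ms, covered_regions)}. Returns chosen names."""
    selected, uncovered = [], set(regions)
    while uncovered:
        # Best coverage gain per unit latency among all candidate nodes.
        name, (lat, cov) = max(
            nodes.items(),
            key=lambda kv: len(uncovered & kv[1][1]) / kv[1][0],
        )
        gain = uncovered & cov
        if not gain:
            break  # remaining regions cannot be covered by any node
        selected.append(name)
        uncovered -= gain
    return selected

nodes = {
    "fog-a": (10.0, {"r1", "r2"}),
    "fog-b": (5.0, {"r2"}),
    "fog-c": (20.0, {"r3"}),
}
print(select_fog_nodes(nodes, {"r1", "r2", "r3"}))  # ['fog-a', 'fog-c']
```

Pushing simple query execution to the chosen nodes then shrinks the payload that must traverse the fog-to-cloud path.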


Bioanalysis ◽  
2020 ◽  
Vol 12 (20) ◽  
pp. 1449-1458
Author(s):  
Saloumeh K Fischer ◽  
Kathi Williams ◽  
Ian Harmon ◽  
Bryan Bothwell ◽  
Hua Xu ◽  
...  

Aim: Current blood monitoring methods require sample collection and testing at a central lab, which can take days. Point-of-care (POC) devices with a quick turnaround time can provide an alternative with faster results, allowing for real-time data leading to better treatment decisions for patients. Results/Methodology: An assay to measure monoclonal antibody therapeutic-A was developed on two POC devices. Data generated using 75 serum samples (65 clinical and 10 spiked samples) show results that correlate with the data generated using Gyrolab technology. Conclusion: This case study uses a monoclonal antibody therapeutic-A concentration assay as an example to demonstrate the potential of POC technologies as a viable alternative to central lab testing, with quick results allowing for real-time decision-making.


2020 ◽  
Vol 44 (5) ◽  
pp. 677
Author(s):  
Rebekah Eden ◽  
Andrew Burton-Jones ◽  
James Grant ◽  
Renea Collins ◽  
Andrew Staib ◽  
...  

Objective This study aims to assist hospitals contemplating digital transformation by assessing the reported qualitative effects of rapidly implementing an integrated eHealth system in a large Australian hospital and determining whether existing literature offers a reliable framework to assess the effects of digitisation. Methods A qualitative, single-site case study was performed using semistructured interviews supplemented by focus groups, observations and documentation. In all, 92 individuals across medical, nursing, allied health, administrative and executive roles provided insights into the eHealth system, which consisted of an electronic medical record, computerised decision support, computerised physician order entry, ePrescribing systems and wireless device integration. These results were compared against a known framework of the effects of hospital digitisation. Results Diverse, mostly positive, effects were reported, largely consistent with existing literature. Several new effects not reported in literature were reported, namely: (1) improvements in accountability for care, individual career development and time management; (2) mixed findings for the availability of real-time data; and (3) positive findings for the secondary use of data. Conclusions The overall positive perceptions of the effects of digitisation should give confidence to health services contemplating rapid digital transformation. Although existing literature provides a reliable framework for impact assessment, new effects are still emerging, and research and practice need to shift towards understanding how clinicians and hospitals can maximise the benefits of digital transformation. What is known about the topic? Hospitals outside the US are increasingly becoming engaged in eHealth transformations. Yet, the reported effects of these technologies are diverse and mixed with qualitative effects rarely reported. What does this paper add? 
This study provides a qualitative assessment of the effects of an eHealth transformation at a large Australian tertiary hospital. The results provide renewed confidence in the literature because the findings are largely consistent with expectations from prior systematic reviews of impacts. The qualitative approach followed also resulted in the identification of new effects, which included improvements in accountability, time management and individual development, as well as mixed results for real-time data. In addition, substantial improvements in patient outcomes and clinician productivity were reported from the secondary use of data within the eHealth systems. What are the implications for practitioners? The overall positive findings in this large case study should give confidence to other health services contemplating rapid digital transformation. To achieve substantial benefits, hospitals need to understand how they can best leverage the data within these systems to improve the quality and efficiency of patient care. As such, both research and practice need to shift towards understanding how these systems can be used more effectively.


2020 ◽  
Vol 10 (17) ◽  
pp. 5882
Author(s):  
Federico Desimoni ◽  
Sergio Ilarri ◽  
Laura Po ◽  
Federica Rollo ◽  
Raquel Trillo-Lado

Modern cities face pressing problems with transportation systems including, but not limited to, traffic congestion, safety, health, and pollution. To tackle them, public administrations have implemented roadside infrastructures such as cameras and sensors to collect data about environmental and traffic conditions. In the case of traffic sensor data, not only are the real-time data essential, but historical values also need to be preserved and published. When real-time and historical data of smart cities become available, everyone can join an evidence-based debate on the city’s future evolution. The TRAFAIR (Understanding Traffic Flows to Improve Air Quality) project seeks to understand how traffic affects urban air quality. The project develops a platform to provide real-time and predicted values on air quality in several cities in Europe, encompassing tasks such as the deployment of low-cost air quality sensors, data collection and integration, modeling and prediction, the publication of open data, and the development of applications for end-users and public administrations. This paper focuses explicitly on the modeling and semantic annotation of traffic data. We present the tools and techniques used in the project and validate our strategies for data modeling and its semantic enrichment in two cities: Modena (Italy) and Zaragoza (Spain). An experimental evaluation shows that our approach to publishing Linked Data is effective.
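Semantic annotation of a traffic reading amounts to expressing it as RDF-style triples against a shared vocabulary. In the sketch below, the property names echo the W3C SOSA vocabulary, but the URIs and values are hypothetical, and a real pipeline would use an RDF library and a proper serialization rather than raw tuples.

```python
# Illustrative semantic annotation of one traffic-count observation as
# subject-predicate-object triples. Property names echo the W3C SOSA
# vocabulary; URIs and values are hypothetical examples.

SOSA = "http://www.w3.org/ns/sosa/"

def annotate_count(sensor_uri, time_iso, vehicles):
    """Return triples describing one vehicle-count observation."""
    obs = f"{sensor_uri}/obs/{time_iso}"
    return [
        (obs, "rdf:type", SOSA + "Observation"),
        (obs, SOSA + "madeBySensor", sensor_uri),
        (obs, SOSA + "resultTime", time_iso),
        (obs, SOSA + "hasSimpleResult", vehicles),
    ]

triples = annotate_count("http://example.org/sensor/42",
                         "2020-06-01T08:00:00Z", 137)
print(len(triples))  # 4
```

Publishing such triples under stable URIs is what lets real-time and historical observations be queried together as Linked Data.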

