Into the Thirteenth Month: A Case Study on the Outbreak Analytics and Modeling the Spread of SARS-CoV-2 Infection in Pune City, India

Author(s):  
Joy Monteiro ◽  
Bhalchandra Pujari ◽  
Sarika Maitra Bhattacharrya ◽  
Anu Raghunathan ◽  
Ashwini Keskar ◽  
...  

With more than 140 million people infected globally and 3 million deaths, the COVID-19 pandemic has left a lasting impact. A modern response to a pandemic of such proportions needs to focus on exploiting all available data to inform the response in real time and allow evidence-based decision-making. The intermittent lockdowns imposed over the last 13 months to prevent anticipated large-scale mortality have created economic adversity, and relaxing them has been an attempt to balance economic needs against public health realities. This article is a comprehensive case study of the outbreak within the city limits of Pune, Maharashtra, India, aimed at understanding the evolution of the disease and its transmission dynamics starting from the first case on March 9, 2020. A unique collaborative effort between the Pune Municipal Corporation (PMC), a government entity, and the Pune Knowledge Cluster (PKC) allowed us to lay out a context for outbreak response and intervention. We report here how access to granular data for a metropolitan city with pockets of very high-density populations helps analyze, in real time, the dynamics of the pandemic and produce forecasts for better management and control of SARS-CoV-2. Outbreak data analytics resulted in a real-time data visualization dashboard for accurate dissemination of information on the epidemic's progress to the public. As government agencies craft testing and vaccination policies and implement intervention strategies to mitigate a second wave, our case study underscores the criticality of data quality and analytics to decode community transmission of COVID-19.
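A minimal sketch of the kind of real-time analytic behind such a dashboard: smoothing granular daily case counts into a 7-day trend. The file name and column names are assumptions for illustration, not the PMC/PKC data schema.

```python
# Hypothetical example: 7-day rolling average of daily reported cases,
# the smoothed trend a public dashboard typically displays.
# The input file and column names are assumed, not taken from the Pune pipeline.
import pandas as pd

def daily_case_trend(csv_path: str) -> pd.DataFrame:
    df = pd.read_csv(csv_path, parse_dates=["report_date"])
    daily = (df.groupby("report_date")["new_cases"]
               .sum()
               .asfreq("D", fill_value=0))            # fill reporting gaps with zero
    trend = daily.rolling(window=7, min_periods=1).mean()
    return pd.DataFrame({"daily_cases": daily, "trend_7d": trend})

# usage: daily_case_trend("pune_cases.csv").tail()
```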

2021 ◽  
Author(s):  
Goedele Verreydt ◽  
Niels Van Putte ◽  
Timothy De Kleyn ◽  
Joris Cool ◽  
Bino Maiheu

Groundwater dynamics play a crucial role in the spreading of soil and groundwater contamination. However, there is still a large gap in the understanding of groundwater flow dynamics. Heterogeneities and dynamics are often underestimated and therefore not taken into account, yet they are crucial inputs for successful management and remediation measures. The bulk of the mass is often transported through only a small layer or section within the aquifer and, in cases of seepage into surface water, is strongly dependent on rainfall and tidal effects.

This study uses novel real-time iFLUX sensors to map groundwater flow dynamics over time. The sensors provide real-time data on groundwater flow rate and flow direction. The sensor probes consist of multiple superimposed bidirectional flow sensors. The probes can be installed directly in the subsoil, riverbed or monitoring well. The measurement setup is unique in that it can take measurements every second, which is ideal for mapping rapidly changing flow conditions. The measurement range is between 0.5 and 500 cm per day.

We will present the measurement principles and technical aspects of the sensor, together with two case studies.

The first case study comprises the installation of iFLUX sensors in 4 different monitoring wells in a chlorinated solvent plume to map, on the one hand, the flow patterns in the plume and, on the other hand, the flow dynamics influenced by the nearby poplar trees. The foreseen remediation concept here is phytoremediation. The sensors were installed for a total period of 4 weeks with a measurement frequency of 5 minutes. The flow profiles and time series will be presented together with the determined mass fluxes.

A second case study was performed as part of the remediation of a canal riverbed. Due to past industrial production of tar and carbon black, the soil and groundwater next to the small canal ‘De Lieve’ in Ghent, Belgium, became contaminated with aliphatic and (poly)aromatic hydrocarbons. The groundwater contaminants migrate to the canal, impact the surface water quality and pose an ecological risk. The seepage flow and mass fluxes of contaminants into the surface water were measured with the novel iFLUX streambed sensors, installed directly in the river sediment. A site conceptual model was drawn up and dimensioned based on the sensor data. The remediation concept to tackle the inflowing pollution is a hydraulically conductive reactive mat on the riverbed that makes use of the natural draining function of the waterbody, the adsorption capacity of a natural or secondary adsorbent, and a future habitat for micro-organisms that biodegrade the contaminants. The reactive mats were successfully installed, and based on the mass flux calculations a lifespan of at least 10 years is expected for the adsorption material.
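The mass-flux and lifespan figures quoted above follow from simple arithmetic; a sketch under assumed, illustrative numbers (not site data):

```python
# Illustrative sketch: contaminant mass flux J = q * C and the resulting
# adsorbent lifespan. All numbers and units here are assumptions, not site data.

def contaminant_mass_flux(darcy_flux_cm_d: float, concentration_ug_l: float) -> float:
    """Mass flux in ug per m2 of riverbed per day, from seepage flux and concentration."""
    q_m_per_day = darcy_flux_cm_d / 100.0        # cm/day -> m/day
    c_ug_per_m3 = concentration_ug_l * 1000.0    # ug/L -> ug/m3
    return q_m_per_day * c_ug_per_m3             # ug/m2/day

def adsorbent_lifespan_years(capacity_g_per_m2: float, mass_flux_ug_m2_d: float) -> float:
    """Years until the mat's sorption capacity per m2 is exhausted at a constant flux."""
    daily_load_g = mass_flux_ug_m2_d / 1e6
    return capacity_g_per_m2 / daily_load_g / 365.0

# e.g. a seepage flux of 20 cm/day at 500 ug/L loads roughly 0.1 g/m2/day onto the mat
```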


2011 ◽  
Vol 63 (2) ◽  
pp. 233-239 ◽  
Author(s):  
A. Armon ◽  
S. Gutner ◽  
A. Rosenberg ◽  
H. Scolnicov

We report on the design, deployment, and use of TaKaDu, a real-time algorithmic Water Infrastructure Monitoring solution, with a strong focus on water loss reduction and control. TaKaDu is provided as a commercial service to several customers worldwide. It has been in use at HaGihon, the Jerusalem utility, since mid-2009. Water utilities collect considerable real-time data from their networks, e.g. by means of a SCADA system and sensors measuring flow, pressure, and other data. We discuss how an algorithmic statistical solution analyses this wealth of raw data, flexibly using many types of input and picking out and reporting significant events and failures in the network. Of particular interest to most water utilities is the early detection capability for invisible leaks, which also serves as a means of preventing large visible bursts. The system also detects sensor and SCADA failures, various water quality issues, DMA boundary breaches, unrecorded or unintended network changes (like a valve or pump state change), and other events, including types unforeseen during system design. We discuss results from use at HaGihon, showing clear operational value.
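TaKaDu's algorithms are proprietary, so the following is only a generic sketch of the same flavour of statistical monitoring: flag flow readings that deviate strongly from a rolling baseline, as a leak, burst, or sensor fault might.

```python
# Generic, illustrative anomaly check on a flow time series (not TaKaDu's method).
import numpy as np
import pandas as pd

def flag_flow_anomalies(flow: pd.Series, window: int = 96, z_thresh: float = 4.0) -> pd.Series:
    """Mark samples whose rolling z-score exceeds z_thresh.

    `flow` is assumed to be regularly sampled (e.g. 15-minute SCADA readings,
    so window=96 is roughly one day of history).
    """
    baseline = flow.rolling(window, min_periods=window // 2).median()
    spread = flow.rolling(window, min_periods=window // 2).std()
    z = (flow - baseline) / spread.replace(0, np.nan)
    return z.abs() > z_thresh
```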


2017 ◽  
Vol 117 (2) ◽  
pp. 267-286 ◽  
Author(s):  
Abdallah Jamal Dweekat ◽  
Gyusun Hwang ◽  
Jinwoo Park

Purpose – The purpose of this paper is to introduce a more practical approach for supply chain performance measurement (SCPM) and to demonstrate the promising role of internet of things (IoT) technologies in SCPM systems. Design/methodology/approach – This is a conceptual paper that includes a literature review analysis, the design of a new approach for SCPM, and a case study scenario proving its applicability. Findings – The case study scenario shows that IoT can enhance SCPM, as it has the capability to enable real-time data collection and increase data efficiency, as well as enable real-time communication within the supply chain (SC). Practical implications – The proposed approach can help to develop performance measurement systems and applications enabled by IoT technologies. These systems can be used to monitor, manage, and control the overall SC in real time and in a more integrated and cooperative manner. Originality/value – This paper provides a structured systems-building approach tailored to show how to employ IoT technologies in the field of SCPM. This approach could help in establishing new performance measurement applications, and it is believed that both practitioners and researchers will benefit from it.
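A minimal sketch of the idea, not the authors' system: IoT-style event records with automatically captured timestamps feeding a real-time supply chain KPI. Field names are assumptions for illustration.

```python
# Hypothetical example: compute an on-time delivery rate from IoT-captured order events.
from dataclasses import dataclass
from datetime import datetime
from typing import Iterable

@dataclass
class OrderEvent:
    order_id: str
    shipped_at: datetime
    delivered_at: datetime
    promised_at: datetime

def on_time_delivery_rate(events: Iterable[OrderEvent]) -> float:
    """Share of orders delivered no later than promised."""
    events = list(events)
    if not events:
        return 0.0
    on_time = sum(1 for e in events if e.delivered_at <= e.promised_at)
    return on_time / len(events)
```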


Sensors ◽  
2021 ◽  
Vol 21 (4) ◽  
pp. 1104
Author(s):  
Shin-Yan Chiou ◽  
Kun-Ju Lin ◽  
Ya-Xin Dong

Positron emission tomography (PET) is one of the most commonly used scanning techniques. Medical staff manually calculate the estimated scan time for each PET device. However, the number of PET scanning devices is small, the number of patients is large, and there are many changes, including rescanning requirements, which makes the process error-prone, puts pressure on staff, and causes trouble for patients and their families. Although previous studies have proposed algorithms for specific examinations, there is currently no research on improving the PET workflow. This paper proposes a real-time automatic scheduling and control system for PET patients with wearable sensors. The system can automatically schedule, estimate and instantly update the times of various tasks, automatically allocate beds, and announce schedule information in real time. We implemented this system, collected time data from 200 actual patients, and fed these data into the implemented program for simulation and comparison. The average time difference between manual and automatic scheduling was 7.32 min, and the system could reduce the average examination time of 82% of patients by 6.14 ± 4.61 min. This convinces us that the system is correct and can improve time efficiency while avoiding human error, staff pressure, and trouble for patients and their families.
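A minimal sketch of the scheduling idea, not the authors' implementation: each waiting patient is assigned to the scanner that frees up earliest, and start times update as durations change. The durations and the two-scanner example are illustrative.

```python
# Greedy earliest-free-scanner assignment (illustrative, not the paper's algorithm).
import heapq
from typing import List, Tuple

def schedule_patients(scan_minutes: List[int], n_scanners: int) -> List[Tuple[int, int, int]]:
    """Return (patient_index, scanner_id, start_minute) for each patient in arrival order."""
    free_at = [(0, s) for s in range(n_scanners)]   # (time scanner becomes free, scanner id)
    heapq.heapify(free_at)
    plan = []
    for i, duration in enumerate(scan_minutes):
        start, scanner = heapq.heappop(free_at)
        plan.append((i, scanner, start))
        heapq.heappush(free_at, (start + duration, scanner))
    return plan

# e.g. schedule_patients([30, 45, 30, 60], n_scanners=2)
```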


2015 ◽  
Vol 43 (3) ◽  
pp. 7-14 ◽  
Author(s):  
Jim Moffatt

Purpose – This case example looks at how Deloitte Consulting applies the Three Rules synthesized by Michael Raynor and Mumtaz Ahmed based on their large-scale research project that identified patterns in the way exceptional companies think. Design/methodology/approach – The Three Rules concept is a key piece of Deloitte Consulting’s thought leadership program. So how are the three rules helping the organization perform? Now that research has shown how exceptional companies think, CEO Jim Moffatt could address the question, “Does Deloitte think like an exceptional company?” Findings – Deloitte has had success with an approach that promotes a bias towards non-price value over price and revenue over costs. Practical implications – It’s critical that all decision makers in an organization understand how decisions that are consistent with the three rules have contributed to past success as well as how they can apply the rules to difficult challenges they face today. Originality/value – This is the first case study written from a CEO’s perspective that looks at how the Three Rules approach of Michael Raynor and Mumtaz Ahmed can foster a firm’s growth and exceptional performance.


Author(s):  
Sepehr Fathizadan ◽  
Feng Ju ◽  
Kyle Rowe ◽  
Alex Fiechter ◽  
Nils Hofmann

Production efficiency and product quality need to be addressed simultaneously to ensure the reliability of large-scale additive manufacturing. Specifically, print surface temperature plays a critical role in determining the quality characteristics of the product. Moreover, heat transfer via conduction as a result of spatial correlation between locations on the surface of large and complex geometries necessitates the employment of more robust methodologies to extract and monitor the data. In this paper, we propose a framework for real-time data extraction from thermal images as well as a novel method for controlling layer time during the printing process. A FLIR™ thermal camera captures and stores the stream of images from the print surface while the Thermwood Large Scale Additive Manufacturing (LSAM™) machine is printing components. A set of digital image processing tasks was performed to extract the thermal data. Separate regression models based on real-time thermal imaging data are built for each location on the surface to predict the associated temperatures. Subsequently, a control method is proposed to find the best time for printing the next layer given the predictions. Finally, several scenarios based on the cooling dynamics of the surface structure were defined and analyzed, and the results were compared to the current fixed layer time policy. It was concluded that the proposed method can significantly increase efficiency by reducing the overall printing time while preserving quality.
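A minimal sketch of the layer-time logic under stated assumptions: per-location cooling is approximated with a simple exponential fit, and the next layer starts once the slowest-cooling location reaches a target temperature. The fit form and threshold are illustrative, not the paper's exact models.

```python
# Illustrative cooling fit and layer-time decision (assumed model, not the paper's).
import numpy as np

def fit_cooling(times_s: np.ndarray, temps_c: np.ndarray, ambient_c: float = 25.0):
    """Fit T(t) = ambient + A * exp(-k * t) with a log-linear least-squares fit; return (A, k)."""
    y = np.log(np.clip(temps_c - ambient_c, 1e-6, None))
    slope, intercept = np.polyfit(times_s, y, 1)
    return np.exp(intercept), -slope

def time_to_reach(target_c: float, a: float, k: float, ambient_c: float = 25.0) -> float:
    """Seconds until the fitted curve cools to target_c."""
    return float(np.log(a / (target_c - ambient_c)) / k)

def next_layer_delay(location_fits, target_c: float = 90.0) -> float:
    """Layer time = the slowest-cooling location's time to hit the target temperature."""
    return max(time_to_reach(target_c, a, k) for a, k in location_fits)
```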


Author(s):  
Hamid Khakpour Nejadkhaki ◽  
John F. Hall ◽  
Minghui Zheng ◽  
Teng Wu

A platform for the engineering design, performance, and control of an adaptive wind turbine blade is presented. This environment includes a simulation model, an integrative design tool, and a control framework. The authors are currently developing a novel blade with an adaptive twist angle distribution (TAD). The TAD influences the aerodynamic loads and thus the system dynamics. The modeling platform facilitates the use of an integrative design tool that establishes the TAD in relation to wind speed. The outcome of this design enables the transformation of the TAD during operation. Still, a robust control method is required to realize the benefits of the adaptive TAD. Moreover, simulation of the TAD is computationally expensive. It also requires a unique approach for both partial- and full-load operation. A framework is currently being developed to relate the TAD to the wind turbine and its components. Understanding the relationship between the TAD and the dynamic system is crucial for establishing real-time control. This capability is necessary to improve wind capture and reduce system loads. In the current state of development, the platform is capable of maximizing wind capture during partial-load operation. However, the control tasks related to Region 3 and load mitigation are more complex. Our framework will require high-fidelity modeling and reduced-order models that support real-time control. The paper outlines the components of this framework that are being developed. The proposed platform will facilitate expansion and the use of these required modeling techniques. A case study of a 20 kW system is presented based upon partial-load operation. The study demonstrates how the platform is used to design and control the blade. A low-dimensional aerodynamic model characterizes the blade performance. This interacts with the simulation model to predict the power production. The design tool establishes the actuator locations and stiffness properties required for the blade shape to achieve a range of TAD configurations. A supervisory control model is implemented and used to demonstrate how the simulated blade performs in the case study.
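A minimal sketch of the partial-load logic under assumed values: the low-dimensional model is reduced here to a power-coefficient lookup per TAD configuration, and the controller picks the configuration that maximizes captured power at the current wind speed. The rotor size and Cp values are assumptions, not from the paper.

```python
# Illustrative partial-load power estimate for a roughly 20 kW class rotor (assumed radius).
import math

RHO = 1.225          # air density, kg/m^3
ROTOR_RADIUS = 4.5   # m (assumed for a 20 kW class machine)

def power_kw(cp: float, wind_speed: float) -> float:
    """P = 0.5 * rho * A * Cp * v^3, returned in kW."""
    area = math.pi * ROTOR_RADIUS ** 2
    return 0.5 * RHO * area * cp * wind_speed ** 3 / 1000.0

def best_tad_config(cp_by_config: dict, wind_speed: float):
    """Pick the TAD configuration whose Cp(v) maximizes power at this wind speed."""
    best = max(cp_by_config, key=lambda cfg: cp_by_config[cfg](wind_speed))
    return best, power_kw(cp_by_config[best](wind_speed), wind_speed)

# cp_by_config maps a TAD configuration name to a Cp(v) callable, e.g. from the design tool
```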

