Passive Infrared Motion Sensors Improved the Detection Accuracy of Nocturnal Agitation

2021 ◽  
Vol 5 (Supplement_1) ◽  
pp. 955-955
Author(s):  
Wan-Tai Au-Yeung ◽  
Lyndsey Miller ◽  
Zachary Beattie ◽  
Jeffrey Kaye

Abstract Actigraphy has been used to detect agitation in persons with dementia, although this technology must be worn by participants. Another promising sensing methodology is passive infrared (PIR) motion sensing, which provides continuous, low-cost, and unobtrusive data and may also improve the detection of agitated periods. Using data from the MODERATE (Monitoring Dementia-Related Agitation Using Technology Evaluation) study, we compared the value of the two modalities for detecting agitation in a 64-year-old male participant with Alzheimer’s disease (AD) living in a memory care unit, monitored with a wrist-worn actigraph and four PIR motion sensors within his living quarters. The participant’s medical record indicated that he experienced agitation during 17 nights over 96 consecutive days. A total of 929,037 data points were captured for analysis. From each night, the features extracted from the wrist-worn actigraph included the total and standard deviation of activity counts, activity counts in the most and least active hours, and the median activity counts in one hour. Features extracted from the PIR motion sensors included dwell durations in the areas around the bed, sofa, front door, and bathroom, and the number of transitions between these areas. Using logistic regression to predict agitated periods, comparable classification performances were achieved with these two sets of features (AUC = 0.74 for the wearable and AUC = 0.71 for the PIR motion sensors). When the two sets of features were combined, the classification performance showed notable improvement (AUC = 0.83). This study points to the value of utilizing PIR motion sensors for detecting dementia-related agitation.
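As a rough illustration of the classification setup described above, the sketch below fits a logistic regression to per-night wearable features, PIR features, and their combination, and reports cross-validated AUCs. All data and feature values are placeholders, not the MODERATE study data.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n_nights = 96
y = rng.integers(0, 2, n_nights)             # 1 = agitated night (placeholder labels)

# Placeholder per-night features, e.g. total/SD activity counts (wearable)
# and dwell durations plus transition counts (PIR)
X_wearable = rng.normal(size=(n_nights, 5))
X_pir = rng.normal(size=(n_nights, 5))

for name, X in [("wearable", X_wearable),
                ("PIR", X_pir),
                ("combined", np.hstack([X_wearable, X_pir]))]:
    probs = cross_val_predict(LogisticRegression(max_iter=1000), X, y,
                              cv=5, method="predict_proba")[:, 1]
    print(name, round(roc_auc_score(y, probs), 2))
```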

Open Heart ◽  
2021 ◽  
Vol 8 (1) ◽  
pp. e001600
Author(s):  
Joanne Kathryn Taylor ◽  
Haarith Ndiaye ◽  
Matthew Daniels ◽  
Fozia Ahmed

Aims: In response to the COVID-19 pandemic, the UK was placed under strict lockdown measures on 23 March 2020. The aim of this study was to quantify the effects on physical activity (PA) levels using data from the prospective Triage-HF Plus Evaluation study.

Methods: This study represents a cohort of adult patients with implanted cardiac devices capable of measuring activity by embedded accelerometry via a remote monitoring platform. Activity data were available for the 4 weeks pre-implementation and post-implementation of ‘stay at home’ lockdown measures in the form of ‘minutes active per day’ (min/day).

Results: Data were analysed for 311 patients (77.2% men; mean age 68.8; 55.9% frail; 92.2% with an established heart failure (HF) diagnosis, of whom 51.2% were New York Heart Association class II), with comorbidities representative of a real-world cohort. Post-lockdown, a significant reduction in median PA equating to 20.8 active min/day was seen. The reduction was uniform, with a slightly more pronounced drop in PA for women, but no statistically significant difference with respect to age, body mass index, frailty or device type. Activity dropped in the immediate 2-week period post-lockdown, but steadily recovered thereafter. Median activity at 4 weeks post-lockdown remained significantly lower than at 4 weeks pre-lockdown (p≤0.001).

Conclusions: In a population of predominantly HF patients with cardiac devices, activity reduced by approximately 20 active min/day in the immediate aftermath of strict COVID-19 lockdown measures.

Trial registration number: NCT04177199.
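A minimal sketch of the paired pre/post comparison reported in the Results, assuming a hypothetical array of daily active minutes per patient; the study's actual analysis pipeline is not reproduced here.

```python
import numpy as np
from scipy.stats import wilcoxon

rng = np.random.default_rng(1)
pre = rng.normal(120, 30, size=311)           # mean active min/day, 4 weeks pre-lockdown
post = pre - rng.normal(20.8, 10, size=311)   # post-lockdown, ~20.8 min/day lower

stat, p = wilcoxon(pre, post)                 # paired non-parametric comparison
print(f"median drop = {np.median(pre - post):.1f} min/day, p = {p:.2g}")
```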


2020 ◽  
Vol 10 (1) ◽  
pp. 2 ◽  
Author(s):  
Soroush Ojagh ◽  
Sara Saeedi ◽  
Steve H. L. Liang

With the wide availability of low-cost proximity sensors, a large body of research focuses on digital person-to-person contact tracing applications that use proximity sensors. In most contact tracing applications, the impact of SARS-CoV-2 spread through touching contaminated surfaces in enclosed places is overlooked. This study is focused on tracing human contact within indoor places using the open OGC IndoorGML standard. This paper proposes a graph-based data model that considers the semantics of indoor locations, time, and users’ contexts in a hierarchical structure. The functionality of the proposed data model is evaluated for a COVID-19 contact tracing application with a scalable system architecture. Indoor trajectory preprocessing is enabled by spatial topology to detect and remove semantically invalid real-world trajectory points. Results show that 91.18% of semantically invalid indoor trajectory data points are filtered out. Moreover, indoor trajectory data analysis is innovatively empowered by semantic user contexts (e.g., disinfecting activities) extracted from user profiles. In an enhanced contact tracing scenario, considering disinfecting activities and the sequential order of visits to common places improved contact tracing results by filtering out 44.98% of unnecessary potential contacts. However, the average execution time of person-to-place contact tracing increased by 58.3%.
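The topology-based cleaning step can be pictured as follows: a trajectory point is semantically valid only if its room equals, or is adjacent to, the previous room in the indoor-space graph. This sketch uses an invented graph and trajectory, not IndoorGML data or the paper's code.

```python
import networkx as nx

# Invented indoor topology: edges connect rooms reachable in one step
topology = nx.Graph()
topology.add_edges_from([("corridor", "office"), ("corridor", "restroom"),
                         ("corridor", "lobby"), ("lobby", "entrance")])

trajectory = ["entrance", "lobby", "office",      # 'lobby' -> 'office' skips the corridor
              "corridor", "office", "restroom"]   # 'office' -> 'restroom' is also invalid

def filter_invalid(traj, graph):
    """Keep only points whose room is topologically reachable from the last kept point."""
    cleaned = traj[:1]
    for room in traj[1:]:
        if room == cleaned[-1] or graph.has_edge(cleaned[-1], room):
            cleaned.append(room)
    return cleaned

print(filter_invalid(trajectory, topology))
# -> ['entrance', 'lobby', 'corridor', 'office']
```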


Author(s):  
Sahil Gupta ◽  
Eugene Saltanov ◽  
Igor Pioro

Canada, among many other countries, is pursuing the development of next-generation (Generation IV) nuclear-reactor concepts. One of the main objectives of Generation-IV concepts is to achieve high thermal efficiencies (45–50%). It has been proposed to use SuperCritical Fluids (SCFs) as the heat-transfer medium in Generation-IV reactor design concepts such as the SuperCritical Water-cooled Reactor (SCWR). An important step towards developing SCF applications in novel Generation-IV Nuclear Power Plant (NPP) designs is to understand the thermodynamic behavior of these fluids and to predict Heat Transfer Coefficients (HTCs) at supercritical (SC) conditions. To calculate forced-convection HTCs for simple geometries, a number of empirical 1-D correlations have been proposed using dimensional analysis. These 1-D HTC correlations are developed by applying data-fitting techniques to a model equation with dimensionless terms and can be used for rudimentary calculations. Using similar statistical techniques, three correlations were proposed by Gupta et al. [1] for Heat Transfer (HT) in SCCO2. These SCCO2 correlations were developed at the University of Ontario Institute of Technology (Canada) using a large set of experimental SCCO2 data (∼4,000 data points) obtained at the Chalk River Laboratories (CRL) of AECL. These correlations predict HTC values within ±30% and wall temperatures within ±20% for the analyzed dataset. Since these correlations were developed using data from a single source, CRL (AECL), they may be limited in their range of applicability. To investigate the practical applicability of these SCCO2 correlations, it was imperative to perform a thorough error analysis by checking their results against a set of independent SCCO2 tube data. In this paper, SCCO2 data are compiled from various sources covering various experimental flow conditions. HTC and wall-temperature values for these data points are calculated using the updated correlations presented in [1] and compared with the experimental values. An error analysis is then presented for these datasets to assess the applicability of the updated SCCO2 correlations.
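To make the correlation-fitting procedure concrete, the sketch below fits the classic dimensionless form Nu = a·Re^b·Pr^c to synthetic data by least squares; the coefficients and data are illustrative and are not the Gupta et al. [1] correlations.

```python
import numpy as np
from scipy.optimize import curve_fit

def nusselt(X, a, b, c):
    """Model equation with dimensionless terms: Nu = a * Re^b * Pr^c."""
    Re, Pr = X
    return a * Re**b * Pr**c

rng = np.random.default_rng(2)
Re = rng.uniform(1e4, 1e6, 4000)              # ~4,000 points, mirroring the dataset size
Pr = rng.uniform(0.7, 5.0, 4000)
Nu = 0.023 * Re**0.8 * Pr**0.4 * rng.normal(1.0, 0.05, 4000)  # noisy synthetic "data"

(a, b, c), _ = curve_fit(nusselt, (Re, Pr), Nu, p0=[0.02, 0.8, 0.4])
rel_err = (nusselt((Re, Pr), a, b, c) - Nu) / Nu
print(f"a={a:.4f}, b={b:.3f}, c={c:.3f}; "
      f"within ±30%: {np.mean(np.abs(rel_err) < 0.30):.1%}")
```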


2014 ◽  
Vol 7 (5) ◽  
pp. 1153-1167 ◽  
Author(s):  
K. Van Tricht ◽  
I. V. Gorodetskaya ◽  
S. Lhermitte ◽  
D. D. Turner ◽  
J. H. Schween ◽  
...  

Abstract. Optically thin ice and mixed-phase clouds play an important role in polar regions due to their radiative impact and their role in precipitation. Cloud-base heights can be detected by ceilometers, low-power backscatter lidars that run continuously and therefore have the potential to provide basic cloud statistics, including cloud frequency, base height and vertical structure. The standard cloud-base detection algorithms of ceilometers are designed to detect optically thick liquid-containing clouds, while the detection of thin ice clouds requires an alternative approach. This paper presents the polar threshold (PT) algorithm, which was developed to be sensitive to optically thin hydrometeor layers (minimum optical depth τ ≥ 0.01). The PT algorithm detects the first hydrometeor layer in a vertical attenuated-backscatter profile exceeding a predefined threshold, in combination with noise-reduction and averaging procedures. The optimal backscatter threshold of 3 × 10−4 km−1 sr−1 for cloud-base detection near the surface was derived from a sensitivity analysis using data from Princess Elisabeth, Antarctica, and Summit, Greenland. At higher altitudes, where the average noise level exceeds the backscatter threshold, the PT algorithm becomes signal-to-noise-ratio driven. The algorithm defines cloudy conditions as any atmospheric profile containing a hydrometeor layer at least 90 m thick. A comparison with relative humidity measurements from radiosondes at Summit illustrates the algorithm's ability to discriminate significantly between clear-sky and cloudy conditions. Analysis of the cloud statistics derived from the PT algorithm indicates a year-round monthly mean cloud cover fraction of 72% (±10%) at Summit, without a seasonal cycle. The occurrence of optically thick layers, indicating the presence of supercooled liquid water droplets, shows a seasonal cycle at Summit, with a monthly mean summer peak of 40% (±4%). The monthly mean cloud occurrence frequency in summer at Princess Elisabeth is 46% (±5%), which reduces to 12% (±2.5%) for supercooled liquid cloud layers. Our analyses furthermore illustrate the importance of optically thin hydrometeor layers located near the surface at both sites, with 87% of all detections below 500 m at Summit and 80% below 2 km at Princess Elisabeth. These results have implications for satellite-based remotely sensed cloud observations, for example from CloudSat, which may be insensitive to hydrometeors near the surface. The decrease of sensitivity with height, an inherent limitation of the ceilometer, does not have a significant impact on our results. This study highlights the potential of the PT algorithm to extract information on various hydrometeor layers in polar regions using measurements from the robust and relatively low-cost ceilometer instrument.
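The core of the PT algorithm's layer detection can be sketched as follows, on a synthetic profile; the published algorithm additionally applies noise reduction, averaging, and an SNR-driven mode at altitude.

```python
import numpy as np

THRESHOLD = 3e-4            # km^-1 sr^-1, near-surface backscatter threshold
DZ = 30                     # vertical bin size in metres (assumed)
MIN_THICKNESS = 90          # minimum hydrometeor-layer thickness in metres

def first_cloud_base(profile, heights):
    """Return the base height of the first layer exceeding THRESHOLD
    that is at least MIN_THICKNESS thick, or None for a clear-sky profile."""
    run_start = None
    for i, flagged in enumerate(profile > THRESHOLD):
        if flagged and run_start is None:
            run_start = i
        elif not flagged and run_start is not None:
            if (i - run_start) * DZ >= MIN_THICKNESS:
                return heights[run_start]
            run_start = None
    return None

heights = np.arange(0, 7500, DZ)
profile = np.full(heights.shape, 1e-5)
profile[40:45] = 5e-4                         # 150 m thick synthetic layer at 1200 m
print(first_cloud_base(profile, heights))     # -> 1200
```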


2020 ◽  
Vol 15 ◽  
pp. 155892502097726
Author(s):  
Wei Wang ◽  
Zhiqiang Pang ◽  
Ling Peng ◽  
Fei Hu

Performing real-time monitoring of human vital signs during sleep at home is of vital importance for timely detection and rescue. However, existing smart equipment for monitoring human vital signs suffers from high complexity, high cost, intrusiveness, or low accuracy. Thus, there is a great need for a simplified, nonintrusive, comfortable and low-cost real-time monitoring system for sleep. In this study, a novel intelligent pillow was developed based on a low-cost piezoelectric ceramic sensor. It was manufactured by placing a smart system (consisting of a sensing unit, i.e. a piezoelectric ceramic sensor; a data-processing unit; and a GPRS communication module) in the cavity of a pillow made of shape-memory foam. The sampling frequency of the intelligent pillow was set at 1000 Hz to capture the signals more accurately, and vital signs including heart rate, respiratory rate and body movement were derived through a series of well-established algorithms and sent to the user’s app. Validation experiments demonstrated high heart-rate detection accuracy (99.18%) with the intelligent pillow. In addition, human tests were conducted by detecting the vital signs of six elderly participants at home, and the results showed that the detected vital signs may well predict their health conditions. No contact discomfort was reported by the participants. With further studies of the validity of the intelligent pillow and large-scale human trials, the proposed intelligent pillow is expected to play an important role in daily sleep monitoring.
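As a hedged illustration of how heart rate might be derived from such a 1000 Hz piezoelectric signal, the sketch below band-pass filters a synthetic signal and counts peaks; the filter band and refractory period are assumptions, not the paper's algorithms.

```python
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

FS = 1000                                     # sampling frequency, Hz
t = np.arange(0, 30, 1 / FS)                  # 30 s analysis window
rng = np.random.default_rng(3)
raw = np.sin(2 * np.pi * 1.2 * t) + 0.3 * rng.normal(size=t.size)  # ~72 bpm + noise

b, a = butter(2, [0.8, 3.0], btype="bandpass", fs=FS)   # ~48-180 bpm passband
filtered = filtfilt(b, a, raw)

peaks, _ = find_peaks(filtered, distance=0.4 * FS)      # refractory period >= 0.4 s
print(f"estimated heart rate: {len(peaks) / 30 * 60:.0f} bpm")
```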


2019 ◽  
Author(s):  
Philip Held ◽  
Randy A Boley ◽  
Walter G Faig ◽  
John A O'Toole ◽  
Imran Desai ◽  
...  

Electronic health records (EHRs) offer opportunities for research and improvements in patient care. However, challenges exist in using data from EHRs due to the volume of information contained within clinical notes, which can be labor-intensive and costly to transform into usable data with existing strategies. This case report details the collaborative development and implementation of the postencounter form (PEF) system into the EHR at the Road Home Program at Rush University Medical Center in Chicago, IL to address these concerns with limited burden to clinical workflows. The PEF system proved to be an effective tool, with over 98% of all clinical encounters including a completed PEF within 5 months of implementation. In addition, the system has generated over 325,188 unique, readily accessible data points in under 4 years of use. The PEF system has since been deployed to other settings, demonstrating that it may have broader clinical utility.


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Johnathan Kongoletos ◽  
Ethan Munden ◽  
Jennifer Ballew ◽  
Daniel J. Preston

Abstract Ventilation, including fume hoods, consumes 40–70% of the total energy used by modern laboratories. Energy-conscious fume hood usage—for example, closing the sash when a hood is unused—can significantly reduce energy expenditures due to ventilation. Prior approaches to promoting such behaviors among lab users have relied primarily on passive feedback methods. In this work, we developed a low-cost fume hood monitoring device with active feedback to alert lab users when a fume hood is left open and unused. Using data collected by the building management system, we observed a 75.6% decrease in the average sash height after installation of these “Motion and Sash Height” (MASH) alarms, which, for each fume hood, would yield a reduction roughly equal to 43% of the annual carbon emissions of a typical American vehicle. The MASH alarm presented here reduced energy costs by approximately $1,159 per year, per hood, at MIT.
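The alarm condition itself is simple to state; the following is a minimal sketch with assumed thresholds, not the device's actual firmware logic.

```python
from dataclasses import dataclass

SASH_OPEN_CM = 10           # sash height treated as "open" (assumed threshold)
IDLE_SECONDS = 300          # no-motion period before alerting (assumed threshold)

@dataclass
class HoodState:
    sash_height_cm: float
    seconds_since_motion: float

def should_alarm(state: HoodState) -> bool:
    """Alert when the hood is open but apparently unattended."""
    return (state.sash_height_cm > SASH_OPEN_CM
            and state.seconds_since_motion > IDLE_SECONDS)

print(should_alarm(HoodState(sash_height_cm=45, seconds_since_motion=600)))   # True
print(should_alarm(HoodState(sash_height_cm=5, seconds_since_motion=600)))    # False
```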


2021 ◽  
Author(s):  
Mathias Riechel ◽  
Oriol Gutierrez ◽  
Silvia Busquets ◽  
Neus Amela ◽  
Valentina Dimova ◽  
...  

The H2020 innovation project digital-water.city (DWC) aims at boosting the integrated management of water systems in five major European cities – Berlin, Copenhagen, Milan, Paris and Sofia – by leveraging the potential of data and digital technologies. The goal is to quantify the benefits of a panel of 15 innovative digital solutions and to achieve their long-term uptake and successful integration into existing digital systems and governance processes. One of these promising technologies is a new generation of sensors for measuring combined sewer overflow (CSO) occurrence, developed by ICRA and IoTsens.

Recent EU regulations have correctly identified CSOs as an important source of contamination and promote appropriate monitoring of all CSO structures in order to control and avoid detrimental effects on receiving waters. Traditionally there has been a lack of reliable data on the occurrence of CSOs, the main limitations being: i) the high number of CSO structures per municipality or catchment and ii) the high cost of the flow-monitoring equipment available on the market to measure CSO events. These two factors, together with the technical constraints of accessing and installing monitoring equipment in some CSO structures, have delayed the implementation of extensive CSO monitoring. As a result, utilities lack information about the behaviour of the network and potential impacts on local water bodies.

The new sensor technology developed by ICRA and IoTsens provides a simple yet robust method for CSO detection based on the deployment of a network of innovative low-cost temperature sensors. The technology reduces CAPEX and OPEX for CSO monitoring compared with classical flow or water-level measurements, allowing utilities to monitor their networks extensively. The sensors are installed at the overflow crest and measure air temperature during dry-weather conditions and water temperature when the overflow crest is submerged during a CSO event. A CSO event and its duration can be detected by a shift in the observed temperature, thanks to the temperature difference between the air and the water phase. Artificial-intelligence algorithms further help to convert the continuous measurements into binary information on CSO occurrence. The sensors can quantify CSO occurrence and duration and remotely provide real-time overflow information through LoRaWAN/2G communication protocols.

The solution has been deployed since October 2020 in the cities of Sofia, Bulgaria, and Berlin, Germany, with 10 offline sensors installed in each city to improve knowledge of CSO emissions. A further 36 online sensors in Sofia and 9 in Berlin will follow this winter. Besides the main goal of improving knowledge of CSO emissions, the data in Sofia will also be used to identify suspected dry-weather overflows due to blockages. In Berlin, the data will be used to improve the accuracy of an existing hydrodynamic sewer model for resilience analysis, flood forecasting and efficient investment in stormwater-management measures. First results show good detection accuracy for CSO events with the offline version of the technology. As measurements are ongoing and further sensors will be added, an enhanced set of results will be presented at the conference.

Visit us: https://www.digital-water.city/

Follow us: Twitter (@digitalwater_eu); LinkedIn (digital-water.city)
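A toy version of the temperature-shift detection idea, with an invented change-point rule and thresholds; the deployed system uses artificial-intelligence algorithms, as noted above.

```python
import numpy as np

def detect_cso(temps, window=6, jump=3.0):
    """Flag samples whose rolling mean departs from the dry-weather
    baseline by more than `jump` degrees Celsius."""
    temps = np.asarray(temps, dtype=float)
    baseline = temps[:window].mean()          # dry-weather air-temperature reference
    events = np.zeros(temps.size, dtype=bool)
    for i in range(window, temps.size):
        recent = temps[i - window + 1:i + 1].mean()
        events[i] = abs(recent - baseline) > jump
    return events

# 5-minute samples: air at ~18 degC, then a submerged crest at ~12 degC
series = [18.2, 18.0, 18.1, 17.9, 18.0, 18.1, 12.3, 12.1, 12.0, 12.2, 12.1, 12.4]
print(detect_cso(series).astype(int))         # overflow flagged once the shift persists
```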


2021 ◽  
Vol 2 (4) ◽  
pp. 1-28
Author(s):  
Anderson Bessa Da Costa ◽  
Larissa Moreira ◽  
Daniel Ciampi De Andrade ◽  
Adriano Veloso ◽  
Nivio Ziviani

Modeling from data usually has two distinct facets: building sound explanatory models or creating powerful predictive models for a system or phenomenon. Most recent literature does not exploit the relationship between explanation and prediction when learning models from data. Recent algorithms do not take advantage of the fact that many phenomena are actually defined by diverse sub-populations and local structures, so there are many possible predictive models providing contrasting interpretations or competing explanations for the same phenomenon. In this article, we propose to explore a complementary link between explanation and prediction. Our main intuition is that models whose decisions are explained by the same factors are likely to perform better predictions for data points within the same local structures. We evaluate our methodology by modeling the evolution of pain relief in patients suffering from chronic pain under usual guideline-based treatment. The ensembles generated using our framework are compared with all-in-one approaches using algorithms robust to high-dimensional data, such as Random Forests and XGBoost. Chronic pain can be primary or secondary to diseases. Its symptomatology can be classified as nociceptive, nociplastic, or neuropathic, and it is generally associated with many different causal structures, challenging typical modeling methodologies. Our data include 631 patients receiving pain treatment. We considered 338 features providing information about pain sensation, socioeconomic status, and prescribed treatments. Our goal is to predict, using data from the first consultation only, whether the patient will be successful in treatment for chronic pain relief. As a result of this work, we were able to build ensembles that consistently improve performance by up to 33% compared with models trained using all the available features. We also obtained relevant gains in interpretability, with the resulting ensembles using only 15% of the total number of features. We show that we can effectively generate ensembles from competing explanations, promoting diversity in ensemble learning and leading to significant gains in accuracy, by enforcing a stable scenario in which models that are dissimilar in terms of their predictions are also dissimilar in terms of their explanation factors.
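A loose sketch of the paper's intuition: group models whose explanations (here, feature-importance vectors) agree, then ensemble within the group. The data, model count, and cosine-similarity grouping rule are assumptions, not the authors' framework.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics.pairwise import cosine_similarity

# Synthetic stand-in for the cohort: 631 samples, as in the paper, but fake features
X, y = make_classification(n_samples=631, n_features=50, n_informative=10,
                           random_state=0)

# Train several candidate models on bootstrap resamples
rng = np.random.default_rng(0)
models = []
for seed in range(8):
    idx = rng.integers(0, len(X), len(X))
    models.append(RandomForestClassifier(n_estimators=50,
                                         random_state=seed).fit(X[idx], y[idx]))

# "Explanations": per-model feature-importance vectors, compared pairwise
importances = np.array([m.feature_importances_ for m in models])
similarity = cosine_similarity(importances)

# Ensemble only the models whose explanations agree most with model 0
group = np.argsort(similarity[0])[::-1][:4]
proba = np.mean([models[i].predict_proba(X)[:, 1] for i in group], axis=0)
print("grouped models:", group, "first probabilities:", proba[:3].round(2))
```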

