state data
Recently Published Documents

2022 ◽  
pp. 0272989X2110699
Louise B. Russell ◽  
Qian Huang ◽  
Yuqing Lin ◽  
Laurie A. Norton ◽  
Jingsan Zhu ◽  

Introduction. Pragmatic clinical trials test interventions in patients representative of real-world medical practice and reduce data collection costs by using data recorded in the electronic health record (EHR) during usual care. We describe our experience using the EHR to measure the primary outcome of a pragmatic trial, hospital readmissions, and important clinical covariates. Methods. The trial enrolled patients recently discharged from the hospital for treatment of heart failure to test whether automated daily monitoring integrated into the EHR could reduce readmissions. The study team used data from the EHR and several data systems that drew on the EHR, supplemented by the hospital admissions files of three states. Results. Almost three-quarters of enrollees' readmissions over the 12-mo trial period were captured by the EHRs of the study hospitals. State data, which took 7 mo to more than 2 y from first contact to receipt of first data, provided the remaining one-quarter. Considerable expertise was required to resolve differences between the 2 data sources. Common covariates used in trial analyses, such as weight and body mass index during the index hospital stay, were available for >97% of enrollees from the EHR. Ejection fraction, obtained from echocardiograms, was available for only 47.6% of enrollees within the 6-mo window that would likely be expected in a traditional trial. Discussion. In this trial, patient characteristics and outcomes were collected from existing EHR systems, but, as usual for EHRs, they could not be standardized for date or method of measurement and required substantial time and expertise to collect and curate. Hospital admissions, the primary trial outcome, required additional effort to locate and use supplementary sources of data.
Highlights. Electronic health records are not a single system but a series of overlapping and legacy systems that require time and expertise to use efficiently. Commonly measured patient characteristics such as weight and body mass index are relatively easy to locate for most trial enrollees, but less common characteristics, like ejection fraction, are not. Acquiring essential supplementary data—in this trial, state data on hospital admissions—can be a lengthy and difficult process.

Karoliina Snell ◽  
Heta Tarkkala ◽  
Aaro Tupasela

Nordic welfare states have well-institutionalised practices of gathering health and social wellbeing data from their citizens. The establishment of population registers coincided with the building of welfare state institutions and a social contract relying on solidarity. During the last decade, the significance of Nordic registers and health data has increased, and they have become sources of economic value. Recent policies expect registers, health data and biobanks to attract international investments, making the Nordic countries world leaders in the global health data economy. In this article we question the conditions and boundaries of solidarity in the emerging data-driven health economy. We argue that the logics of the welfare state and the data-driven health economy create a paradox – the data economy is not possible without the welfare state data regime, but the logic of the data-driven health economy contradicts the value bases of the welfare state data regime, and therefore the justifications for data gathering and use become questionable. We develop the concept of solidarization to describe the process by which individuals are expected to behave in a solidaristic way to support data gathering and related policy processes. We demonstrate the solidarity paradox through a recent legislative and data infrastructure reform in Finland and discuss it in relation to the academic literature on solidarity.

2021 ◽  
Vol 2021 ◽  
pp. 1-10
Shan Lin ◽  
Liping Liu ◽  
Meiwan Rao ◽  
Shu Deng ◽  
Jiaxin Wang ◽  

To make an accurate and comprehensive evaluation of the catenary and diagnose the causes of catenary faults, a method of catenary state evaluation and diagnosis based on principal component analysis control charts was proposed, which makes full use of the multidimensional detection parameters of the catenary. Principal component analysis was used to reduce the dimension of the catenary parameters; the principal component T2 control chart was calculated to show changes in the principal components of the catenary state data; the residual SPE control chart was calculated to show changes in their correlation; and the contribution-rate control chart was calculated to show the cause of abnormal state data. The method not only transforms the multidimensional detection parameters of the catenary into a single statistic, allowing simple and intuitive evaluation of the catenary state, but also accurately determines the cause of an abnormal state, providing technical support for targeted condition-based maintenance of the catenary.
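The Hotelling T2 and residual SPE statistics described in this abstract are standard PCA monitoring quantities; a minimal sketch follows, assuming a generic sample matrix of detection parameters (the paper's actual parameters, dimensions, and control limits are not given in the abstract).

```python
import numpy as np

def pca_control_stats(X, n_components=2):
    """Compute Hotelling T2 and SPE (Q) statistics for each sample.

    X is an (n_samples, n_features) matrix of detection parameters;
    the feature set and number of retained components are illustrative.
    """
    # Standardize so each detection parameter has zero mean, unit variance
    mu, sigma = X.mean(axis=0), X.std(axis=0)
    Z = (X - mu) / sigma

    # Principal components via SVD of the standardized data
    U, S, Vt = np.linalg.svd(Z, full_matrices=False)
    P = Vt[:n_components].T                        # loadings (p x k)
    lam = (S[:n_components] ** 2) / (len(X) - 1)   # component variances

    T = Z @ P                                      # scores (n x k)
    t2 = np.sum(T ** 2 / lam, axis=1)              # Hotelling T2 per sample
    residual = Z - T @ P.T                         # part outside the PC model
    spe = np.sum(residual ** 2, axis=1)            # SPE (Q) per sample
    return t2, spe
```

Samples whose T2 or SPE exceed a control limit (typically derived from an F or chi-square approximation) would be flagged as abnormal, and per-variable contributions to the exceeding statistic indicate the likely cause.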

2021 ◽  
Vol 62 ◽  
pp. 27-37
Ramunė Vaišnorė ◽  
Audronė Jakaitienė

Currently the world is threatened by the global COVID-19 pandemic, and the crisis it has induced has created many disruptions in the healthcare system, social life and the economy. In this article we present an analysis of the COVID-19 situation in Lithuania and its municipalities, taking into consideration the effect of non-pharmaceutical interventions on the reproduction number. We analysed the period from 20/03/2020 to 20/06/2021, covering the two quarantines applied in Lithuania. We calculated the reproduction number using the incidence data provided by the State Data Governance Information System, while the information on applied non-pharmaceutical interventions was extracted from the Oxford COVID-19 Government Response Tracker and the COVID-19 website of the Government of the Republic of Lithuania. A positive effect of the applied non-pharmaceutical interventions on the reproduction number was observed when the internal movement ban was introduced on 16/12/2020, during the second quarantine in Lithuania.
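A common way to compute an instantaneous reproduction number from incidence data is the ratio of new cases to recent cases weighted by the serial-interval distribution (the Cori et al. estimator, here without its Bayesian smoothing). This is a generic sketch, not the paper's exact method, and the serial-interval distribution is an assumption supplied by the caller.

```python
import numpy as np

def estimate_rt(incidence, serial_interval_pmf):
    """Crude instantaneous R_t = I_t / Lambda_t, where Lambda_t is the
    incidence convolved with the serial-interval distribution.

    incidence           : daily case counts
    serial_interval_pmf : weights for lags 1, 2, ... (renormalized here)
    """
    w = np.asarray(serial_interval_pmf, dtype=float)
    w = w / w.sum()
    I = np.asarray(incidence, dtype=float)
    rt = np.full(len(I), np.nan)
    for t in range(1, len(I)):
        s = min(t, len(w))
        # Lambda_t = sum_s w_s * I_{t-s}: total infectiousness from recent cases
        lam = np.dot(w[:s], I[t - 1 :: -1][:s])
        if lam > 0:
            rt[t] = I[t] / lam
    return rt
```

With flat incidence the estimate settles at 1.0; values below 1 after an intervention date would indicate the kind of positive effect the abstract reports.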

2021 ◽  
Vol 44 (2) ◽  
pp. 153-169
Aurimas Šidlauskas

The implementation of the EU General Data Protection Regulation (hereinafter referred to as the Regulation), which, among other things, aims to eliminate disparities between national systems and to alleviate unnecessary administrative burdens, began on 25 May 2018. Each Member State is to ensure that there are one or more independent public authorities (hereinafter referred to as supervisory authorities) responsible for monitoring the implementation of the Regulation. In Lithuania, personal data protection is supervised by two authorities, namely the State Data Protection Inspectorate (hereinafter referred to as the SDPI) and the Office of the Inspector of Journalist Ethics. The powers conferred on the supervisory authorities by the Regulation are greater and broader in scope than those granted under previous data protection legislation. Organizations which process personal data must ensure compliance with the requirements laid down in the Regulation. An organization that violates the provisions of the Regulation may face heavy administrative fines and other sanctions. This article analyzes the practice of imposing administrative fines in the EU and in Lithuania as compared to other EU Member States. The author of the article believes that evaluating the practice of imposing administrative fines by the SDPI within the general context of the EU will enable one to search for the reasons behind the current situation, as well as to improve the processes the SDPI employs to perform functions associated with data protection supervision. The article uses generalization and comparative analysis of scientific literature, legal documents and statistical data.

2021 ◽  
Vol 268 ◽  
pp. 540-545
Andrew Joseph Young ◽  
Elinore Kaufman ◽  
Allison Hare ◽  
Madhu Subramanian ◽  
Jane Keating ◽  

2021 ◽  
Vol 15 ◽  
Karun Thanjavur ◽  
Dionissios T. Hristopulos ◽  
Arif Babul ◽  
Kwang Moo Yi ◽  
Naznin Virji-Babul

Artificial neural networks (ANNs) are showing increasing promise as decision support tools in medicine, particularly in neuroscience and neuroimaging. Recently, there has been increasing work on using neural networks to classify individuals with concussion using electroencephalography (EEG) data. However, to date the need for research-grade equipment has limited applications in clinical environments. We recently developed a deep learning long short-term memory (LSTM) based recurrent neural network to classify concussion using raw resting-state data from 64 EEG channels and achieved high accuracy in classifying concussion. Here, we report on our efforts to develop a clinically practical system using a minimal subset of EEG sensors. EEG data from 23 athletes who had suffered a sport-related concussion and 35 non-concussed control athletes were used for this study. We tested and ranked each of the original 64 channels based on its contribution toward the concussion classification performed by the original LSTM network. The top-scoring channels were used to train and test a network with the same architecture as the previously trained network. We found that with only six of the top-scoring channels the classifier identified concussions with an accuracy of 94%. These results show that it is possible to classify concussion using raw resting-state data from a small number of EEG sensors, constituting a first step toward developing portable, easy-to-use EEG systems that can be used in a clinical setting.
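One simple way to rank channels by their contribution to a trained classifier, in the spirit of the per-channel ranking described above, is channel ablation: zero out one channel at a time and measure the accuracy drop. This is an illustrative proxy, not the authors' actual ranking procedure, and `score_fn` stands in for evaluating the trained LSTM.

```python
import numpy as np

def rank_channels(score_fn, X, y, n_channels):
    """Rank channels by the accuracy drop when each is zeroed out.

    score_fn   : callable (X, y) -> accuracy of a trained classifier
    X          : (n_trials, n_timesteps, n_channels) EEG array
    Returns channel indices, largest accuracy drop first.
    """
    base = score_fn(X, y)
    drops = []
    for ch in range(n_channels):
        X_abl = X.copy()
        X_abl[..., ch] = 0.0              # occlude a single channel
        drops.append(base - score_fn(X_abl, y))
    return np.argsort(drops)[::-1]        # most important channels first
```

The top few indices returned would then define the reduced sensor montage used to retrain the smaller network.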

2021 ◽  
Vol 2125 (1) ◽  
pp. 012033
Yongbo Cheng ◽  
Jianzhong Hong ◽  
Xing Fu ◽  
Dianchen Zheng ◽  
Jianquan Zhang

Abstract There are many parameters that could reflect the operating state of a geotechnical centrifuge. However, generally only one parameter is monitored; this is insufficient and unsafe for the running of the geotechnical centrifuge. This paper puts forward an automatic running-state monitoring method based on weighted fusion of multiple parameters. Multiple sensors acquire the running-state data of the geotechnical centrifuge, and processing these data with weighted data fusion produces a comprehensive running-state parameter, which is fed forward to the control system to keep the equipment running in a safe manner. The method can be implemented automatically, the resulting safety monitoring is sufficient, and the approach is much more efficient.
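A standard weighted-fusion rule for combining multi-sensor readings into one comprehensive value is inverse-variance weighting, where less noisy sensors get more weight. The abstract does not specify the paper's exact weighting scheme, so this is a generic sketch.

```python
def fuse_weighted(readings, variances):
    """Inverse-variance weighted fusion of multi-sensor readings.

    readings  : one reading per sensor (same physical quantity)
    variances : estimated noise variance per sensor; smaller variance
                means a more trusted sensor and hence a larger weight
    """
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    # Weighted average: the fused "comprehensive running-state parameter"
    return sum(w * r for w, r in zip(weights, readings)) / total
```

Two equally noisy sensors average evenly; a sensor with four times the variance contributes a quarter of the weight.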

2021 ◽  
Hamze Dokoohaki ◽  
Bailey D. Morrison ◽  
Ann Raiho ◽  
Shawn P. Serbin ◽  
Michael Dietze

Abstract. The ability to monitor, understand, and predict the dynamics of the terrestrial carbon cycle requires the capacity to robustly and coherently synthesize multiple streams of information that each provide partial information about different pools and fluxes. In this study, we introduce a new terrestrial carbon cycle data assimilation system, built on the PEcAn model-data eco-informatics system, and its application for the development of a proof-of-concept carbon "reanalysis" product that harmonizes carbon pools (leaf, wood, soil) and fluxes (GPP, Ra, Rh, NEE) across the contiguous United States from 1986–2019. We first calibrated this system against plant trait and flux tower Net Ecosystem Exchange (NEE) using a novel emulated hierarchical Bayesian approach. Next, we extended the Tobit-Wishart Ensemble Filter (TWEnF) State Data Assimilation (SDA) framework, a generalization of the common Ensemble Kalman Filter which accounts for censored data and provides a fully Bayesian estimate of model process error, to a regional-scale system with a calibrated localization. Combined with additional workflows for propagating parameter, initial condition, and driver uncertainty, this represents the most complete and robust uncertainty accounting available for terrestrial carbon models. Our initial reanalysis was run on an irregular grid of ~500 points selected using a stratified sampling method to efficiently capture environmental heterogeneity. Remotely sensed observations of aboveground biomass (Landsat LandTrendr) and LAI (MODIS MOD15) were sequentially assimilated into the SIPNET model. Reanalysis soil carbon, which was indirectly constrained based on modeled covariances, showed general agreement with SoilGrids, an independent soil carbon data product. Reanalysis NEE, which was constrained based on posterior ensemble weights, also showed good agreement with eddy flux tower NEE and reduced RMSE compared to the calibrated forecast. 
Ultimately, PEcAn's carbon cycle reanalysis provides a scalable framework for harmonizing multiple data constraints and providing a uniform synthetic platform for carbon monitoring, reporting, and verification (MRV) and accelerating terrestrial carbon cycle research.
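The common Ensemble Kalman Filter that TWEnF generalizes can be sketched in a few lines; this is the textbook perturbed-observation form, with the censored-data handling and Wishart process-error model described above omitted, and with a linear observation operator and all dimensions chosen purely for illustration.

```python
import numpy as np

def enkf_update(ensemble, obs, obs_var, H):
    """One perturbed-observation EnKF analysis step.

    ensemble : (n_ens, n_state) forecast ensemble (e.g. carbon pools)
    obs      : (n_obs,) observation vector (e.g. biomass, LAI)
    obs_var  : (n_obs,) observation error variances
    H        : (n_obs, n_state) linear observation operator
    """
    n_ens = ensemble.shape[0]
    X = ensemble - ensemble.mean(axis=0)            # ensemble anomalies
    P = X.T @ X / (n_ens - 1)                       # sample forecast covariance
    R = np.diag(obs_var)
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)    # Kalman gain
    # Perturb observations so the analysis ensemble has correct spread
    rng = np.random.default_rng(0)
    perturbed = obs + rng.normal(0.0, np.sqrt(obs_var), (n_ens, len(obs)))
    return ensemble + (perturbed - ensemble @ H.T) @ K.T
```

The sample covariance P is what lets unobserved states (such as soil carbon in the reanalysis) be constrained indirectly through their modeled covariance with observed quantities.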

2021 ◽  
Vol 4 (4) ◽  
Givanildo De Gois ◽  
José Francisco De Oliveira-Júnior

The goal was to perform gap filling, consistency checking and processing of the rainfall time series data from 1943 to 2013 in five regions of the state. Data were obtained from several sources (ANA, CPRM, INMET, SERLA and LIGHT), totaling 23 stations. The time series (raw data) showed gaps, which were filled with data from the TRMM satellite via the 3B43 product and with the climatological normals from INMET. The 3B43 product was used for 1998 to 2013 and the climatological normals for the 1947–1997 period. Data from the 23 stations were submitted to descriptive and exploratory analysis, parametric tests (Shapiro-Wilk and Bartlett), cluster analysis (CA), and Box-Cox processing. Descriptive analysis of the raw-data consistency showed a probability of occurrence above 75% (high temporal variability). Through the CA, two homogeneous rainfall groups (G1 and G2) were defined; G1 and G2 represent 77.01% and 22.99% of the rainfall occurring in the SRJ, respectively. Box-Cox processing was effective in stabilizing the normality of the residuals and the homogeneity of variance of the monthly rainfall time series of the five regions of the state. Data from the 3B43 product and the climatological normals can be used as an alternative source of quality data for gap filling.
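The Box-Cox transform used here to stabilize normality and variance is a standard power transform; a minimal sketch follows, with the lambda parameter passed in (in practice it would be estimated, e.g. by maximum likelihood, as routines like SciPy's `scipy.stats.boxcox` do).

```python
import numpy as np

def box_cox(x, lam):
    """Box-Cox power transform of strictly positive data.

    y = (x**lam - 1) / lam for lam != 0, and y = log(x) for lam == 0
    (the lam -> 0 limit), which compresses large values and stabilizes
    the variance of skewed series such as monthly rainfall totals.
    """
    x = np.asarray(x, dtype=float)
    if lam == 0:
        return np.log(x)
    return (x ** lam - 1.0) / lam
```

After transformation, residual normality (Shapiro-Wilk) and variance homogeneity (Bartlett) can be re-tested on the transformed series, as the abstract describes.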
