Data Integrity Preservation Schemes in Smart Healthcare Systems That Use Fog Computing Distribution

Electronics ◽  
2021 ◽  
Vol 10 (11) ◽  
pp. 1314
Author(s):  
Abdulwahab Alazeb ◽  
Brajendra Panda ◽  
Sultan Almakdi ◽  
Mohammed Alshehri

The volume of data generated worldwide is growing rapidly. Cloud computing, fog computing, and Internet of Things (IoT) technologies have been adapted to compute and process this high data volume. In the coming years, information technology will enable extensive developments in healthcare and offer health care providers and patients broader opportunities to enhance their healthcare experiences and services, owing to heightened availability and enriched services through real-time data exchange. As promising as these technological innovations are, security issues such as data integrity and data consistency remain widely unaddressed, so it is important to engineer a solution to them. Developing a damage assessment and recovery control model for fog computing is critical. This paper proposes two models for using fog computing in healthcare: one for private fog computing distribution and one for public fog computing distribution. For each model, we propose a unique scheme to assess the damage caused by malicious attacks, accurately identify affected transactions, and recover damaged data if needed. A transaction-dependency graph technique is used in both models to observe and monitor all transactions in the whole system. We conducted a simulation study to assess the applicability and efficacy of the proposed models; the evaluation showed both models to be practical and effective.
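The transaction-dependency graph idea can be illustrated with a small sketch (the transaction names and function are hypothetical illustrations, not the paper's actual schemes): each edge points from a writer transaction to a transaction that later read its data, and damage is assessed by traversing the graph from the known malicious transactions.

```python
from collections import defaultdict, deque

def assess_damage(dependencies, malicious):
    """Given (writer, reader) dependency edges, return the set of
    transactions transitively affected by the malicious ones (BFS)."""
    children = defaultdict(list)
    for writer, reader in dependencies:
        children[writer].append(reader)
    affected, queue = set(malicious), deque(malicious)
    while queue:
        t = queue.popleft()
        for reader in children[t]:
            if reader not in affected:
                affected.add(reader)
                queue.append(reader)
    return affected

# T2 read data written by T1, T3 read from T2, T4 is independent.
deps = [("T1", "T2"), ("T2", "T3")]
print(sorted(assess_damage(deps, {"T1"})))  # ['T1', 'T2', 'T3']
```

Only the affected set then needs recovery; an unrelated transaction such as T4 is left untouched, which is what makes the graph-based assessment cheaper than a full rollback.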


2020 ◽  
Author(s):  
Palash Sharma ◽  
Robert N. Montgomery ◽  
Rasinio S. Graves ◽  
Kayla Meyer ◽  
Suzanne L Hunt ◽  
...  

Abstract Background: The University of Kansas Alzheimer’s Disease Center (KU ADC) maintains several large databases to track participant recruitment and enrollment and to capture various research-related activities. It is challenging to manage and coordinate all these activities. One of the crucial tasks involves capturing the data and maintaining high data quality, which ensures data reusability and reproducibility. Methods: To effectively manage the cohort, the KU ADC utilizes a combination of an open-source Electronic Data Capture (EDC) system (i.e., REDCap) along with other homegrown data management and visualization systems developed using RStudio and Shiny. Results: In this manuscript, we describe the method and utility of the user-friendly dashboard that was developed for the rapid reporting of dementia evaluations along with data visualization, which allows clinical researchers to summarize recruitment metrics, automatically generate letters to both participants and health care providers, and depict other key metrics, ultimately helping optimize workflows. Conclusions: We believe this general framework would be beneficial to any institution capturing and maintaining similar longitudinal databases for reporting and summarizing key metrics pertaining to their research.



2020 ◽  
Author(s):  
Jennifer Dickman Portz ◽  
Kelsey Lynett Ford ◽  
Kira Elsbernd ◽  
Christopher E Knoepke ◽  
Kelsey Flint ◽  
...  

BACKGROUND Many mobile health (mHealth) technologies exist for patients with heart failure (HF). However, HF mHealth lacks evidence of efficacy, caregiver involvement, and clinically useful real-time data. OBJECTIVE We aim to capture health care providers’ perceived value of HF mHealth, particularly for pairing patient–caregiver-generated data with clinical intervention, to inform the design of future HF mHealth. METHODS This study is a subanalysis of a larger qualitative study based on interviews with patients with HF, their caregivers, and health care providers. This analysis included interviews with health care providers (N=20), focusing on their perceived usefulness of HF mHealth tools and interventions. RESULTS A total of 5 themes emerged: (1) bio-psychosocial-spiritual monitoring, (2) use of sensors, (3) interoperability, (4) data sharing, and (5) usefulness of patient-reported outcomes in practice. Providers remain interested in mHealth technologies for HF patients and their caregivers. However, providers report being unconvinced of the clinical usefulness of robust real-time patient-reported outcomes. CONCLUSIONS The use of assessments, sensors, and real-time data collection could provide value in patient care. Future research must continually explore how to maximize the utility of mHealth for HF patients, their caregivers, and health care providers.



Software engineering has been used by software vendors and consultants in the development of quality health care applications, ranging from electronic medical systems and patient record management applications to medical middleware devices. As a discipline, it has evolved over the last decade in the production of high-quality software across many industries. Healthcare applications demand unique expertise tailored to the best-fitting project methodologies and software development models. Many health care providers argue that best software practices and user-centered design principles are vital to producing quality applications across all domains. A lack of focus on a systematic software development process increases flaws in the implementation, causing losses in quality, cost, and trust. This survey paper analyzes existing models in the area of software engineering and proposes the best-suited SDLC model for smart healthcare applications, with a focus on quality improvement. The survey also identifies the research challenges of software engineering for smart applications.



2020 ◽  
Author(s):  
Jurgen Bosch ◽  
Austin Wilson ◽  
Karthik O'Neil ◽  
Peter A Zimmerman

Background Given the global public health importance of the COVID-19 pandemic, data comparisons that predict ongoing infection and mortality trends across national, state, and county-level administrative jurisdictions are vitally important. We have designed a COVID-19 dashboard with the goal of providing concise sets of summarized data presentations to simplify interpretation of basic statistics and location-specific current and short-term future risks of infection. Methods We perform continuous collection and analyses of publicly available data accessible through the COVID-19 dashboard hosted at Johns Hopkins University (JHU GitHub). Additionally, we utilize the accumulation of cases and deaths to provide dynamic 7-day short-term predictions of these outcomes across the national, state, and county administrative levels. Findings COVID-19Predict produces 2,100 daily predictions [or calculations] on the state level (50 states × 3 models × 7 days × 2 outcomes: cases and deaths) and 131,964 on the county level (3,142 counties × 3 models × 7 days × 2 outcomes). To assess how robustly our models have performed in making short-term predictions over the course of the pandemic, we used available case data for all 50 U.S. states spanning the period January 20 to August 16, 2020 in a retrospective analysis. Results showed mean errors of deviation between predicted and actual cases ranging from 3.7% to −0.2% to date. Interpretation Our transparent methods and administrative-level visualizations provide real-time data reporting and forecasts related to ongoing COVID-19 transmission, allowing viewers (individuals, health care providers, public health practitioners, and policy makers) to develop their own perspectives and expectations regarding public life activity decisions.
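The kind of 7-day short-term projection and error metric described can be illustrated with a minimal sketch (the dashboard's actual models are not specified here; the naive linear extrapolation and the function names below are assumptions):

```python
def predict_7day(cumulative_cases):
    """Project the average daily increase over the last week
    forward for seven days (naive linear extrapolation)."""
    daily = [b - a for a, b in zip(cumulative_cases, cumulative_cases[1:])]
    avg_increase = sum(daily[-7:]) / 7
    last = cumulative_cases[-1]
    return [round(last + avg_increase * d) for d in range(1, 8)]

def mean_pct_error(predicted, actual):
    """Signed mean percentage deviation of predictions from actuals."""
    return sum((p - a) / a for p, a in zip(predicted, actual)) / len(actual) * 100

# Eight days of cumulative counts rising by 10/day:
print(predict_7day([0, 10, 20, 30, 40, 50, 60, 70]))
# -> [80, 90, 100, 110, 120, 130, 140]

# The abstract's state-level count follows the same arithmetic:
assert 50 * 3 * 7 * 2 == 2100  # states x models x days x outcomes
```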



2021 ◽  
Author(s):  
Palash Sharma ◽  
Robert N. Montgomery ◽  
Rasinio S. Graves ◽  
Kayla Meyer ◽  
Suzanne L Hunt ◽  
...  

Abstract Background: The University of Kansas Alzheimer’s Disease Center (KU ADC) maintains several large databases to track participant recruitment and enrollment and to capture various research-related activities. It is challenging to manage and coordinate all these activities. One of the crucial tasks involves capturing the data and maintaining high data quality, which ensures data reusability and reproducibility. Methods: To effectively manage the cohort, the KU ADC utilizes a combination of an open-source Electronic Data Capture (EDC) system (i.e., REDCap) along with other homegrown data management and visualization systems developed using RStudio and Shiny. Results: In this manuscript, we describe the method and utility of the user-friendly dashboard that was developed for the rapid reporting of dementia evaluations along with data visualization, which allows clinical researchers to summarize recruitment metrics, automatically generate letters to both participants and health care providers, and depict other key metrics, ultimately helping optimize workflows. Conclusions: We believe this general framework would be beneficial to any institution capturing and maintaining similar longitudinal databases for reporting and summarizing key metrics pertaining to their research.



2021 ◽  
Vol 3 ◽  
Author(s):  
Mostafa Kamal Mallick ◽  
Sarah Biser ◽  
Aathira Haridas ◽  
Vaishnavi Umesh ◽  
Olaf Tönsing ◽  
...  

The world of healthcare constantly aims to improve people’s lives while nurturing their health and comfort. Digital health and wearable technologies aim to make this possible. However, numerous factors need to be addressed, such as aging, disabilities, and health hazards. These factors are intensified in palliative care (PC) patients, and limited hospital capacities make it challenging for health care providers (HCP) to handle the crisis. One of the most common symptoms reported by PC patients with severe conditions is dyspnoea. Monitoring devices with sufficient comfort could improve symptom control of patients with dyspnoea in PC. In this article, we discuss a proof-of-concept study to investigate a smart patch (SP) that (a) monitors the pulmonary parameters breathing rate (BR) and inspiration-to-expiration ratio (I:E), (b) monitors the distress markers heart rate (HR) and heart rate variability (HRV), and (c) transmits real-time data securely to an adaptable user interface, primarily geared toward palliative HCP but scalable to specific needs. The concept is verified by measuring and analyzing physiological signals from different electrode positions on the chest and comparing the results with those of the gold-standard Task Force Monitor (TFM).
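HRV from a patch like this is commonly summarized with time-domain statistics computed from the RR intervals between heartbeats. A minimal sketch follows (RMSSD is a standard HRV measure; this is not the study's actual processing pipeline, and the function names are assumptions):

```python
import math

def rmssd(rr_intervals_ms):
    """Root mean square of successive RR-interval differences (ms),
    a common time-domain HRV measure."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

def heart_rate_bpm(rr_intervals_ms):
    """Mean heart rate in beats per minute from RR intervals in ms."""
    mean_rr = sum(rr_intervals_ms) / len(rr_intervals_ms)
    return 60000 / mean_rr

# RR intervals of exactly 1000 ms give 60 bpm and zero variability:
print(heart_rate_bpm([1000, 1000, 1000]))  # 60.0
print(rmssd([1000, 1000, 1000]))           # 0.0
```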



2018 ◽  
Vol 09 (01) ◽  
pp. 205-220 ◽  
Author(s):  
Steven Lane ◽  
Holly Miller ◽  
Elizabeth Ames ◽  
Lawrence Garber ◽  
David Kibbe ◽  
...  

Background Secure clinical messaging and document exchange utilizing the Direct Protocol (Direct interoperability) has been widely implemented in health information technology (HIT) applications including electronic health records (EHRs) and by health care providers and organizations in the United States. While Direct interoperability has allowed clinicians and institutions to satisfy regulatory requirements and has facilitated communication and electronic data exchange as patients transition across care environments, feature and function enhancements to HIT implementations of the Direct Protocol are required to optimize the use of this technology. Objective To describe and address this gap, we developed a prioritized list of recommended features and functions desired by clinicians to utilize Direct interoperability for improved quality, safety, and efficiency of patient care. This consensus statement is intended to inform policy makers and HIT vendors to encourage further development and implementation of system capabilities to improve clinical care. Methods An ad hoc group of interested clinicians came together under the auspices of DirectTrust to address challenges of usability and create a consensus recommendation. This group drafted a list of desired features and functions that was published online. Comments were solicited from interested parties including clinicians, EHR and other HIT vendors, and trade organizations. Resultant comments were collected, reviewed by the authors, and incorporated into the final recommendations. Results This consensus statement contains a list of 57 clinically desirable features and functions categorized and prioritized for support by policy makers, development by HIT vendors, and implementation and use by clinicians. 
Conclusion Fully featured, standardized implementation of Direct interoperability will allow clinicians to utilize Direct messaging more effectively as a component of HIT and EHR interoperability to improve care transitions and coordination.



2010 ◽  
Vol 24 (1) ◽  
pp. 20-25 ◽  
Author(s):  
Desmond Leddin ◽  
Ronald J Bridges ◽  
David G Morgan ◽  
Carlo Fallone ◽  
Craig Render ◽  
...  

BACKGROUND: Assessment of current wait times for specialist health services in Canada is a key method that can assist government and health care providers to plan wisely for future health needs. These data are not readily available. A method to capture wait time data at the time of consultation or procedure has been developed, which should be applicable to other specialist groups and also allows for assessment of wait time trends over intervals of years. METHODS: In November 2008, gastroenterologists across Canada were asked to complete a questionnaire (online or by fax) that included personal demographics and data from one week on at least five consecutive new consultations and five consecutive procedure patients who had not previously undergone a procedure for the same indication. Wait times were collected for 18 primary indications, and results were then compared with similar survey data collected in 2005. RESULTS: The longest wait times observed were for screening colonoscopy (201 days) and surveillance of previous colon cancer or polyps (272 days). The shortest wait times were for likely cancer based on imaging or physical examination (82 days), severe or rapidly progressing dysphagia or odynophagia (83 days), documented iron-deficiency anemia (90 days) and dyspepsia with alarm symptoms (99 days). Compared with 2005 data, total wait times in 2008 were longer overall (127 days versus 155 days; P<0.05) and for most of the seven individual indications that permitted data comparison. CONCLUSION: Median wait times for gastroenterology services continue to exceed consensus conference-recommended targets and have worsened significantly since 2005.
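Per-indication median wait times like those reported can be computed directly from pooled survey records. A minimal sketch (the function name and record layout are hypothetical, not the study's analysis code):

```python
from statistics import median

def median_waits(records):
    """Median wait in days per indication, from (indication, wait_days) pairs."""
    by_indication = {}
    for indication, days in records:
        by_indication.setdefault(indication, []).append(days)
    return {ind: median(waits) for ind, waits in by_indication.items()}

survey = [("screening colonoscopy", 150),
          ("screening colonoscopy", 201),
          ("screening colonoscopy", 250),
          ("iron-deficiency anemia", 90)]
print(median_waits(survey))
# {'screening colonoscopy': 201, 'iron-deficiency anemia': 90}
```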



Author(s):  
Karamo Kanagi ◽  
Cooper Cheng-Yuan Ku ◽  
Li-Kai Lin ◽  
Wen-Huai Hsieh

Abstract Background While electronic health records have been collected for many years in Taiwan, their interoperability across different health care providers has not yet been entirely achieved. The exchange of clinical data is still inefficient and time consuming. Objectives This study proposes an efficient patient-centric framework based on blockchain technology that makes clinical data accessible to patients and enables transparent, traceable, secure, and effective data sharing between physicians and other health care providers. Methods Health care experts were interviewed for the study, and medical data were collected in collaboration with the Ministry of Health and Welfare (MOHW) Chang-Hua Hospital. The proposed framework was designed based on the detailed analysis of this information. The framework includes smart contracts in an Ethereum-based permissioned blockchain to secure and facilitate clinical data exchange among different parties such as hospitals, clinics, patients, and other stakeholders. In addition, the framework employs the Logical Observation Identifiers Names and Codes (LOINC) standard to ensure the interoperability and reuse of clinical data. Results A prototype of the proposed framework was deployed in Chang-Hua Hospital to demonstrate the sharing of health examination reports with many other clinics in suburban areas. The framework was found to reduce the average access time to patient health reports from the existing next-day service to a few seconds. Conclusion The proposed framework can be adopted to achieve health record sharing among health care providers with higher efficiency and protected privacy compared with the current client–server-based system used in Taiwan.
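The core on-chain pattern in frameworks like this is to anchor only a hash of each report plus an access list, while the report itself stays off-chain. A plain-Python sketch of that idea follows (the paper's actual contracts run on an Ethereum-based permissioned chain; this class, its method names, and the sample LOINC-tagged report are illustrative assumptions):

```python
import hashlib
import json

class RecordRegistry:
    """Toy stand-in for the ledger: stores only report hashes and
    access lists, never the clinical data itself."""

    def __init__(self):
        self.chain = []  # append-only list standing in for the blockchain

    def anchor(self, patient_id, report, granted_to):
        """Record the report's hash and who may verify/access it."""
        digest = hashlib.sha256(
            json.dumps(report, sort_keys=True).encode()).hexdigest()
        self.chain.append({"patient": patient_id, "hash": digest,
                           "access": set(granted_to)})
        return digest

    def verify(self, requester, patient_id, report):
        """Check that an off-chain report matches an anchored hash
        and that the requester was granted access."""
        digest = hashlib.sha256(
            json.dumps(report, sort_keys=True).encode()).hexdigest()
        return any(e["patient"] == patient_id and e["hash"] == digest
                   and requester in e["access"] for e in self.chain)

registry = RecordRegistry()
report = {"loinc": "2345-7", "value": 5.4}  # hypothetical glucose result
registry.anchor("patient-1", report, ["clinic-A"])
print(registry.verify("clinic-A", "patient-1", report))  # True
print(registry.verify("clinic-B", "patient-1", report))  # False
```

Because only the hash is on-chain, a receiving clinic can confirm a report is authentic and unmodified in seconds, which is consistent with the access-time reduction the study reports.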



2021 ◽  
Vol 21 (1) ◽  
Author(s):  
David Ebbevi ◽  
Henna Hasson ◽  
Knut Lönnroth ◽  
Hanna Augustsson

Abstract Background Access to health care is an essential health policy issue. In several countries, waiting time guarantees mandate set time limits for assessment and treatment. High-quality waiting time data are necessary to evaluate and improve waiting times. This study’s aim was to investigate health care providers’ and administrative management professionals’ perceptions of the validity and usefulness of waiting time reporting in specialist care. Methods Semi-structured interviews (n = 28) were conducted with administrative management and care professionals (line managers and care providers) in specialized clinics in the Stockholm Region, Sweden. Clinic-specific data from the waiting time registry was used in the care provider interviews to assess face validity. Clinics were purposefully sampled for maximum variation in complexity of care, volume of production, geographical location, private or public ownership, and local waiting times. Thematic analysis was used. Results The waiting time registry was perceived to have low validity and usefulness. Perceived validity and usefulness were interconnected, with mechanisms that reinforced the connection. Structural and cognitive barriers to validity included technical and procedural errors, errors caused by role division, misinterpretation of guidelines, diverging interpretations of nonregulated cases and extensive willful manipulation of data. Conclusions We identify four misconceptions underpinning the current waiting time reporting system: passive dissemination of guidelines is sufficient as implemented, cognitive load of care providers to report waiting times is negligible, soft-law regulation and presentation of outcome data is sufficient to drive improvement, and self-reported data linked to incentives poses a low risk of data corruption.
To counter low validity and usefulness, we propose the following for policy makers and administrative management when developing and implementing waiting time monitoring: communicate guidelines with instructions for operationalization, address barriers to implementation, ensure quality through monitoring of implementation and adherence to guidelines, develop IT ontology together with professionals, avoid parallel measurement infrastructures, ensure waiting times are presented to suit management needs, provide timely waiting time data, enable the study of single cases, minimize manual data entry, and perform spot-checks or external validity checks. Several of these strategies should be transferable to waiting time monitoring in other contexts.


