Visualizing the quality of partially accruing data for use in decision making

2015 ◽  
Vol 7 (3) ◽  
Author(s):  
Julia Eaton ◽  
Ian Painter ◽  
Donald Olson ◽  
William Lober

Secondary use of clinical health data for near real-time public health surveillance presents challenges surrounding its utility due to data quality issues. Data used for real-time surveillance must be timely, accurate and complete if they are to be useful; if incomplete data are used for surveillance, understanding the structure of the incompleteness is necessary. Such data are commonly aggregated due to privacy concerns. The Distribute project was a near real-time influenza-like-illness (ILI) surveillance system that relied on aggregated secondary clinical health data. The goal of this work is to disseminate the data quality tools developed to gain insight into the data quality problems associated with these data. These tools, created through the end-user-as-developer paradigm, apply in general to any system where aggregate data are accrued over time. Each tool was developed during exploratory analysis to gain insight into structural aspects of data quality. Our key finding is that the data quality of partially accruing data must be studied in the context of accrual lag: the difference between the time an event occurs and the time data for that event are received, i.e. the time at which the data become available to the surveillance system. Our visualization methods therefore revolve around visualizing the dimensions of data quality affected by accrual lag, in particular the tradeoff between timeliness and completeness, and the effects of accrual lag on accuracy. Accounting for accrual lag in partially accruing data is necessary to avoid misleading or biased conclusions about trends in indicator values and data quality.
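The accrual-lag framing lends itself to a simple completeness-by-lag summary. The sketch below (Python/pandas, illustrative only; the column names event_date, received_date, and count are assumptions rather than the Distribute schema) computes, for each event date, the fraction of the eventually received count that had accrued within k days, which is the raw material for visualizing the timeliness/completeness tradeoff.

```python
import pandas as pd

def completeness_by_lag(df, max_lag_days=14):
    """Fraction of each event date's final count received within k days, k = 0..max_lag_days."""
    df = df.copy()
    df["lag"] = (df["received_date"] - df["event_date"]).dt.days
    final = df.groupby("event_date")["count"].sum()   # counts eventually received per event date
    curves = []
    for k in range(max_lag_days + 1):
        received = (df[df["lag"] <= k]
                    .groupby("event_date")["count"].sum()
                    .reindex(final.index, fill_value=0))
        curves.append(received / final)               # completeness at accrual lag k
    # After transposing: rows are event dates, columns are accrual lags in days.
    return pd.DataFrame(curves, index=range(max_lag_days + 1)).T
```

Plotting one curve per event date (or a heatmap of the returned frame) shows how quickly each day's counts complete, and how far back in time the data can be trusted at a given lag.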

2009 ◽  
Vol 6 (3) ◽  
pp. 515-524 ◽  
Author(s):  
Natasa Maksic ◽  
Petar Knezevic ◽  
Marija Antic ◽  
Aleksandra Smiljanic

The routing algorithm with load balancing presented in [1] is a modification of the OSPF protocol that enables optimization for higher network throughput. With load-balanced routing, packets belonging to the same stream use different paths through the network. This paper analyzes the influence of the resulting differences in packet propagation times on the quality of real-time data transmission. The proposed algorithm was implemented, and a simulation network was set up to measure the jitter.
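As a concrete illustration of the jitter measurement, the sketch below computes RFC 3550-style smoothed interarrival jitter from per-packet send and arrival timestamps. It is a generic example under assumed inputs, not the measurement code used in the paper's simulation.

```python
def interarrival_jitter(send_times, arrival_times):
    """RFC 3550 smoothed interarrival jitter, given per-packet timestamps in seconds."""
    jitter = 0.0
    prev_transit = None
    for sent, arrived in zip(send_times, arrival_times):
        transit = arrived - sent                 # one-way transit time of this packet
        if prev_transit is not None:
            d = abs(transit - prev_transit)      # transit-time variation vs. previous packet
            jitter += (d - jitter) / 16.0        # exponential smoothing per RFC 3550
        prev_transit = transit
    return jitter
```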


2018 ◽  
Author(s):  
Robab Abdolkhani ◽  
Kathleen Gray ◽  
Ann Borda ◽  
Ruth De Souza

BACKGROUND The proliferation of advanced wearable medical technologies is increasing the production of Patient-Generated Health Data (PGHD). However, there is a lack of evidence on whether data generated by wearables are of sufficient quality to be used effectively in patient care. For PGHD to inform decision making by health care providers, it must be of high quality, that is, it must comply with standards defined by health care organizations and be accurate, consistent, complete and unbiased. Although medical wearables record highly accurate measurements, other technology issues as well as human factors affect PGHD quality as the data are collected and shared under patients' control before being used by health care providers. OBJECTIVE This paper explores the human and technology factors that affect the quality of PGHD from medical wearables and its effective use in clinical care. METHODS We conducted semi-structured interviews with 17 PGHD stakeholders in Australia, the US, and the UK. Participants included ten health care providers working with PGHD from medical wearables in diabetes, sleep disorders, and heart arrhythmia, five health IT managers, and two executives. Participants were interviewed about seven data quality dimensions: accuracy, accessibility, coherence, institutional environment, interpretability, relevancy, and timeliness. Open coding of the interview data identified several technology and human issues related to these dimensions in the clinical use of PGHD. RESULTS The overarching technology issues mentioned by participants included a lack of advanced functionality, such as real-time alerts for patients, and complicated device settings that can result in errors. Regarding PGHD coherence, different wearables capture data for the same health condition in different ways, producing different formats that make PGHD difficult to interpret and compare. Another technology issue, related to the current ICT infrastructure of health care settings, is that health care providers cannot access PGHD in real time, which reduces its value. Providers also raised concerns about where PGHD is stored and who truly owns the data, both of which affect the feasibility of PGHD access. The human factors included limited digital health literacy among patients, which shapes both patients' motivation and their behavior in collecting PGHD. For example, gaps in the recorded data indicate periods during which the wearable was not used. Participants also identified the cost of devices as a barrier to long-term engagement with and use of wearables. CONCLUSIONS Using PGHD garnered from medical wearables in clinical contexts is problematic because data quality is undermined by technology and human factors. At present, no guidelines have been defined for assessing PGHD quality. Hence, new solutions are needed to overcome the existing technology- and human-related barriers and enhance PGHD quality.


2015 ◽  
Vol 2015 ◽  
pp. 1-14 ◽  
Author(s):  
Woochul Kang ◽  
Jaeyong Chung

With the ubiquitous deployment of sensors and network connectivity, the amount of real-time data handled by embedded systems is increasing rapidly, and many embedded systems require database capability for the systematic management of such data. In these systems, supporting the timeliness of tasks that access the database is an important problem. However, recent multicore-based embedded architectures pose a significant challenge for such data-intensive real-time tasks, since the response time of data accesses can be significantly affected by potential inter-core interference. In this paper, we propose a novel feedback control scheme that supports the timeliness of data-intensive tasks against unpredictable inter-core interference. In particular, we use a multiple-input/multiple-output (MIMO) control method that exploits multiple control knobs, for example, CPU frequency and Quality-of-Data (QoD), to handle highly unpredictable workloads in multicore systems. Experimental results from an actual implementation show that the proposed approach achieves the target Quality-of-Service (QoS) goals, such as task timeliness and QoD, while consuming less energy than baseline approaches.
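To make the control idea concrete, here is a deliberately simplified proportional sketch of such a feedback loop: the controller periodically observes the deadline miss ratio and actuates the two knobs named above, CPU frequency and Quality-of-Data. The gains, bounds, and set point are assumptions; the paper itself uses a tuned MIMO controller rather than this toy single-output variant.

```python
class TimelinessController:
    """Toy proportional controller over two knobs: CPU frequency and QoD."""
    def __init__(self, miss_target=0.05, k_freq=0.4, k_qod=0.4):
        self.miss_target = miss_target           # desired deadline miss ratio (assumed)
        self.k_freq, self.k_qod = k_freq, k_qod  # proportional gains (assumed)
        self.freq = 0.6                          # normalized CPU frequency in [0.2, 1.0]
        self.qod = 1.0                           # normalized Quality-of-Data in [0.2, 1.0]

    def update(self, miss_ratio):
        """Called once per sampling period with the measured deadline miss ratio."""
        error = miss_ratio - self.miss_target    # > 0: deadlines are being missed
        # Spend both knobs on the error: raise frequency (costs energy) and
        # degrade QoD (sheds data-update load); reverse both when error < 0.
        self.freq = min(1.0, max(0.2, self.freq + self.k_freq * error))
        self.qod = min(1.0, max(0.2, self.qod - self.k_qod * error))
        return self.freq, self.qod
```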


2009 ◽  
Vol 26 (3) ◽  
pp. 556-569 ◽  
Author(s):  
Ananda Pascual ◽  
Christine Boone ◽  
Gilles Larnicol ◽  
Pierre-Yves Le Traon

Abstract The timeliness of satellite altimeter measurements has a significant effect on their value for operational oceanography. In this paper, an Observing System Experiment (OSE) approach is used to assess the quality of real-time altimeter products, a key issue for robust monitoring and forecasting of the ocean state. In addition, the effect of two improved geophysical corrections and of the number of missions combined in the altimeter products is also analyzed. The improved tidal and atmospheric corrections have a significant effect in coastal areas (0–100 km from the shore), and a comparison with tide gauge observations shows slightly better agreement with the gridded delayed-time sea level anomalies (SLAs) from two altimeters [Jason-1 and European Remote Sensing Satellite-2 (ERS-2)/Envisat] using the new geophysical corrections (mean square differences in percent of tide gauge variance of 35.3%) than with those from four missions [Jason-1, ERS-2/Envisat, Ocean Topography Experiment (TOPEX)/Poseidon interlaced, and Geosat Follow-On] using the old corrections (36.7%). In the deep ocean, however, the correction improvements have little influence. The performance of fast-delivery products versus delayed-time data is compared using independent in situ data (tide gauge and drifter data). The comparison clearly highlights the degradation of real-time SLA maps relative to delayed-time SLA maps: four altimeters are needed in real time to obtain quality comparable to that of two altimeters in delayed time (sea level error misfit around 36%, and zonal and meridional velocity estimation errors of 27% and 33%, respectively). This study proves that the continuous improvement of geophysical corrections is very important, and that it is essential to stay above a minimum threshold of four available altimetric missions to capture the main space and time oceanic scales in fast-delivery products.
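For reference, the comparison statistic quoted above (the mean square difference between altimeter and tide gauge sea level anomalies, expressed as a percentage of the tide gauge variance) can be written as a one-liner. The sketch below is illustrative only; co-located, contemporaneous SLA series are assumed as inputs.

```python
import numpy as np

def misfit_percentage(sla_altimeter, sla_tide_gauge):
    """Mean square difference between the two SLA series, in percent of tide gauge variance."""
    diff = np.asarray(sla_altimeter) - np.asarray(sla_tide_gauge)
    return 100.0 * np.mean(diff ** 2) / np.var(sla_tide_gauge)
```

A value near 35% would correspond to the two-altimeter delayed-time maps with the improved corrections reported above.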


2021 ◽  
Author(s):  
S. H. Al Gharbi ◽  
A. A. Al-Majed ◽  
A. Abdulraheem ◽  
S. Patil ◽  
S. M. Elkatatny

Abstract Due to the high demand for energy, oil and gas companies have started to drill wells in remote areas and unconventional environments. This has raised the complexity of drilling operations, which were already challenging and complex. To adapt, drilling companies expanded their use of the real-time operation center (RTOC) concept, in which real-time drilling data are transmitted from remote sites to companies' headquarters. In the RTOC, groups of subject matter experts monitor the drilling live and provide real-time advice to improve operations. With the increase in drilling operations, processing the volume of generated data is beyond human capability, limiting the RTOC's impact on certain components of drilling operations. To overcome this limitation, artificial intelligence and machine learning (AI/ML) technologies were introduced to monitor and analyze the real-time drilling data, discover hidden patterns, and provide fast decision-support responses. AI/ML technologies are data driven, and the quality of their output depends on the quality of the input data: if the input data are good, the generated output will be good; if not, it will be poor. Unfortunately, due to the harsh environments of drilling sites and the transmission setups, not all of the drilling data are good, which negatively affects the AI/ML results. The objective of this paper is to utilize AI/ML technologies to improve the quality of real-time drilling data. The paper fed a large real-time drilling dataset, consisting of over 150,000 raw data points, into Artificial Neural Network (ANN), Support Vector Machine (SVM) and Decision Tree (DT) models. The models were trained on data points labeled as valid or not valid. Confusion matrices were used to evaluate the different AI/ML models, including different internal architectures. Despite being the slowest, the ANN achieved the best result, with an accuracy of 78%, compared to 73% and 41% for DT and SVM, respectively. The paper concludes by presenting a process for using AI technology to improve real-time drilling data quality. To the authors' knowledge, based on literature in the public domain, this paper is one of the first to compare the use of multiple AI/ML techniques for quality improvement of real-time drilling data. The paper provides a guide for improving the quality of real-time drilling data.
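The model comparison described here follows a standard supervised-classification workflow. The sketch below (scikit-learn, illustrative only; the feature matrix X and the valid/not-valid labels y are assumptions about the dataset layout, and the hyperparameters are not the paper's) trains ANN, SVM, and Decision Tree classifiers and evaluates each with a confusion matrix and accuracy.

```python
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import confusion_matrix, accuracy_score

def compare_models(X, y):
    """Train ANN, SVM, and DT on valid/not-valid labels and report accuracy + confusion matrix."""
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    models = {
        "ANN": MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500),
        "SVM": SVC(kernel="rbf"),
        "DT": DecisionTreeClassifier(max_depth=8),
    }
    results = {}
    for name, model in models.items():
        model.fit(X_tr, y_tr)
        y_pred = model.predict(X_te)
        results[name] = (accuracy_score(y_te, y_pred),
                         confusion_matrix(y_te, y_pred))
    return results
```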


Author(s):  
Manjunath Ramachandra ◽  
Vikas Jain

Present-day Internet traffic largely consists of multimedia traffic, opening up new and previously unthinkable applications such as tele-surgery. The complexity of data transactions increases with the demand for in-time and real-time data transfers, stretching the limited resources of the network beyond their capabilities. This requires prioritization of data transfers, controlled dumping of data onto the network, and similar measures. To make matters worse, data from different origins combine and impart long-lasting detrimental features, such as self-similarity and long-range dependence, to the traffic. Fortunately, multimedia data contain redundancies that may be removed through efficient compression techniques, and the compression level or bitrate can be controlled based on the availability of resources in the network. The traffic controller, or shaper, has to optimize the quality of the transferred multimedia data depending on the state of the network. In this chapter, a novel traffic shaper is introduced that takes the adverse properties of the network into account and counteracts them.
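As a minimal sketch of the rate-control idea, a shaper can pick the highest encoding bitrate that fits the currently available capacity. The rate ladder and the bandwidth estimate below are assumptions for illustration, not the chapter's shaper.

```python
def select_bitrate(available_kbps, ladder=(4000, 2500, 1200, 600, 300)):
    """Return the highest encoding rate (kbps) not exceeding the available bandwidth."""
    for rate in ladder:                # ladder is ordered from highest to lowest
        if rate <= available_kbps:
            return rate
    return ladder[-1]                  # fall back to the lowest rate under severe congestion
```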


2020 ◽  
Vol 44 (5) ◽  
pp. 677
Author(s):  
Rebekah Eden ◽  
Andrew Burton-Jones ◽  
James Grant ◽  
Renea Collins ◽  
Andrew Staib ◽  
...  

Objective This study aims to assist hospitals contemplating digital transformation by assessing the reported qualitative effects of rapidly implementing an integrated eHealth system in a large Australian hospital and determining whether the existing literature offers a reliable framework for assessing the effects of digitisation. Methods A qualitative, single-site case study was performed using semistructured interviews supplemented by focus groups, observations and documentation. In all, 92 individuals across medical, nursing, allied health, administrative and executive roles provided insights into the eHealth system, which consisted of an electronic medical record, computerised decision support, computerised physician order entry, ePrescribing systems and wireless device integration. The results were compared against a known framework of the effects of hospital digitisation. Results Diverse, mostly positive, effects were reported, largely consistent with the existing literature. Several new effects not previously reported in the literature were identified, namely: (1) improvements in accountability for care, individual career development and time management; (2) mixed findings for the availability of real-time data; and (3) positive findings for the secondary use of data. Conclusions The overall positive perceptions of the effects of digitisation should give confidence to health services contemplating rapid digital transformation. Although the existing literature provides a reliable framework for impact assessment, new effects are still emerging, and research and practice need to shift towards understanding how clinicians and hospitals can maximise the benefits of digital transformation. What is known about the topic? Hospitals outside the US are increasingly becoming engaged in eHealth transformations. Yet the reported effects of these technologies are diverse and mixed, with qualitative effects rarely reported. What does this paper add? This study provides a qualitative assessment of the effects of an eHealth transformation at a large Australian tertiary hospital. The results provide renewed confidence in the literature because the findings are largely consistent with expectations from prior systematic reviews of impacts. The qualitative approach followed also resulted in the identification of new effects, which included improvements in accountability, time management and individual development, as well as mixed results for real-time data. In addition, substantial improvements in patient outcomes and clinician productivity were reported from the secondary use of data within the eHealth systems. What are the implications for practitioners? The overall positive findings in this large case study should give confidence to other health services contemplating rapid digital transformation. To achieve substantial benefits, hospitals need to understand how they can best leverage the data within these systems to improve the quality and efficiency of patient care. As such, both research and practice need to shift towards understanding how these systems can be used more effectively.

