Semantic Enrichment of Streaming Healthcare Data

Author(s):  
Daniel Cotter ◽  
V. K. Cody Bumgardner

In the past decade, the healthcare industry has made significant advances in the digitization of patient information. However, a lack of interoperability among healthcare systems still imposes a high cost to patients, hospitals, and insurers. Currently, most systems pass messages using idiosyncratic messaging standards that require specialized knowledge to interpret. This increases the cost of systems integration and often puts more advanced uses of data out of reach. In this project, we demonstrate how two open standards, FHIR and RDF, can be combined both to integrate data from disparate sources in real time and make that data queryable and susceptible to automated inference. To validate the effectiveness of the semantic engine, we perform simulations of real-time data feeds and demonstrate how they can be combined and used by client-side applications with no knowledge of the underlying sources.
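The core idea of combining FHIR and RDF can be illustrated in miniature: a FHIR resource arriving as JSON is flattened into subject-predicate-object triples, after which clients can pattern-match over the triples without knowing the originating feed. The sketch below uses plain Python tuples in place of a real triple store, and the predicate names (`fhir:code`, `fhir:value`, etc.) are illustrative assumptions, not the paper's actual vocabulary.

```python
import json

# A FHIR Observation as it might arrive on a streaming feed (hypothetical data).
observation = json.loads("""{
  "resourceType": "Observation",
  "id": "hr-001",
  "code": {"text": "Heart rate"},
  "valueQuantity": {"value": 72, "unit": "beats/min"}
}""")

# Flatten the resource into RDF-style triples.
subject = f"urn:fhir:Observation/{observation['id']}"
triples = {
    (subject, "rdf:type", "fhir:Observation"),
    (subject, "fhir:code", observation["code"]["text"]),
    (subject, "fhir:value", observation["valueQuantity"]["value"]),
    (subject, "fhir:unit", observation["valueQuantity"]["unit"]),
}

def match(triples, s=None, p=None, o=None):
    """Return triples matching the given pattern; None acts as a wildcard."""
    return [t for t in triples
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)]

# A client can ask for every Observation's value without knowing the source feed.
for s, _, _ in match(triples, p="rdf:type", o="fhir:Observation"):
    print(s, match(triples, s=s, p="fhir:value")[0][2])
```

In a production setting the same pattern is what a triple store plus SPARQL provides; triples from many feeds merge into one graph, which is what makes the data both queryable and amenable to inference.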

2020 ◽  
Vol 10 (24) ◽  
pp. 9154
Author(s):  
Paula Morella ◽  
María Pilar Lambán ◽  
Jesús Royo ◽  
Juan Carlos Sánchez ◽  
Jaime Latapia

The purpose of this work is to develop a new Key Performance Indicator (KPI) that quantifies the cost of the Six Big Losses developed by Nakajima and to implement it in a Cyber Physical System (CPS), achieving real-time monitoring of the KPI. This paper follows the methodology explained below. A cost model has been used, together with the Six Big Losses description, to develop this indicator accurately. At the same time, the machine tool has been integrated into a CPS using Industry 4.0 technologies, enhancing real-time data acquisition. Once the KPI has been defined, we have developed software (in Python) that turns these real-time data into relevant information through the calculation of our indicator. Finally, we have carried out a case study showing the results of our new KPI and comparing them to other indicators related to the Six Big Losses but in different dimensions. As a result, our research quantifies the Six Big Losses economically, enhances the detection of the largest losses so that they can be improved, and highlights the importance of paying attention to several dimensions at once, mainly the productive, sustainability, and economic dimensions.
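The shape of such a cost-based indicator can be sketched in a few lines: lost time per loss category (as streamed from the CPS) is multiplied by an hourly machine cost rate to yield a cost per loss and a total. The rate and the lost-hours figures below are illustrative assumptions, not the paper's cost model.

```python
# Hypothetical hourly machine cost rate (EUR/hour).
HOURLY_COST_RATE = 85.0

# Lost hours per category over one shift, following Nakajima's Six Big Losses.
six_big_losses = {
    "breakdowns": 1.2,
    "setup_and_adjustments": 0.8,
    "minor_stoppages": 0.5,
    "reduced_speed": 0.9,
    "defects_and_rework": 0.3,
    "startup_losses": 0.2,
}

def loss_cost_kpi(losses, rate):
    """Return (cost per loss category, total cost of all six losses)."""
    costs = {name: hours * rate for name, hours in losses.items()}
    return costs, sum(costs.values())

costs, total = loss_cost_kpi(six_big_losses, HOURLY_COST_RATE)
# Ranking the categories by cost makes the largest losses easy to spot.
for name, cost in sorted(costs.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {cost:.2f} EUR")
print(f"total loss cost: {total:.2f} EUR")
```

Recomputing this on every batch of timing data from the machine tool is what turns the indicator into a real-time KPI rather than an end-of-month report.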


2010 ◽  
Vol 31 (1) ◽  
pp. 62-81 ◽  
Author(s):  
Hazel Richards
Keyword(s):  
The Past ◽  

In this paper I analyse variation in the use of past tense be in data from Morley, a suburb of Leeds, in the North of England, using both real-time and apparent-time data. Rather than concentrating on the traditional aspects of this variable, namely alternation between was and were, I identify four phonetic variants of the past tense be system. I propose that the community under consideration are adopting intermediate variants that, both in terms of perception and production, lie between the standard (British) realisations of was [wɒz] and were [wɜː]. A reallocation process has occurred between these two intermediate forms, along the lines of polarity. The inclusion of the intermediate forms of past tense be enables us to perceive previously unobserved patterns of variation with regard to this variable.


Author(s):  
Prof. Sushma Laxman Wakchaure ◽  
Shinde Bipin Balu ◽  
Bhabad Vasant M. ◽  
Dnyandev S. Musale ◽  
Supriya S. Burhade

Blockchain is a system of recording information in a way that makes it difficult or impossible to change, hack, or cheat the system. You have probably heard of Blockchain in the past few years, mostly in the context of cryptocurrency. However, Blockchain has grown to have several different applications. A significant feature of Blockchain is that it is never under the complete control of a single entity, because it is entirely consensus-driven. Data stored in a Blockchain cannot be altered, a property widely used for sharing medical data in the healthcare industry: because of the security that Blockchain provides, this data can be shared among parties seamlessly. Another application of Blockchain is in maintaining the integrity of payment systems, since Blockchain-based payment systems are highly resistant to external attacks and theft. Blockchain can also be used to track the status of products in a supply chain in real time.
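The tamper-evidence property described above comes from each block storing the hash of its predecessor, so altering any stored record breaks every later link. The toy chain below is purely illustrative; a real blockchain adds distributed consensus, signatures, and proof mechanisms on top of this hash-linking.

```python
import hashlib
import json

def make_block(data, prev_hash):
    """Create a block whose hash commits to both its data and its predecessor."""
    block = {"data": data, "prev_hash": prev_hash}
    block["hash"] = hashlib.sha256(
        json.dumps({"data": data, "prev_hash": prev_hash},
                   sort_keys=True).encode()).hexdigest()
    return block

def chain_is_valid(chain):
    """Recompute every hash and check each block links to the one before it."""
    for prev, block in zip(chain, chain[1:]):
        recomputed = hashlib.sha256(
            json.dumps({"data": block["data"], "prev_hash": block["prev_hash"]},
                       sort_keys=True).encode()).hexdigest()
        if block["prev_hash"] != prev["hash"] or block["hash"] != recomputed:
            return False
    return True

chain = [make_block("genesis", "0")]
chain.append(make_block("patient record A shared with hospital B", chain[-1]["hash"]))
chain.append(make_block("payment settled", chain[-1]["hash"]))
print(chain_is_valid(chain))   # True: every link checks out

chain[1]["data"] = "patient record A shared with attacker"  # tamper with history
print(chain_is_valid(chain))   # False: the altered block no longer matches its hash
```

Changing one record invalidates its own hash and, transitively, every block after it, which is why retroactive edits are detectable by anyone holding a copy of the chain.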


Symmetry ◽  
2020 ◽  
Vol 12 (8) ◽  
pp. 1307
Author(s):  
Duansen Shangguan ◽  
Liping Chen ◽  
Jianwan Ding

The ever-increasing functional density and complexity of satellite systems, the harsh spaceflight environment, and cost-reduction measures that require less operator involvement are increasingly driving the need to develop new approaches for fault diagnosis and health monitoring (FD-HM). Data-driven FD-HM approaches use signal processing or data mining to obtain implicit information about the operating state of the system; they monitor systems broadly but shallowly and are expected to reduce the workload of operators. However, these approaches to satellite FD-HM are driven primarily by historical data and some static physical data, with little consideration for simulation data, real-time data, or fusion between the two, so they are not fully adequate for the real-time monitoring and maintenance of a satellite in orbit. To ensure the reliable operation of complex satellite systems, this paper presents a new physical–virtual convergence approach, the digital twin, for FD-HM. Moreover, we present an FD-HM application for a satellite power system to demonstrate the effectiveness of the proposed approach.
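A minimal way to picture the physical-virtual convergence is residual-based monitoring: a simulation model (the virtual twin) predicts a telemetry channel, downlinked measurements are compared against the prediction, and a residual above a threshold flags a fault. The constant-voltage model, the threshold, and the samples below are all illustrative assumptions, not the paper's satellite power-system model.

```python
def twin_predict(t):
    """Virtual twin: nominal power-bus voltage (V) at time t (s).
    A regulated 28 V bus, held constant in this toy model."""
    return 28.0

def detect_faults(telemetry, threshold=0.5):
    """Flag samples whose residual |measured - predicted| exceeds threshold."""
    faults = []
    for t, measured in telemetry:
        residual = abs(measured - twin_predict(t))
        if residual > threshold:
            faults.append((t, measured, residual))
    return faults

# Hypothetical downlinked samples: (time, bus voltage); a dip appears at t=30.
telemetry = [(0, 28.1), (10, 27.9), (20, 28.0), (30, 26.8), (40, 28.0)]
print(detect_faults(telemetry))  # flags the dip at t=30
```

The fusion the paper argues for amounts to running the twin alongside the live vehicle, so the prediction at each timestep reflects the commanded state rather than a static nominal value.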


2011 ◽  
Vol 49 (1) ◽  
pp. 72-100 ◽  
Author(s):  
Dean Croushore

In the past ten years, researchers have explored the impact of data revisions in many different contexts. Researchers have examined the properties of data revisions, how structural modeling is affected by data revisions, how data revisions affect forecasting, the impact of data revisions on monetary policy analysis, and the use of real-time data in current analysis. This paper summarizes many of the questions for which real-time data analysis has provided answers. In addition, researchers and institutions have developed better real-time data sets around the world. Still, additional research is needed in key areas and research to date has uncovered even more fruitful areas worth exploring. (JEL C52, C53, C80, E01)


2013 ◽  
Vol 711 ◽  
pp. 629-635 ◽  
Author(s):  
Basem Almadani ◽  
Anas Al-Roubaiey ◽  
Rashad Ahmed

In recent years, there has been growth in the amount of data used to improve production processes. As a result, cross-communication and interaction between components have increased and become a key property of modern production systems. To reduce the cost of communication and to increase the efficiency of these systems, middleware technologies such as CORBA, COM+, Java RMI, and Web services are being used. Middleware is increasingly becoming an essential factor in improving current Manufacturing Process Automation and Control Systems, and its applications have witnessed increased demand in real-time distributed manufacturing applications. In this paper, we propose a publish-subscribe middleware architecture based on the Data Distribution Service (DDS) open standard. The proposed architecture aims to seamlessly integrate existing heterogeneous manufacturing systems, such as SCADA systems, DCSs, PLCs, and databases; to improve communication and real-time data delivery among system components; and to provide a QoS support layer between the communicating components. Furthermore, the proposed architecture is implemented and evaluated through intensive experiments.
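The essence of the topic-based publish-subscribe pattern that DDS standardizes can be sketched in-process: writers publish samples on a named topic, and every subscribed reader (a SCADA display, a historian database, and so on) receives them without the writer knowing who they are. The broker class below is a hypothetical stand-in; real DDS adds peer discovery, network transport, and rich QoS policies rather than a central broker.

```python
from collections import defaultdict

class Broker:
    """Toy topic-based publish-subscribe hub (illustrative, not DDS itself)."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subscribers[topic].append(callback)

    def publish(self, topic, sample):
        # Deliver the sample to every reader of this topic, in subscription order.
        for callback in self._subscribers[topic]:
            callback(sample)

broker = Broker()
received = []

# Two heterogeneous consumers of the same sensor topic, decoupled from the writer.
broker.subscribe("plc/temperature", lambda s: received.append(("scada", s)))
broker.subscribe("plc/temperature", lambda s: received.append(("historian", s)))

# A PLC gateway publishes one sample; both subscribers see it.
broker.publish("plc/temperature", {"value": 74.2, "unit": "C"})
print(received)
```

This decoupling is what lets the architecture integrate heterogeneous systems: each component only agrees on topic names and sample schemas, never on each other's interfaces.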


Author(s):  
Gautam Pal ◽  
Katie Atkinson ◽  
Gangmin Li

This paper presents an approach to analyzing consumers’ e-commerce site usage and browsing motifs through pattern mining and surfing behavior. User-generated clickstream is first stored in a client-side browser. We build an ingestion pipeline to capture the high-velocity data stream from a client-side browser through Apache Storm, Kafka, and Cassandra. Given the consumer’s usage pattern, we uncover the user’s browsing intent through n-grams and collocation methods. An innovative clustering technique is constructed through the Expectation-Maximization algorithm with a Gaussian Mixture Model. We discuss a framework for predicting a user’s clicks based on past click sequences through higher-order Markov chains. We developed our model on top of a big data Lambda Architecture, which combines a high-throughput Hadoop batch setup with a low-latency real-time framework over a large distributed cluster. Based on this approach, we developed an experimental setup for an optimized Storm topology and enhanced Cassandra database latency to achieve real-time responses. The theoretical claims are corroborated with several evaluations in a Microsoft Azure HDInsight Apache Storm deployment and in the Datastax distribution of Cassandra. The paper demonstrates that the proposed techniques help with user experience optimization, building recently-viewed-products lists, market-driven analyses, and allocation of website resources.
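The higher-order Markov chain idea can be sketched directly: the next page is predicted from the last k pages of the clickstream, using counts accumulated from past sessions. The sessions and k=2 below are illustrative; in the paper's pipeline the equivalent counts would be maintained over the Storm/Kafka stream.

```python
from collections import Counter, defaultdict

def train(sessions, k=2):
    """Count next-page frequencies conditioned on the previous k pages."""
    model = defaultdict(Counter)
    for session in sessions:
        for i in range(len(session) - k):
            state = tuple(session[i:i + k])       # last k pages as the state
            model[state][session[i + k]] += 1     # observed next click
    return model

def predict(model, recent, k=2):
    """Most likely next click given the user's last k pages, or None if unseen."""
    counts = model.get(tuple(recent[-k:]))
    return counts.most_common(1)[0][0] if counts else None

# Hypothetical training sessions (sequences of page identifiers).
sessions = [
    ["home", "laptops", "laptop-x", "cart"],
    ["home", "laptops", "laptop-x", "reviews"],
    ["search", "laptops", "laptop-x", "cart"],
]
model = train(sessions)
print(predict(model, ["laptops", "laptop-x"]))  # cart (seen in 2 of 3 sessions)
```

Raising k trades coverage for specificity: longer histories disambiguate intent better but match fewer live sessions, which is one reason such models are backed by the batch layer of a Lambda Architecture.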


2008 ◽  
Vol 203 ◽  
pp. 78-90
Author(s):  
Anthony Garratt ◽  
Kevin Lee ◽  
Shaun Vahey

An overview is provided of the issues raised in the recent literature on the use of real-time data in the context of nowcasting and forecasting UK macroeconomic events. The ideas are illustrated through two specific applications using UK real-time data available over 1961-2006 and providing probability forecasts that could have been produced in real time over the past twenty years. In the first, we consider the reliability of first-release data on the components of UK aggregate demand by looking at forecasts of the probability of substantial data revisions. In the second, we consider the estimation of the output gap, illustrating the uncertainty surrounding its measurement through density forecasts and focusing on its interpretation in terms of inflationary pressure through an event probability forecast.


2020 ◽  
Vol 15 (3) ◽  
pp. 256-266 ◽  
Author(s):  
Daisuke Komori ◽  
Akiyuki Kawasaki ◽  
Nanami Sakai ◽  
Natsumi Shimomura ◽  
Akira Harada ◽  
...  

A massive flood struck the Bago River in Myanmar in July 2018. In this study, because of the limited availability of real-time data, satellite-based precipitation was used to clarify the characteristics of the flood. The total precipitation during the 10 days from July 22, when the flood first began in western Bago city, was estimated at approximately 753 mm and 527 mm at the Bago and Zaungts stations in the Bago River watershed. These values correspond to 355% and 294% of the average 10-day total precipitation at the Bago (1967–2015) and Zaungts (1987–2014) stations. Furthermore, not only the 3-day and weekly peak precipitations but also the cumulative precipitation from July 22 to August 16 was estimated to be larger than the largest precipitation on record at both stations. Although the Zaungts dam stored approximately 140 million m3 during this period, an amount equivalent to 40% of the inflow volume between July 22 and 28, the resulting flood propagated widely through Bago city. Based on the flood survey, the 2018 Bago River flood was classified into four areas: the right bank of the Bago River, the eastern town, the northern town, and the area downstream of the Zaungts Weir and Bago city. These areas were marked as vulnerable areas in Bago city. The Bago River watershed has experienced many floods in the past, and floods on the same scale as this one are expected to occur in the future. Therefore, it is essential to understand the characteristics of the 2018 Bago River flood and to develop near-real-time monitoring of the hydrometeorological situation in preparation for the next flood disaster.


Author(s):  
Alan S. Rudolph ◽  
Ronald R. Price

We have employed cryoelectron microscopy to visualize events that occur during the freeze-drying of artificial membranes, using real-time video capture techniques. Artificial membranes, or liposomes, are spherical structures with an internal aqueous space; they are stabilized by water, which provides the driving force for the spontaneous self-assembly of these structures. Previous assays of the damage to these structures induced by freeze-drying reveal that the two principal deleterious events are 1) fusion of liposomes and 2) leakage of contents trapped within the liposome [1]. In the past, the only way to assess these events was to examine the liposomes after the dehydration event. This technique allows the events to be monitored in real time as the liposomes destabilize and as water is sublimed at cryo temperatures in the vacuum of the microscope. The mechanisms by which liposomes are compromised by freeze-drying are largely unknown. This technique has shown that cryoprotectants such as glycerol and carbohydrates are able to maintain liposomal structure throughout the drying process.

