Design and Implementation of an Informatics Infrastructure for Standardized Data Acquisition, Transfer, Storage, and Export in Psychiatric Clinical Routine: Feasibility Study

10.2196/26681 ◽  
2021 ◽  
Vol 8 (6) ◽  
pp. e26681
Author(s):  
Rogério Blitz ◽  
Michael Storck ◽  
Bernhard T Baune ◽  
Martin Dugas ◽  
Nils Opel

Background Empirically driven personalized diagnostic applications and treatment stratification are widely perceived as a major hallmark in psychiatry. However, data-based personalized decision making requires standardized data acquisition and data access, which are currently absent in psychiatric clinical routine. Objective Here, we describe the informatics infrastructure implemented at the psychiatric Münster University Hospital, which allows standardized acquisition, transfer, storage, and export of clinical data for future real-time predictive modelling in psychiatric routine. Methods We designed and implemented a technical architecture that includes (1) an extension of the electronic health record (EHR) via scalable, standardized data collection, (2) data transfer between EHRs and research databases, allowing EHR and research data to be pooled in a unified database, and (3) technical solutions for the visual presentation of collected data and analysis results in the EHR. The Single-source Metadata ARchitecture Transformation (SMA:T) was used as the software architecture. SMA:T is an extension of the EHR system and uses module-driven software development to generate standardized applications and interfaces. The Operational Data Model (ODM) was used as the data standard. Standardized data were entered on iPads via the Mobile Patient Survey (MoPat) and the web application Mopat@home, and the standardized transmission, processing, display, and export of data were realized via SMA:T. Results The technical feasibility of the informatics infrastructure was demonstrated in the course of this study. We created 19 standardized documentation forms with 241 items. For 317 patients, 6,451 instances were automatically transferred to the EHR system without errors. Moreover, 96,323 instances were automatically transferred from the EHR system to the research database for further analyses. Conclusions In this study, we present the successful implementation of an informatics infrastructure enabling standardized data acquisition and data access for future real-time predictive modelling in clinical routine in psychiatry. The technical solution presented here might guide similar initiatives at other sites and thus help to pave the way toward the future application of predictive models in psychiatric clinical routine.
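To make the described data flow concrete, the following minimal Python sketch shows how a single documentation-form instance could be expressed as a CDISC ODM 1.3-style fragment and pushed to a research database. The OIDs, item names, endpoint URL, and plain HTTP transport are illustrative assumptions for this example and are not taken from the SMA:T implementation.

```python
# Minimal sketch (not the authors' code): assembling a CDISC ODM 1.3-style
# ClinicalData fragment for one documentation-form instance and sending it
# to a hypothetical research-database endpoint. All OIDs, the endpoint URL,
# and the transport (plain HTTP POST) are illustrative assumptions.
import xml.etree.ElementTree as ET
import urllib.request

def build_odm_instance(subject_key: str, form_oid: str, items: dict) -> bytes:
    odm = ET.Element("ODM", FileType="Transactional", ODMVersion="1.3.2")
    clinical = ET.SubElement(odm, "ClinicalData",
                             StudyOID="PSY-ROUTINE", MetaDataVersionOID="v1")
    subject = ET.SubElement(clinical, "SubjectData", SubjectKey=subject_key)
    event = ET.SubElement(subject, "StudyEventData", StudyEventOID="ROUTINE-VISIT")
    form = ET.SubElement(event, "FormData", FormOID=form_oid)
    group = ET.SubElement(form, "ItemGroupData", ItemGroupOID=f"{form_oid}.IG")
    for item_oid, value in items.items():
        ET.SubElement(group, "ItemData", ItemOID=item_oid, Value=str(value))
    return ET.tostring(odm, encoding="utf-8", xml_declaration=True)

def push_to_research_db(payload: bytes,
                        url: str = "https://research-db.example/odm") -> int:
    # Assumes the (hypothetical) endpoint accepts raw ODM XML via POST.
    req = urllib.request.Request(url, data=payload,
                                 headers={"Content-Type": "application/xml"},
                                 method="POST")
    with urllib.request.urlopen(req) as resp:
        return resp.status

# Example: one questionnaire-style item captured via a tablet survey.
payload = build_odm_instance("PAT-0317", "F.PHQ9", {"I.PHQ9.TOTAL": 12})
# push_to_research_db(payload) would then POST the fragment to the research database.
```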


2020 ◽  
Author(s):  
R. Blitz ◽  
M. Storck ◽  
B.T. Baune ◽  
M. Dugas ◽  
N. Opel

Abstract Background Empirically driven personalized diagnostics and treatment are widely perceived as a major hallmark in psychiatry. However, data-based personalized decision making requires standardized data acquisition and data access, which are currently absent in psychiatric clinical routine. Objective Here we describe the informatics infrastructure implemented at the psychiatric university hospital Münster, allowing for standardized acquisition, transfer, storage, and export of clinical data for future real-time predictive modelling in psychiatric routine. Methods We designed and implemented a technical architecture that includes an extension of the EHR via scalable, standardized data collection, data transfer between the EHR and research databases (thus allowing EHR and research data to be pooled in a unified database), and technical solutions for the visual presentation of collected data and analysis results in the EHR. The Single-source Metadata ARchitecture Transformation (SMA:T) was used as the software architecture. SMA:T is an extension of the EHR system and uses Module Driven Software Development to generate standardized applications and interfaces. The Operational Data Model (ODM) was used as the standard. Standardized data were entered on iPads via the Mobile Patient Survey (MoPat) and the web application Mopat@home; the standardized transmission, processing, display, and export of data were realized via SMA:T. Results The technical feasibility was demonstrated in the course of the study. 19 standardized documentation forms with 241 items were created. In 317 patients, 6,451 instances were automatically transferred to the EHR system without errors. 96,323 instances were automatically transferred from the EHR system to the research database for further analyses. Conclusions With the present study, we present the successful implementation of the informatics infrastructure enabling standardized data acquisition and data access for future real-time predictive modelling in clinical routine in psychiatry. The technical solution presented here might guide similar initiatives at other sites and thus help to pave the way towards future application of predictive models in psychiatric clinical routine.


2020 ◽  
Author(s):  
Martin Kohler ◽  
Mahnaz Fekri ◽  
Andreas Wieser ◽  
Jan Handwerker

KITcube (Kalthoff et al., 2013) is a mobile, advanced integrated observation system for measuring meteorological processes within a volume of 10 × 10 × 10 km³. A large variety of instruments, from in situ sensors to scanning remote sensing devices, are deployed during campaigns. The simultaneous operation and real-time instrument control needed for maximum instrument synergy require real-time data management designed to cover the various user needs: safe data acquisition, fast loading, compressed storage, easy data access, monitoring, and data exchange. Large volumes of data, such as raw and semi-processed data of various types ranging from simple ASCII time series to high-frequency multi-dimensional binary data, provide abundant information but make the integration and efficient management of such data volumes a challenge.
Our data processing architecture is based on open-source technologies and involves the following five sections: 1) Transferring: data and metadata collected during a campaign are stored on a file server. 2) Populating the database: a relational database is used for time-series data and a hybrid database model for very large, complex, unstructured data. 3) Quality control: automated checks for data acceptance and data consistency. 4) Monitoring: data visualization in a web application. 5) Data exchange: allows the exchange of observation data and metadata in specified data formats with external users.
The implemented data architecture and workflow are illustrated in this presentation using data from the MOSES project (http://moses.eskp.de/home).
Reference:
Kalthoff, N.; Adler, B.; Wieser, A.; Kohler, M.; Träumner, K.; Handwerker, J.; Corsmeier, U.; Khodayar, S.; Lambert, D.; Kopmann, A.; Kunka, N.; Dick, G.; Ramatschi, M.; Wickert, J.; Kottmeier, C. (2013). KITcube – A mobile observation platform for convection studies deployed during HyMeX. Meteorologische Zeitschrift, 22(6), 633–647. doi:10.1127/0941-2948/2013/0542
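As an illustration of sections 2 and 3 of the pipeline above (populating the database and quality control), the short Python sketch below ingests one ASCII time-series record into a relational store after automated plausibility checks. The table layout, variables, file format, and limits are assumptions made for this example and are not part of the KITcube code base.

```python
# Illustrative sketch only (not the KITcube software): load an ASCII
# time-series record into a relational store after simple automated
# quality checks, mirroring pipeline steps 2 and 3 described above.
import sqlite3
from datetime import datetime

# Assumed plausibility limits per variable (acceptance check).
PLAUSIBLE = {"temperature_c": (-60.0, 60.0), "wind_speed_ms": (0.0, 75.0)}

def passes_qc(record: dict) -> bool:
    """Automated acceptance check: every variable must lie in its plausible range."""
    return all(lo <= record[name] <= hi for name, (lo, hi) in PLAUSIBLE.items())

def ingest(db: sqlite3.Connection, line: str) -> bool:
    # Example raw line: "2020-07-14T12:00:00,23.4,5.1"
    ts, temp, wind = line.strip().split(",")
    record = {"timestamp": datetime.fromisoformat(ts),
              "temperature_c": float(temp), "wind_speed_ms": float(wind)}
    if not passes_qc(record):
        return False  # rejected records could instead be logged for monitoring
    db.execute("INSERT INTO met_timeseries VALUES (?, ?, ?)",
               (record["timestamp"].isoformat(), record["temperature_c"],
                record["wind_speed_ms"]))
    return True

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE met_timeseries (timestamp TEXT, temperature_c REAL, wind_speed_ms REAL)")
ingest(db, "2020-07-14T12:00:00,23.4,5.1")
```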


2021 ◽  
Vol 1 (1) ◽  
Author(s):  
E. Bertino ◽  
M. R. Jahanshahi ◽  
A. Singla ◽  
R.-T. Wu

Abstract This paper addresses the problem of efficient and effective data collection and analytics for applications such as civil infrastructure monitoring and emergency management. This problem requires the development of techniques by which data acquisition devices, such as IoT devices, can (a) perform local analysis of collected data and (b) based on the results of such analysis, autonomously decide on further data acquisition. The ability to perform local analysis is critical for reducing transmission costs and latency, as the results of an analysis are usually smaller in size than the original data. For example, under strict real-time requirements, the analysis results can be transmitted in real time, whereas the actual collected data can be uploaded later. The ability to autonomously decide about further data acquisition enhances scalability and reduces the need for real-time human involvement in data acquisition processes, especially in contexts with critical real-time requirements. The paper focuses on deep neural networks and discusses techniques for supporting transfer learning and pruning, so as to reduce the time needed to train the networks and the size of the networks deployed on IoT devices. We also discuss approaches based on reinforcement learning techniques that enhance the autonomy of IoT devices.
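The following NumPy sketch illustrates one of the compression ideas mentioned above, magnitude-based weight pruning, which zeroes out the smallest weights of a layer so that a smaller model can be shipped to an edge device. The layer size and sparsity level are arbitrary choices for this example and do not reflect the networks used in the paper.

```python
# Minimal NumPy sketch of magnitude-based weight pruning, one common model
# compression step for deployment on constrained IoT devices. Layer shape
# and sparsity are arbitrary example values.
import numpy as np

def prune_by_magnitude(weights: np.ndarray, sparsity: float) -> np.ndarray:
    """Zero out the smallest-magnitude weights so only (1 - sparsity) remain."""
    threshold = np.quantile(np.abs(weights), sparsity)
    return np.where(np.abs(weights) >= threshold, weights, 0.0)

rng = np.random.default_rng(0)
dense_layer = rng.normal(size=(256, 128))            # stand-in fully connected layer
sparse_layer = prune_by_magnitude(dense_layer, sparsity=0.8)
kept = np.count_nonzero(sparse_layer) / sparse_layer.size
print(f"fraction of weights kept: {kept:.2f}")        # ~0.20, i.e. a much smaller model to ship
```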


J ◽  
2021 ◽  
Vol 4 (2) ◽  
pp. 147-153
Author(s):  
Paula Morella ◽  
María Pilar Lambán ◽  
Jesús Antonio Royo ◽  
Juan Carlos Sánchez

Among the new trends in technology that have emerged with Industry 4.0, cyber-physical systems (CPS) and the Internet of Things (IoT) are crucial for real-time data acquisition. This data acquisition, together with its transformation into valuable information, is indispensable for the development of real-time indicators. Moreover, real-time indicators provide companies with a competitive advantage, since they enhance the calculation of indicators and speed up decision-making and failure detection. Our research highlights the advantages of real-time data acquisition for supply chains, developing indicators that would be impossible to achieve with traditional systems, improving the accuracy of existing ones, and enhancing real-time decision-making. Moreover, it brings out the importance of integrating Industry 4.0 technologies, in this case CPS and IoT, in industry, and establishes the main points of a future research agenda on this topic.
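As a hedged illustration of such a real-time indicator, the Python sketch below maintains a rolling average order lead time from a stream of IoT events. The event schema and the specific indicator are assumptions for this example rather than the indicators developed in the article.

```python
# Illustrative sketch: a rolling supply-chain indicator (average order lead
# time over the last hour) updated event by event, as data arrives from
# IoT/CPS sources. The indicator and event schema are example assumptions.
from collections import deque
from datetime import datetime, timedelta

class RollingLeadTime:
    def __init__(self, window: timedelta = timedelta(hours=1)):
        self.window = window
        self.events = deque()  # (completion_time, lead_time_seconds)

    def update(self, started: datetime, completed: datetime) -> float:
        self.events.append((completed, (completed - started).total_seconds()))
        cutoff = completed - self.window
        while self.events and self.events[0][0] < cutoff:
            self.events.popleft()  # drop events outside the rolling window
        return sum(lt for _, lt in self.events) / len(self.events)

kpi = RollingLeadTime()
now = datetime(2021, 5, 1, 10, 0)
print(kpi.update(now - timedelta(minutes=30), now))   # 1800.0 s, refreshed in (near) real time
```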


Author(s):  
Cheyma BARKA ◽  
Hanen MESSAOUDI-ABID ◽  
Houda BEN ATTIA SETTHOM ◽  
Afef BENNANI-BEN ABDELGHANI ◽  
Ilhem SLAMA-BELKHODJA ◽  
...  

2021 ◽  
Vol 1768 (1) ◽  
pp. 012017
Author(s):  
K Burhanudin ◽  
M H Jusoh ◽  
Z I Abdul Latiff ◽  
M S Suaimi ◽  
Z Ibrahim ◽  
...  

Electronics ◽  
2021 ◽  
Vol 10 (5) ◽  
pp. 627
Author(s):  
David Marquez-Viloria ◽  
Luis Castano-Londono ◽  
Neil Guerrero-Gonzalez

A methodology for the scalable and concurrent real-time implementation of highly recurrent algorithms is presented and experimentally validated on the AWS-FPGA platform. This paper presents a parallel implementation of a KNN algorithm focused on m-QAM demodulators, using high-level synthesis for fast prototyping, parameterization, and scalability of the design. The proposed design shows the successful implementation of the KNN algorithm for interchannel interference mitigation in a 3 × 16 Gbaud 16-QAM Nyquist WDM system. Additionally, we present a modified version of the KNN algorithm in which comparisons among data symbols are reduced by identifying the closest neighbor using the 8-connected cluster rule known from image processing. The real-time implementation of the modified KNN on a Xilinx Virtex UltraScale+ VU9P AWS-FPGA board was compared with results obtained in previous work using the same data from the same experimental setup but with offline DSP in Matlab. The results show that the difference is negligible below the FEC limit. Additionally, the modified KNN shows a reduction in operations of 43% to 75%, depending on the symbol's position in the constellation, achieving a 47.25% reduction in total computational time for 100 K input symbols processed on 20 parallel cores compared with the original KNN algorithm.
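The comparison-reduction idea can be sketched in Python/NumPy as follows: a coarse grid decision selects one 16-QAM cluster, and only that cluster and its 8-connected neighbours are then evaluated. The centroids here are the ideal constellation points, whereas in the paper they are derived from training symbols, so this is a conceptual sketch rather than the FPGA implementation.

```python
# Conceptual sketch of reducing KNN-style comparisons for 16-QAM demodulation:
# make a coarse grid decision, then compare the received symbol only against
# that cluster and its 8-connected neighbours (at most 9 of 16 clusters).
import numpy as np

LEVELS = np.array([-3.0, -1.0, 1.0, 3.0])  # ideal 16-QAM I/Q levels (assumed centroids)
GRID = {(i, q): complex(LEVELS[i], LEVELS[q]) for i in range(4) for q in range(4)}

def coarse_index(symbol: complex) -> tuple:
    """Nearest level index per axis (the coarse grid decision)."""
    return (int(np.argmin(np.abs(LEVELS - symbol.real))),
            int(np.argmin(np.abs(LEVELS - symbol.imag))))

def demodulate(symbol: complex) -> complex:
    ci, cq = coarse_index(symbol)
    # Candidate set: the coarse cluster plus its 8-connected neighbours.
    candidates = [GRID[(i, q)]
                  for i in range(max(ci - 1, 0), min(ci + 2, 4))
                  for q in range(max(cq - 1, 0), min(cq + 2, 4))]
    return min(candidates, key=lambda c: abs(symbol - c))

print(demodulate(0.8 - 2.7j))   # -> (1-3j), found with at most 9 instead of 16 comparisons
```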


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Jennifer Zehner ◽  
Anja Røyne ◽  
Pawel Sikorski

Abstract Biocementation is commonly based on microbial-induced carbonate precipitation (MICP) or enzyme-induced carbonate precipitation (EICP), in which biomineralization of CaCO₃ in a granular medium is used to produce a sustainable, consolidated porous material. The successful implementation of biocementation in large-scale applications requires detailed knowledge about the micro-scale processes of CaCO₃ precipitation and grain consolidation. For this purpose, we present a microscopy sample cell that enables real-time and in situ observations of the precipitation of CaCO₃ in the presence of sand grains and calcite seeds. In this study, the sample cell is used in combination with confocal laser scanning microscopy (CLSM), which allows in situ monitoring of local pH during the reaction. The sample cell can be disassembled at the end of the experiment, so that the precipitated crystals can be characterized with Raman microspectroscopy and scanning electron microscopy (SEM) without disturbing the sample. The combination of real-time, in situ monitoring of the precipitation process with the possibility of characterizing the precipitated crystals without further sample processing offers a powerful tool for knowledge-based improvements of biocementation.
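For context, the ureolysis-based reaction scheme commonly underlying MICP and EICP is shown below. This is standard background chemistry rather than material reproduced from the article: urease hydrolyses urea, the released ammonium raises the local pH, and the carbonate then combines with calcium to precipitate CaCO₃, which is why local pH is a key quantity to monitor.

```latex
% Standard ureolysis-driven carbonate precipitation scheme (background
% chemistry, not taken from the article itself).
\begin{align}
  \mathrm{CO(NH_2)_2 + 2\,H_2O} &\xrightarrow{\text{urease}} \mathrm{2\,NH_4^{+} + CO_3^{2-}} \\
  \mathrm{Ca^{2+} + CO_3^{2-}} &\longrightarrow \mathrm{CaCO_3\,(s)}
\end{align}
```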

