Big Data and Policy Making: Between Real Time Management and the Experimental Dimension of Policies

Author(s):  
Grazia Concilio ◽  
Paola Pucci ◽  
Giovanni Vecchio ◽  
Giovanni Lanza
2021 ◽  
Vol 12 ◽  
Author(s):  
John A. Donaghy ◽  
Michelle D. Danyluk ◽  
Tom Ross ◽  
Bobby Krishna ◽  
Jeff Farber

Foodborne pathogens are a major contributor to foodborne illness worldwide. The adoption of a more quantitative, risk-based approach, with metrics such as Food Safety Objectives (FSO) and Performance Objectives (PO), necessitates quantitative inputs from all stages of the food value chain. The potential exists for utilization of big data, generated through digital transformational technologies, as inputs to a dynamic risk management concept for food safety microbiology. The Internet of Things (IoT)-driven industrial revolution will leverage data inputs from precision agriculture, connected factories and logistics, precision healthcare, and precision food safety to improve the dynamism of microbial risk management. Furthermore, the interconnectivity of public health databases, social media, and e-commerce tools, as well as technologies such as blockchain, will enhance traceability for retrospective and real-time management of foodborne cases. Despite the enormous potential of data volume and velocity, some challenges remain, including data ownership, interoperability, and accessibility. This paper gives insight into the prospective use of big data for dynamic risk management from a microbiological safety perspective in the context of the International Commission on Microbiological Specifications for Foods (ICMSF) conceptual equation, and describes examples of how a dynamic risk management system (DRMS) could be used in real time to identify hazards and control Shiga toxin-producing Escherichia coli risks related to leafy greens.
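The ICMSF conceptual equation referenced above states that the initial hazard level, minus the cumulative reductions from control steps, plus the cumulative increases from growth or recontamination, must not exceed the FSO (all terms in log10 cfu/g). A minimal sketch of that compliance check, using purely hypothetical leafy-greens values rather than real process data:

```python
def meets_fso(h0, reductions, increases, fso):
    """Check the ICMSF conceptual equation: H0 - sum(R) + sum(I) <= FSO.

    All values are in log10 cfu/g: h0 is the initial hazard level,
    reductions are log reductions from control steps, increases are log
    increases from growth or recontamination, and fso is the Food Safety
    Objective. Returns the final level and whether the FSO is met.
    """
    final_level = h0 - sum(reductions) + sum(increases)
    return final_level, final_level <= fso

# Hypothetical chain: initial contamination of 2.0 log cfu/g, a 1.5-log
# wash-step reduction, 0.5-log growth during a cold-chain lapse, checked
# against an assumed FSO of 2.0 log cfu/g.
level, ok = meets_fso(h0=2.0, reductions=[1.5], increases=[0.5], fso=2.0)
print(level, ok)  # 1.0 True
```

In a DRMS, real-time sensor and supply-chain data would continually update the reduction and increase terms, flagging lots whose predicted final level breaches the FSO.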


Healthcare ◽  
2020 ◽  
Vol 8 (3) ◽  
pp. 234 ◽  
Author(s):  
Hyun Yoo ◽  
Soyoung Han ◽  
Kyungyong Chung

Recently, massive amounts of bioinformation big data have been collected by sensor-based IoT devices, and various techniques classify the collected data into different types of health big data. A personalized analysis technique is the basis for judging the risk factors of personal cardiovascular disorders in real time. The objective of this paper is to provide a model for personalized heart-condition classification that combines a fast, effective preprocessing technique with a deep neural network in order to process real-time accumulated biosensor input data. The model learns the input data, develops an approximation function, and helps users recognize risk situations. For the analysis of the pulse frequency, a fast Fourier transform is applied during preprocessing. Data reduction is then performed using the frequency-by-frequency ratio data of the extracted power spectrum. To analyze the meaning of the preprocessed data, a neural network algorithm is applied; in particular, a deep neural network is used to analyze and evaluate the linear data. A deep neural network comprises multiple layers and establishes an operational model of nodes trained with gradient descent. The completed model was trained by classifying previously collected ECG signals into normal, control, and noise groups; ECG signals input in real time were then classified into the same three groups by the trained deep neural network. To evaluate the performance of the proposed model, this study used the data-operation cost-reduction ratio and the F-measure. With the fast Fourier transform and the cumulative frequency percentage, the size of the ECG data was reduced at a ratio of 1:32, and the F-measure analysis showed that the deep neural network model achieved 83.83% accuracy.
Given these results, the modified deep-neural-network technique can reduce the size of big data in terms of computing work, and it is an effective system for reducing operation time.
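The preprocessing step described above can be sketched as follows. This is an illustrative NumPy implementation under stated assumptions (a 1-D ECG segment, a simple "keep the strongest spectral bins" reduction at a 1:32 ratio); the paper's exact ratio computation and the `reduce_ecg` helper name are assumptions, not the authors' published code:

```python
import numpy as np

def reduce_ecg(signal, keep_ratio=1 / 32):
    """Reduce an ECG segment via its power spectrum.

    Computes the FFT power spectrum, converts it to frequency-by-frequency
    ratio data (each bin's share of total power), then keeps only the
    dominant bins so the representation shrinks to roughly keep_ratio of
    the original number of spectral bins.
    """
    spectrum = np.fft.rfft(signal)              # FFT of the real-valued signal
    power = np.abs(spectrum) ** 2               # power spectrum
    ratios = power / power.sum()                # per-frequency power ratios
    k = max(1, int(len(ratios) * keep_ratio))   # number of bins to keep
    top = np.sort(np.argsort(ratios)[::-1][:k])  # indices of dominant bins
    return top, ratios[top]

# Usage with a synthetic 1 Hz-dominant signal (1024 samples over 4 s):
t = np.linspace(0, 4, 1024, endpoint=False)
sig = np.sin(2 * np.pi * 1.0 * t) + 0.1 * np.random.randn(1024)
idx, vals = reduce_ecg(sig)
print(len(idx))  # 16 of the 513 rfft bins, roughly a 1:32 reduction
```

The reduced ratio vector, rather than the raw waveform, would then be fed to the deep neural network classifier, which is where the reported savings in operation cost come from.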

