Usage of Publicly Available Software for Epidemiological Trends Modelling

Author(s):  
M. Dunaievskyi
O. Lefterov
V. Bolshakov

Introduction. Outbreaks of infectious diseases, and the COVID-19 pandemic in particular, pose a serious public health challenge. The other side of any challenge is opportunity, and today such opportunities lie in information technology, decision-making systems, and best practices of proactive, data-driven management and control based on modern methods of data analysis and modelling. The article reviews the prospects for the use of publicly available software in modelling epidemiological trends. Strengths and weaknesses, main characteristics, and possible aspects of application are considered. The purpose of the article is to review publicly available epidemiological software, to indicate the situations in which one or another approach is useful, to segment the underlying models and assess their effectiveness, and to note the prospects of high-performance computing for modelling the spread of epidemics.

Results. Although deterministic models are ready for practical use without specific additional settings, they lose to the other groups in terms of functionality. To obtain evaluation results from stochastic and agent-oriented models, one first needs to specify the epidemic model, which requires deeper knowledge of epidemiology and a good understanding of the statistical basis and the underlying assumptions on which the model is built. Among the software considered, EMOD (Epidemiological MODelling software) from the Institute for Disease Modeling is the leader in functionality.

Conclusions. A relatively wide set of software is freely available; much of it was originally developed by anti-epidemiological institutions for internal use in decision-making and was later opened to the public. In general, these programs have been adapted to increase their practical applicability: their focus has been narrowed to particular classes of problems, and the possibility of adaptive use has been provided. The software in the group of deterministic methods is sufficiently informative and convenient to use, although such models have a rather narrow functional focus. Stochastic models provide more functionality but lose some ease of use. Agent-oriented models offer the maximum functionality, although using them most effectively requires the skills to write program code.

Keywords: epidemiological software, deterministic modelling, stochastic modelling, agent-oriented modelling, high-performance computing, decision-making systems.
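
To make the contrast between the model groups concrete, the following Python sketch shows a minimal deterministic SIR (susceptible-infected-recovered) compartmental model, the kind of model behind the "ready-to-use" deterministic tools discussed above. The specific model choice, parameter values, and initial conditions are illustrative assumptions and are not taken from any of the reviewed packages.

# Minimal deterministic SIR sketch; parameters and initial fractions are
# illustrative assumptions, not values from any of the reviewed software.
import numpy as np
from scipy.integrate import odeint

def sir_rhs(y, t, beta, gamma):
    """Right-hand side of the classic SIR ODE system (population fractions)."""
    s, i, r = y
    ds = -beta * s * i              # new infections leave the susceptible pool
    di = beta * s * i - gamma * i   # infections grow, recoveries shrink the infected pool
    dr = gamma * i                  # recovered individuals accumulate
    return [ds, di, dr]

beta, gamma = 0.5, 0.2              # assumed rates, i.e. basic reproduction number R0 = 2.5
y0 = [0.99, 0.01, 0.0]              # 1% of the population infected at t = 0
t = np.linspace(0, 160, 161)        # time in days

s, i, r = odeint(sir_rhs, y0, t, args=(beta, gamma)).T
print(f"Peak prevalence {i.max():.1%} on day {int(t[i.argmax()])}")

A stochastic or agent-oriented version of the same process would replace these deterministic rates with random draws (for example, a Gillespie-type simulation or per-individual transition probabilities), which is why those model groups require the user to specify the epidemic process in more detail before any results can be obtained.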

Author(s):  
Herbert Cornelius

For decades, HPC has established itself as an essential tool for discoveries, innovations and new insights in science, research and development, engineering and business across a wide range of application areas in academia and industry. Today high-performance computing is also well recognized to be of strategic and economic value: HPC matters and is transforming industries. This article will discuss new emerging technologies that are being developed for all areas of HPC (compute/processing, memory and storage, interconnect fabric, I/O, and software) to address the ongoing challenges in HPC such as balanced architecture, energy-efficient high performance, density, reliability, sustainability, and last but not least, ease of use. Of specific interest are the challenges and opportunities for the next frontier in HPC envisioned around the 2020 timeframe: ExaFlops computing. We will also outline the new and emerging area of high-performance data analytics (big data analytics using HPC) and discuss the emerging new delivery mechanism for HPC: HPC in the Cloud.


Author(s):  
Herbert Cornelius

For decades, HPC has established itself as an essential tool for discoveries, innovations, and new insights in science, research and development, engineering, and business across a wide range of application areas in academia and industry. Today high-performance computing is also well recognized to be of strategic and economic value: HPC matters and is transforming industries. This chapter will discuss new emerging technologies that are being developed for all areas of HPC (compute/processing, memory and storage, interconnect fabric, I/O, and software) to address the ongoing challenges in HPC such as balanced architecture, energy-efficient high performance, density, reliability, sustainability, and last but not least, ease of use. Of specific interest are the challenges and opportunities for the next frontier in HPC envisioned around the 2020 timeframe: ExaFlops computing. The authors also outline the new and emerging area of high-performance data analytics (big data analytics using HPC) and discuss the emerging new delivery mechanism for HPC: HPC in the Cloud.


Author(s):  
Vinay Gavirangaswamy
Aakash Gupta
Mark Terwilliger
Ajay Gupta

Research into risky decision making (RDM) has become a multidisciplinary effort. Conversations cut across fields such as psychology, economics, insurance, and marketing. This broad interest highlights the necessity for collaborative investigation of RDM to understand and manipulate the situations within which it manifests. A holistic understanding of RDM has been impeded by the independent development of diverse RDM research methodologies across different fields. There is no software specific to RDM that combines paradigms and analytical tools based on recent developments in high-performance computing technologies. This paper presents a toolkit called RDMTk, developed specifically for the study of risky decision making. RDMTk provides a free environment that can be used to manage globally based experiments while fostering collaborative research. The incorporation of machine learning and high-performance computing (HPC) technologies in the toolkit further opens additional possibilities, such as scalable algorithms and big data problems arising from global-scale experiments.


MRS Bulletin, 1997, Vol. 22 (10), pp. 5-6
Author(s):  
Horst D. Simon

Recent events in the high-performance computing industry have concerned scientists and the general public regarding a crisis or a lack of leadership in the field. That concern is understandable considering the industry's history from 1993 to 1996. Cray Research, the historic leader in supercomputing technology, was unable to survive financially as an independent company and was acquired by Silicon Graphics. Two ambitious new companies that introduced new technologies in the late 1980s and early 1990s, Thinking Machines and Kendall Square Research, were commercial failures and went out of business. And Intel, which introduced its Paragon supercomputer in 1994, discontinued production only two years later.

During the same time frame, scientists who had finished the laborious task of writing scientific codes to run on vector parallel supercomputers learned that those codes would have to be rewritten if they were to run on the next-generation, highly parallel architecture. Scientists who are not yet involved in high-performance computing are understandably hesitant about committing their time and energy to such an apparently unstable enterprise.

However, beneath the commercial chaos of the last several years, a technological revolution has been occurring. The good news is that the revolution is over, leading to five to ten years of predictable stability, steady improvements in system performance, and increased productivity for scientific applications. It is time for scientists who were sitting on the fence to jump in and reap the benefits of the new technology.

