Multiscale dynamics of COVID-19 and model-based recommendations for 105 countries

Author(s):  
Jithender J. Timothy ◽  
Vijaya Holla ◽  
Günther Meschke

We analyse the dynamics of COVID-19 using computational modelling at multiple scales. For large-scale analysis, we propose the Lattice-SIRQL model, a two-scale lattice extension of the classical SIR-type compartmental model with spatial interactions. Computational simulations show that global quantifiers are not fully representative of the actual dynamics of the disease, especially when mitigation measures such as quarantine and lockdown are applied. Furthermore, using real data on confirmed COVID-19 cases, we calibrate the Lattice-SIRQL model for 105 countries. The calibrated model is used to make country-specific recommendations for lockdown relaxation and lockdown continuation. Finally, using an agent-based model, we analyse the influence of cluster-level relaxation rate and lockdown duration on disease spread.
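As a rough illustration of the kind of model involved, the sketch below implements a generic SIR-type lattice with an added quarantine compartment in Python. The rate parameters, neighbour coupling, and compartment structure are assumptions chosen for illustration; they are not the paper's Lattice-SIRQL equations.

```python
# Illustrative lattice compartmental model with a quarantine compartment.
# All parameters and the coupling scheme are assumed, not taken from the paper.
import numpy as np

N = 50                      # lattice is N x N cells
beta, gamma = 0.3, 0.1      # infection and recovery rates (assumed)
q_rate = 0.05               # rate of moving infectious individuals to quarantine (assumed)
coupling = 0.2              # strength of neighbour-to-neighbour spreading (assumed)

S = np.ones((N, N)); I = np.zeros((N, N)); Q = np.zeros((N, N)); R = np.zeros((N, N))
I[N // 2, N // 2] = 0.01    # seed an outbreak in the central cell
S[N // 2, N // 2] -= 0.01

def neighbour_mean(X):
    """Average of the four nearest neighbours with periodic boundaries."""
    return (np.roll(X, 1, 0) + np.roll(X, -1, 0) +
            np.roll(X, 1, 1) + np.roll(X, -1, 1)) / 4.0

dt = 0.1
for step in range(2000):
    # effective infectious pressure: local plus a fraction from neighbouring cells
    I_eff = (1 - coupling) * I + coupling * neighbour_mean(I)
    new_inf = beta * S * I_eff * dt
    new_rec = gamma * I * dt
    new_qua = q_rate * I * dt
    rec_from_Q = gamma * Q * dt
    S -= new_inf
    I += new_inf - new_rec - new_qua
    Q += new_qua - rec_from_Q
    R += new_rec + rec_from_Q

print("final recovered fraction (lattice mean):", R.mean())
```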

2019 ◽  
Vol 20 (S6) ◽  
Author(s):  
Mozhgan Kabiri Chimeh ◽  
Peter Heywood ◽  
Marzio Pennisi ◽  
Francesco Pappalardo ◽  
Paul Richmond

Abstract Background In recent years, the study of immune response behaviour using a bottom-up approach, Agent-Based Modeling (ABM), has attracted considerable effort. The ABM approach is a very common technique in the biological domain due to the high demand for large-scale analysis tools for the collection and interpretation of information to solve biological problems. Simulating massive multi-agent systems (i.e. simulations containing a large number of agents/entities) requires major computational effort that is only achievable through the use of parallel computing approaches. Results This paper explores different approaches to parallelising the key component of biological and immune system models within an ABM model: pairwise interactions. The focus of this paper is on the performance and algorithmic design choices of cell interactions in continuous and discrete space, where agents/entities compete to interact with one another within a parallel environment. Conclusions Our performance results demonstrate the applicability of these methods to a broader class of biological systems exhibiting typical cell-to-cell interactions. The advantages and disadvantages of each implementation are discussed, showing that each can be used as the basis for developing complete immune system models on parallel hardware.
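The pairwise-interaction pattern discussed above is commonly accelerated by spatial binning, so that each agent only tests candidate partners in neighbouring grid cells rather than against all agents. The serial Python sketch below illustrates that idea only; the bin size, interaction radius, and domain are illustrative assumptions and do not reproduce the paper's parallel implementations.

```python
# Serial sketch of spatial binning for pairwise agent interactions.
# Parameters are illustrative; a parallel version would process agents concurrently.
import numpy as np
from collections import defaultdict

rng = np.random.default_rng(0)
positions = rng.uniform(0.0, 10.0, size=(10_000, 2))  # agents in a 10 x 10 domain (assumed)
radius = 0.1                                          # interaction radius (assumed)

# Assign each agent to a grid cell no smaller than the interaction radius,
# so all possible partners lie in the 3x3 neighbourhood of cells.
cell_ids = np.floor(positions / radius).astype(int)
bins = defaultdict(list)
for idx, (cx, cy) in enumerate(cell_ids):
    bins[(cx, cy)].append(idx)

def neighbours_within_radius(i):
    """Return indices of agents within the interaction radius of agent i."""
    cx, cy = cell_ids[i]
    found = []
    for dx in (-1, 0, 1):
        for dy in (-1, 0, 1):
            for j in bins.get((cx + dx, cy + dy), ()):
                if j != i and np.linalg.norm(positions[i] - positions[j]) <= radius:
                    found.append(j)
    return found

print("agent 0 interacts with", len(neighbours_within_radius(0)), "agents")
```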


Electronics ◽  
2021 ◽  
Vol 10 (21) ◽  
pp. 2657
Author(s):  
Jibin Yin ◽  
Pengfei Zhao ◽  
Yi Zhang ◽  
Yi Han ◽  
Shuoyu Wang

The demand for large-scale analysis and research of data on trauma from modern warfare is increasing day by day, but the amount of existing data is not sufficient to meet such demand. In this study, an integrated modeling approach incorporating a war trauma severity scoring algorithm (WTSS) and deep neural networks (DNN) is proposed. First, the proposed WTSS, which uses multiple non-linear regression based on the characteristics of war trauma data and medical evaluation by an expert panel, performs a standardized assessment of an injury and predicts its trauma consequences. Second, to generate virtual injuries, the injured parts, injury types, and complications were randomly sampled and combined according to their probability of occurrence, and WTSS was then used to assess the consequences of each virtual injury. Third, to evaluate the accuracy of the predicted injury consequences, we built a DNN classifier, trained it with the generated data, and tested it with real data. Finally, we used the Delphi method to filter out unreasonable injuries and improve data rationality. The experimental results verified that the proposed approach surpasses traditional artificial generation methods, achieving a prediction accuracy of 84.43% and realizing large-scale and credible war trauma data augmentation.
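To make the sampling-and-scoring step concrete, the sketch below generates hypothetical virtual injuries by sampling injured parts, injury types, and complications from assumed occurrence probabilities and scoring them with a placeholder function. The categories, probabilities, and scoring formula are all illustrative inventions; they are not the published WTSS regression.

```python
# Hypothetical sketch of virtual-injury generation by probabilistic sampling.
# The attribute lists, probabilities, and scoring rule are placeholders only.
import random

PARTS = {"head": 0.2, "chest": 0.3, "abdomen": 0.2, "limbs": 0.3}           # assumed
TYPES = {"blast": 0.4, "gunshot": 0.35, "burn": 0.25}                        # assumed
COMPLICATIONS = {"none": 0.6, "infection": 0.25, "hemorrhagic shock": 0.15}  # assumed

def sample(dist):
    """Draw one category according to its occurrence probability."""
    return random.choices(list(dist), weights=list(dist.values()), k=1)[0]

def severity_score(part, injury_type, complication):
    # Placeholder severity score, NOT the published WTSS regression.
    base = {"head": 3.0, "chest": 2.5, "abdomen": 2.0, "limbs": 1.0}[part]
    factor = {"blast": 1.5, "gunshot": 1.3, "burn": 1.2}[injury_type]
    penalty = {"none": 0.0, "infection": 1.0, "hemorrhagic shock": 2.5}[complication]
    return base * factor + penalty

virtual_records = []
for _ in range(1000):
    p, t, c = sample(PARTS), sample(TYPES), sample(COMPLICATIONS)
    virtual_records.append({"part": p, "type": t, "complication": c,
                            "severity": severity_score(p, t, c)})

print(virtual_records[0])
```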


2016 ◽  
Author(s):  
Michael Schirner ◽  
Anthony Randal McIntosh ◽  
Viktor K. Jirsa ◽  
Gustavo Deco ◽  
Petra Ritter

Brain dynamics span multiple spatial and temporal scales, from fast spiking neurons to slow fluctuations over distributed areas. No single experimental method links data across scales. Here, we bridge this gap using The Virtual Brain connectome-based modelling platform to integrate multimodal data with biophysical models and support neurophysiological inference. Simulated cell populations were linked with subject-specific white-matter connectivity estimates and driven by electroencephalography-derived electric source activity. The models were fit to subject-specific resting-state functional magnetic resonance imaging data, and overfitting was excluded using 5-fold cross-validation. Further evaluation of the models shows how balancing excitation with feedback inhibition generates an inverse relationship between α-rhythms and population firing on a faster time scale and resting-state network oscillations on a slower time scale. Lastly, large-scale interactions in the model lead to the emergence of scale-free power-law spectra. Our findings underscore the integrative role of computational modelling in complementing empirical studies.
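As context for the overfitting check mentioned above, the sketch below shows generic 5-fold cross-validation of a fitted parameter against held-out data. The data, fitting routine, and score are stand-ins and do not reflect The Virtual Brain fitting pipeline.

```python
# Generic 5-fold cross-validation sketch; data and model fit are placeholders.
import numpy as np

rng = np.random.default_rng(1)
data = rng.normal(loc=2.0, scale=0.5, size=100)   # stand-in for an empirical feature

def fit_parameter(train):
    return train.mean()                            # stand-in for model fitting

def score(param, test):
    return -np.mean((test - param) ** 2)           # negative MSE on held-out data

# Split shuffled indices into 5 folds; each fold is held out once.
folds = np.array_split(rng.permutation(len(data)), 5)
scores = []
for k in range(5):
    test_idx = folds[k]
    train_idx = np.concatenate([folds[j] for j in range(5) if j != k])
    param = fit_parameter(data[train_idx])
    scores.append(score(param, data[test_idx]))

print("cross-validated score (mean over folds):", np.mean(scores))
```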


2021 ◽  
Vol 13 (8) ◽  
pp. 1509
Author(s):  
Xikun Hu ◽  
Yifang Ban ◽  
Andrea Nascetti

Accurate burned area information is needed to assess the impacts of wildfires on people, communities, and natural ecosystems. Various burned area detection methods have been developed using satellite remote sensing measurements with wide coverage and frequent revisits. Our study aims to expound on the capability of deep learning (DL) models for automatically mapping burned areas from uni-temporal multispectral imagery. Specifically, several semantic segmentation network architectures, i.e., U-Net, HRNet, Fast-SCNN, and DeepLabv3+, and machine learning (ML) algorithms were applied to Sentinel-2 and Landsat-8 imagery over three wildfire sites in two different local climate zones. The validation results show that the DL algorithms outperform the ML methods in the two cases with compact burned scars, while the ML methods seem to be more suitable for mapping dispersed burns in boreal forests. Using Sentinel-2 images, U-Net and HRNet exhibit comparable performance with higher kappa (around 0.9) in one heterogeneous Mediterranean fire site in Greece; Fast-SCNN performs better than the others, with kappa over 0.79, in one compact boreal forest fire with varied burn severity in Sweden. Furthermore, when the trained models are transferred directly to the corresponding Landsat-8 data, HRNet dominates among the DL models across the three test sites and preserves high accuracy. The results demonstrate that DL models can make full use of contextual information and capture spatial details at multiple scales from fire-sensitive spectral bands to map burned areas. Using only a post-fire image, the DL methods not only provide an automatic, accurate, and bias-free large-scale mapping option with cross-sensor applicability, but also have the potential to be used for onboard processing in next-generation Earth observation satellites.
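Since accuracy is reported as Cohen's kappa, the short sketch below computes kappa for a pair of binary burned/unburned masks. The masks here are synthetic stand-ins, not the study's Sentinel-2 or Landsat-8 results.

```python
# Cohen's kappa for binary burned/unburned masks; the masks are synthetic.
import numpy as np

def cohens_kappa(y_true, y_pred):
    y_true = np.asarray(y_true).ravel()
    y_pred = np.asarray(y_pred).ravel()
    po = np.mean(y_true == y_pred)                        # observed agreement
    pe = (np.mean(y_true == 1) * np.mean(y_pred == 1) +   # chance agreement, burned
          np.mean(y_true == 0) * np.mean(y_pred == 0))    # chance agreement, unburned
    return (po - pe) / (1.0 - pe)

rng = np.random.default_rng(2)
reference = rng.integers(0, 2, size=(256, 256))           # stand-in reference mask
prediction = reference.copy()
flip = rng.random(reference.shape) < 0.05                 # introduce 5% disagreement
prediction[flip] = 1 - prediction[flip]

print("kappa:", round(cohens_kappa(reference, prediction), 3))
```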


2020 ◽  
pp. 1-26
Author(s):  
Qinwen Hu ◽  
Muhammad Rizwan Asghar ◽  
Nevil Brownlee

HTTPS refers to an application-specific implementation that runs HyperText Transfer Protocol (HTTP) on top of Secure Sockets Layer (SSL) or Transport Layer Security (TLS). HTTPS is used to provide encrypted communication and secure identification of web servers and clients, for purposes such as online banking and e-commerce. However, many HTTPS vulnerabilities have been disclosed in recent years. Although many studies have pointed out that these vulnerabilities can lead to serious consequences, domain administrators seem to ignore them. In this study, we evaluate the HTTPS security level of Alexa’s top 1 million domains from two perspectives. First, we explore which popular sites are still affected by well-known security issues. Our results show that less than 0.1% of HTTPS-enabled servers in the measured domains are still vulnerable to known attacks, including Rivest Cipher 4 (RC4), Compression Ratio Info-Leak Mass Exploitation (CRIME), Padding Oracle On Downgraded Legacy Encryption (POODLE), Factoring RSA Export Keys (FREAK), Logjam, and Decrypting Rivest–Shamir–Adleman (RSA) using Obsolete and Weakened eNcryption (DROWN). Second, we assess the security level of the digital certificates used by each measured HTTPS domain. Our results highlight that fewer than 0.52% of domains use an expired certificate, 0.42% of HTTPS certificates contain mismatched hostnames, and 2.59% of HTTPS domains use a self-signed certificate. The domains we investigate cover 5 regions (ARIN, RIPE NCC, APNIC, LACNIC, and AFRINIC) and 61 different categories, such as online shopping, banking, educational, and government websites. Although our results show that the problem still exists, we find that changes have been taking place as HTTPS vulnerabilities were discovered. Through this three-year study, we found that more attention has been paid to the use and configuration of HTTPS. For example, more and more domains have begun to enable the HTTPS protocol to ensure a secure communication channel between users and websites. In the first measurement, we observed that many domains were still using the TLS 1.0, TLS 1.1, SSL 2.0, and SSL 3.0 protocols to support clients running outdated systems. As previous studies revealed the security risks of these protocols, in the subsequent measurements we found that the majority of domains updated their TLS protocol in time. Our 2020 results suggest that most HTTPS domains use the TLS 1.2 protocol, although some HTTPS domains are still vulnerable to known attacks. As academics and industry professionals continue to disclose attacks against HTTPS and recommend secure HTTPS configurations, we found that the number of vulnerable domains is gradually decreasing every year.
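A per-domain check of the kind described above can be approximated with Python's standard ssl module: connect to the server, record the negotiated protocol version, and inspect the certificate's expiry. The sketch below is a minimal illustration (example.com is a placeholder host and error handling is kept minimal); it is not the measurement tooling used in the study.

```python
# Minimal per-domain HTTPS check: negotiated TLS version and certificate expiry.
import socket
import ssl
import time

def inspect_https(host, port=443, timeout=5.0):
    ctx = ssl.create_default_context()   # verifies the certificate chain and hostname
    with socket.create_connection((host, port), timeout=timeout) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            cert = tls.getpeercert()
            not_after = ssl.cert_time_to_seconds(cert["notAfter"])
            return {
                "protocol": tls.version(),           # e.g. 'TLSv1.2' or 'TLSv1.3'
                "expired": not_after < time.time(),
                "subject": dict(x[0] for x in cert["subject"]),
            }

# Self-signed or hostname-mismatched certificates surface as
# ssl.SSLCertVerificationError during the handshake above.
print(inspect_https("example.com"))
```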

