Using high-resolution contact networks to evaluate SARS-CoV-2 transmission and control in large-scale multi-day events

Author(s):  
Rachael Pung ◽  
Josh A Firth ◽  
Lewis G Spurgin ◽  
Vernon J Lee ◽  
Adam J Kucharski ◽  
...  

The emergence of the highly transmissible SARS-CoV-2 Delta variant has created a need to reassess the risk posed by increasing social contacts as countries resume pre-pandemic activities, particularly in the context of resuming large-scale events held over multiple days. To examine how social contacts formed in different activity settings influence the interventions required to control outbreaks, we combined high-resolution data on contacts among passengers and crew on cruise ships with network transmission models. We found that passengers had a median of 20 (IQR 10-36) unique close contacts per day, and that over 60% of their contact episodes were made in dining or sports areas, where mask wearing is typically limited. In simulated outbreaks, we found that vaccination coverage and rapid antigen tests had a larger effect than mask mandates alone, indicating the importance of combined interventions against Delta to reduce event risk in the vaccine era.
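The kind of network transmission model described above can be illustrated with a toy simulation. The contact network, the per-contact transmission probability, and the all-or-nothing vaccine protection below are simplifying assumptions for illustration only, not the authors' model:

```python
import random

def simulate_outbreak(edges, n_people, p_transmit, vaccinated=frozenset(), seed=0):
    """Toy SIR-style outbreak on a static contact network.

    Person 0 is the seed case; vaccinated individuals are assumed fully
    protected (a deliberate simplification). Returns the final outbreak size.
    """
    rng = random.Random(seed)
    neighbours = {i: set() for i in range(n_people)}
    for a, b in edges:  # undirected contact episodes
        neighbours[a].add(b)
        neighbours[b].add(a)

    infected = {0} - set(vaccinated)   # seed case, unless vaccinated
    recovered = set()
    frontier = set(infected)
    while frontier:
        new = set()
        for case in frontier:
            for contact in neighbours[case]:
                if (contact not in infected and contact not in recovered
                        and contact not in vaccinated
                        and rng.random() < p_transmit):
                    new.add(contact)
        recovered |= frontier          # cases recover after one generation
        infected |= new
        frontier = new
    return len(infected)
```

With `p_transmit = 0` only the seed case is ever infected, and with `p_transmit = 1` every reachable unvaccinated contact is; vaccinating a well-connected individual can cut off whole branches of the network, which is the structural effect the abstract's combined-intervention finding rests on.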

IUCrJ ◽  
2020 ◽  
Vol 7 (4) ◽  
pp. 681-692
Author(s):  
Martin Malý ◽  
Kay Diederichs ◽  
Jan Dohnálek ◽  
Petr Kolenko

Crystallographic resolution is a key characteristic of diffraction data and represents one of the first decisions an experimenter has to make in data evaluation. Conservative approaches to the high-resolution cutoff determination are based on a number of criteria applied to the processed X-ray diffraction data only. However, high-resolution data that are weaker than arbitrary cutoffs can still result in the improvement of electron-density maps and refined structure models. Therefore, the impact of reflections from resolution shells higher than those previously used in conservative structure refinement should be analysed by the paired refinement protocol. For this purpose, a tool called PAIREF was developed to provide automation of this protocol. As a new feature, a complete cross-validation procedure has also been implemented. Here, the design, usage and control of the program are described, and its application is demonstrated on six data sets. The results prove that the inclusion of high-resolution data beyond the conventional criteria can lead to more accurate structure models.


2014 ◽  
Vol 95 (3) ◽  
pp. 409-426 ◽  
Author(s):  
Juanzhen Sun ◽  
Ming Xue ◽  
James W. Wilson ◽  
Isztar Zawadzki ◽  
Sue P. Ballard ◽  
...  

Traditionally, the nowcasting of precipitation was conducted to a large extent by means of extrapolation of observations, especially of radar reflectivity. In recent years, the blending of traditional extrapolation-based techniques with high-resolution numerical weather prediction (NWP) has been gaining popularity in the nowcasting community. The increased need for NWP products in nowcasting applications poses great challenges to the NWP community, because the nowcasting application of high-resolution NWP places higher requirements on the quality and content of the initial conditions than longer-range NWP. Considerable progress has been made in the use of NWP for nowcasting thanks to the increase in computational resources, advances in high-resolution data assimilation techniques, and improvements in convection-permitting numerical modeling. This paper summarizes the recent progress and discusses some of the challenges for future advancement.


2016 ◽  
Author(s):  
Matthias Zink ◽  
Rohini Kumar ◽  
Matthias Cuntz ◽  
Luis Samaniego

Abstract. Long-term, high-resolution data about hydrologic fluxes and states are needed for many hydrological applications. Because continuous large-scale observations of such variables are not feasible, hydrologic or land surface models are applied to derive them. This study aims to analyze and provide a consistent high-resolution dataset of land surface variables over Germany, accounting for uncertainties caused by equifinal model parameters. The mesoscale Hydrological Model (mHM) is employed to derive an ensemble (100 members) of evapotranspiration, groundwater recharge, soil moisture and generated runoff at high spatial and temporal resolutions (4 km and daily, respectively) for the period 1951–2010. The model is cross-evaluated against the observed runoff in 222 catchments, which are not used for model calibration. The mean (standard deviation) of the ensemble median Nash–Sutcliffe efficiency (NSE) estimated for these catchments is 0.68 (0.09) for daily discharge simulations. The modeled evapotranspiration and soil moisture reasonably represent the observations from eddy covariance stations. Our analysis indicates the lowest parametric uncertainty for evapotranspiration, and the largest is observed for groundwater recharge. The uncertainty of the hydrologic variables varies over the course of a year, with the exception of evapotranspiration, which remains almost constant. This study emphasizes the role of accounting for the parametric uncertainty in model-derived hydrological datasets.


2020 ◽  
Author(s):  
Dirk Eilander ◽  
Willem van Verseveld ◽  
Dai Yamazaki ◽  
Albrecht Weerts ◽  
Hessel C. Winsemius ◽  
...  

Abstract. Distributed hydrological models rely on hydrography data such as flow direction, river length, slope and width. For large-scale applications, many of these models still rely on a few flow-direction datasets, which are often manually derived. We propose the Iterative Hydrography Upscaling (IHU) method to upscale high-resolution flow-direction data to the typically coarser resolutions of distributed hydrological models. The IHU aims to preserve the upstream-downstream relationship of river structure, including basin boundaries, river meanders and confluences, in the D8 format, which is commonly used to describe river networks in models. Additionally, it derives sub-grid river attributes such as drainage area, river length, slope and width. We derived the multi-resolution MERIT Hydro IHU dataset at resolutions of 30 arcsec (~1 km), 5 arcmin (~10 km) and 15 arcmin (~30 km) by applying IHU to the recently published 3 arcsec MERIT Hydro data. Results indicate improved accuracy of IHU at all resolutions studied compared to other commonly applied methods. Furthermore, we show that using IHU-derived hydrography data minimizes errors in the timing and magnitude of simulated peak discharge throughout the Rhine basin compared to simulations at the native data resolutions. As the method is fully automated, it can be applied to other high-resolution hydrography datasets to increase the accuracy and enhance the uptake of new datasets in distributed hydrological models in the future.
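The D8 format referred to above stores, for each grid cell, a pointer to its single downstream neighbour. As an illustrative sketch (using the common ESRI-style pointer codes and a toy grid; this is not the IHU algorithm itself), a flow-accumulation grid can be derived from D8 pointers with a single topological pass:

```python
# ESRI-style D8 pointer codes mapped to (row, col) offsets; 0 = outlet/pit.
D8_OFFSETS = {1: (0, 1), 2: (1, 1), 4: (1, 0), 8: (1, -1),
              16: (0, -1), 32: (-1, -1), 64: (-1, 0), 128: (-1, 1)}

def flow_accumulation(fdir):
    """Number of cells drained by each cell (itself included) on a D8 grid."""
    nrow, ncol = len(fdir), len(fdir[0])
    acc = [[1] * ncol for _ in range(nrow)]      # every cell drains itself
    indeg = [[0] * ncol for _ in range(nrow)]    # upstream-neighbour counts

    def downstream(r, c):
        off = D8_OFFSETS.get(fdir[r][c])
        if off is None:
            return None                          # outlet or pit
        rr, cc = r + off[0], c + off[1]
        return (rr, cc) if 0 <= rr < nrow and 0 <= cc < ncol else None

    for r in range(nrow):
        for c in range(ncol):
            d = downstream(r, c)
            if d:
                indeg[d[0]][d[1]] += 1

    # Kahn-style topological pass: push accumulation downstream from ridge cells.
    stack = [(r, c) for r in range(nrow) for c in range(ncol) if indeg[r][c] == 0]
    while stack:
        r, c = stack.pop()
        d = downstream(r, c)
        if d:
            rr, cc = d
            acc[rr][cc] += acc[r][c]
            indeg[rr][cc] -= 1
            if indeg[rr][cc] == 0:
                stack.append((rr, cc))
    return acc
```

Preserving the upstream-downstream relationship during upscaling, as IHU does, means that quantities like this accumulated drainage area remain consistent between the fine and coarse grids.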


2019 ◽  
Vol 1 ◽  
pp. 1-1
Author(s):  
Andrey Medvedev ◽  
Natalia Alekseenko ◽  
Natalia Telnova ◽  
Alexander Koshkarev

<p><strong>Abstract.</strong> Assessment and monitoring of environmental features based on large-scale and ultra-high-resolution data, including remote sensing data, benefit from the repeatability of observations and the speed with which incoming data can be processed, but retrospective analyses often face issues of completeness and the limited duration of time series. Cartographic materials and remote sensing data allow monitoring of rapidly changing natural and anthropogenic features in the study areas, yet a complete chronology is hard to assemble when an event or phenomenon occurred many years ago.</p><p>Ultra-high-resolution data, remote sensing data and the results of subsequent geoinformation analysis are widely used to solve problems in a number of socio-economic areas of territorial development, in particular:</p><ul><li>in environmental studies &ndash; identification of local sources of water pollution and the consequences of their impact on ecosystems, and synthetic assessment of the ecological state of territories and their comfort;</li><li>in the management of various resources, including water &ndash; determination of the biological productivity of water bodies, identification of aquatic bioresources, detection of anthropogenically provoked and natural changes in water mass, application in glaciological studies, etc.</li></ul><p>Within the framework of the current study, a multi-temporal analysis of the water area and coastal strip of Lake Sevan (Republic of Armenia), lying at an altitude of about 1900 m above sea level, was carried out. The water level of the lake has changed repeatedly in the past. The most intense drops in water level (more than 10 metres) occurred in the 1930s and between 1949 and 1962. In the 1990s there was a slight rise in the level, and then until 2001 the level of the lake continued to decrease.</p><p>The main factors affecting aquatic ecosystems and the overall ecological status of the lake are:</p><ol><li>Repeated changes in the water level of the reservoir in the past and its expected fluctuations in the future.</li><li>The uncontrolled discharge of harmful substances, which has caused great damage to the lake and affected the water quality and biodiversity of this unique natural site.</li><li>Untimely clearing of flooded forests, which increases the risk of eutrophication of the lake.</li><li>The poorly organized system of waste disposal and unauthorized landfills of municipal solid waste, as well as animal waste.</li><li>Unauthorized construction of recreational facilities and capital structures in the coastal and water-protection zone, which may be flooded.</li></ol><p>The information support of the study is based on satellite imagery from WorldView-2, SPOT 5/6, Resurs-P and Canopus-B, materials from the International Space Station (ISS), archival aerial photography and data obtained from UAVs, in combination with other map data sources at scales of 1&thinsp;:&thinsp;5&thinsp;000 &ndash; 1&thinsp;:&thinsp;100&thinsp;000, including digital topographic maps, land-use maps, and statistical and literary data. Together, cartographic materials and remote sensing data provide a time history of 75 years, from large-scale topographic maps of 1942&ndash;1943 to highly detailed images of 2017&ndash;2018.</p><p>According to the results of the study, it was possible to establish the position of the coastline for different time periods. The period between 1949 and 1962, when the most critical drop in water level occurred, was especially interesting and had not been studied before. Archival aerial photographs from 1943 and 1963 allowed the position of the coastline to be reconstructed for almost every year of irrational water use.</p>


Author(s):  
Yatharth Ranjan ◽  
Zulqarnain Rashid ◽  
Callum Stewart ◽  
Maximilian Kerz ◽  
Mark Begale ◽  
...  

BACKGROUND With a wide range of use cases in both research and clinical domains, collecting continuous mobile health (mHealth) streaming data from multiple sources in a secure, highly scalable, and extensible platform is of high interest to the open-source mHealth community. The EU IMI RADAR-CNS program is an exemplar project with the requirement to support the collection of high-resolution data at scale; as such, the RADAR-base platform is designed to meet these needs and additionally to facilitate a new generation of mHealth projects in this nascent field. OBJECTIVE Wide-bandwidth networks, smartphone penetration, and wearable sensors offer new possibilities for collecting (near) real-time high-resolution datasets from large numbers of participants. We aimed to build a platform that would cater for large-scale data collection for remote monitoring initiatives. The key criteria are scalability, extensibility, security, and privacy. METHODS RADAR-base is developed as a modular application; the backend is built on a backbone of the highly successful Confluent/Apache Kafka framework for streaming data. To facilitate scaling and ease of deployment, we use Docker containers to package the components of the platform. RADAR-base provides two main mobile apps for data collection, a Passive App and an Active App. Other third-party apps and sensors are easily integrated into the platform. Management user interfaces to support data collection and enrolment are also provided. RESULTS General principles of the platform components and the design of RADAR-base are presented here, with examples of the types of data currently being collected from devices used in the RADAR-CNS multiple sclerosis, epilepsy, and depression cohorts. CONCLUSIONS RADAR-base is a fully functional remote data collection platform built around Confluent/Apache Kafka that provides off-the-shelf components for projects interested in collecting mHealth datasets at scale.
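As a rough sketch of how a sensor reading might be streamed into a Kafka-backed platform of this kind, the code below builds a key/value record and shows (commented out, since it needs a running broker) how it could be sent with the kafka-python client. The field names, topic name, and JSON encoding here are hypothetical illustrations, not RADAR-base's actual schemas:

```python
import json
import time

def make_record(project_id, user_id, source_id, payload):
    """Wrap a sensor reading in a (key, value) pair of JSON-encoded bytes,
    loosely mirroring the project/user/source keying used by streaming
    mHealth platforms. All field names here are illustrative."""
    key = {"projectId": project_id, "userId": user_id, "sourceId": source_id}
    value = dict(payload, timeReceived=time.time())  # stamp arrival time
    return json.dumps(key).encode(), json.dumps(value).encode()

# Sending with the kafka-python client (requires a reachable broker):
# from kafka import KafkaProducer
# producer = KafkaProducer(bootstrap_servers="localhost:9092")
# k, v = make_record("demo-project", "user-1", "phone-1", {"heartRate": 62.0})
# producer.send("phone_heart_rate", key=k, value=v)  # hypothetical topic name
```

Keying every record by project, user, and source is what lets a Kafka-based backend partition and scale per-participant streams independently, which is the scalability property the abstract emphasizes.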


Antiquity ◽  
2016 ◽  
Vol 90 (354) ◽  
pp. 1670-1680 ◽  
Author(s):  
Jane Kershaw ◽  
Ellen C. Røyrvik

The recently concluded ‘People of the British Isles’ project (hereafter PoBI) combined large-scale, local DNA sampling with innovative data analysis to generate a survey of the genetic structure of Britain in unprecedented detail; the results were presented by Leslie and colleagues in 2015. Comparing clusters of genetic variation within Britain with DNA samples from Continental Europe, the study elucidated past immigration events via the identification and dating of historic admixture episodes (the interbreeding of two or more different population groups). Among its results, the study found “no clear genetic evidence of the Danish Viking occupation and control of a large part of England, either in separate UK clusters in that region, or in estimated ancestry profiles”, therefore positing “a relatively limited input of DNA from the Danish Vikings”, with ‘Danish Vikings’ defined in the study, and thus in this article, as peoples migrating from Denmark to eastern England in the late ninth and early tenth centuries (Leslie et al. 2015: 313). Here, we consider the details of certain assumptions that were made in the study, and offer an alternative interpretation to the above conclusion. We also comment on the substantial archaeological and linguistic evidence for a large-scale Danish Viking presence in England.


2017 ◽  
Vol 21 (3) ◽  
pp. 1769-1790 ◽  
Author(s):  
Matthias Zink ◽  
Rohini Kumar ◽  
Matthias Cuntz ◽  
Luis Samaniego

Abstract. Long-term, high-resolution data about hydrologic fluxes and states are needed for many hydrological applications. Because continuous large-scale observations of such variables are not feasible, hydrologic or land surface models are applied to derive them. This study aims to analyze and provide a consistent high-resolution dataset of land surface variables over Germany, accounting for uncertainties caused by equifinal model parameters. The mesoscale Hydrological Model (mHM) is employed to derive an ensemble (100 members) of evapotranspiration, groundwater recharge, soil moisture, and runoff generated at high spatial and temporal resolutions (4 km and daily, respectively) for the period 1951–2010. The model is cross-evaluated against the observed daily streamflow in 222 basins, which are not used for model calibration. The mean (standard deviation) of the ensemble median Nash–Sutcliffe efficiency estimated for these basins is 0.68 (0.09) for daily streamflow simulations. The modeled evapotranspiration and soil moisture reasonably represent the observations from eddy covariance stations. Our analysis indicates the lowest parametric uncertainty for evapotranspiration, and the largest is observed for groundwater recharge. The uncertainty of the hydrologic variables varies over the course of a year, with the exception of evapotranspiration, which remains almost constant. This study emphasizes the role of accounting for the parametric uncertainty in model-derived hydrological datasets.
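The Nash–Sutcliffe efficiency reported above measures model skill relative to simply predicting the observed mean: 1 is a perfect fit, 0 means no better than the mean, and negative values mean worse. A minimal implementation of the standard formula:

```python
def nash_sutcliffe(obs, sim):
    """Nash–Sutcliffe efficiency: 1 - sum of squared errors divided by the
    total variance of the observations about their mean."""
    mean_obs = sum(obs) / len(obs)
    sse = sum((o - s) ** 2 for o, s in zip(obs, sim))
    ss_tot = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - sse / ss_tot
```

An ensemble median NSE of 0.68 for daily streamflow, as reported here, therefore means the model explains roughly two-thirds of the observed variance in the median ensemble member.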


2021 ◽  
Vol 2021 ◽  
pp. 1-23
Author(s):  
Shu-Wen Chen ◽  
Xiao-Wei Gu ◽  
Jia-Ji Wang ◽  
Hui-Sheng Zhu

The pandemic of COVID-19 is continuing to wreak havoc in 2021, with at least 170 million victims around the world, and healthcare systems are overwhelmed by the large-scale virus infection. Fortunately, the Internet of Things (IoT) is one of the most effective paradigms of the intelligent world, in which artificial intelligence (AI), together with cloud computing and big data analysis, is playing a vital role in preventing the spread of COVID-19. AI and 5G technologies are advancing by leaps and bounds, further strengthening the intelligence and connectivity of IoT applications, and conventional IoT is gradually being upgraded to the more powerful AI + IoT (AIoT). For example, in the remote screening and diagnosis of COVID-19 patients, AI technology based on machine learning and deep learning has recently upgraded medical equipment significantly and has reshaped workflows to minimize contact with patients, so that medical specialists can make clinical decisions more efficiently, providing the best protection not only to patients but also to the specialists themselves. This paper reviews the latest progress made in combating COVID-19 with both IoT and AI, and provides comprehensive details on how to combat the pandemic as well as the technologies that may be applied in the future.

