Detection and abundance of SARS-CoV-2 in wastewater in Liechtenstein, and the estimation of prevalence and impact of the B.1.1.7 variant

Author(s):  
R. Markt ◽  
L. Endler ◽  
F. Amman ◽  
A. Schedl ◽  
T. Penz ◽  
...  

Abstract Severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) is also shed in feces, which makes wastewater-based surveillance possible independent of symptomatic cases and unbiased by testing strategies and frequencies. We investigated the entire population of the Principality of Liechtenstein through samples from the wastewater treatment plant Bendern (serving all 39,000 inhabitants). Twenty-four-hour composite samples were taken once or twice a week over a period of six months from September 2020 to March 2021. Viral RNA was concentrated using the polyethylene glycol (PEG) centrifugation method, followed by reverse transcription quantitative PCR. The aim of this research was to assess the suitability of SARS-CoV-2 fragments in wastewater for relating the viral signal to case incidence, and to assess the impact of the emerging B.1.1.7 variant. The viral load in the wastewater peaked at almost 9 × 10⁸ viral fragments per person equivalent (PE) and day on October 25, and showed a second peak on December 22, reaching approximately 2 × 10⁸ fragments PE⁻¹ d⁻¹. Individual testing showed a lag of 4 days behind the wastewater signal and distinctly underestimated cases at the first peak, when testing frequency was low. The wastewater signal responded immediately to the implementation of non-pharmaceutical interventions. The new variant B.1.1.7 was first detected in wastewater on December 23, whereas individual testing did not detect it until January 13, 2021. Further, our data indicate that the emergence of a new virus variant may change the wastewater signal, probably due to different shedding patterns, which should be considered in future models.
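The per-person-equivalent loads reported above come from a simple normalization: RNA concentration times daily flow, divided by the population served. A minimal sketch of that arithmetic in Python, with hypothetical concentration and flow values chosen only so the result lands near the reported October peak (the abstract reports loads, not the raw inputs):

```python
# Normalize a measured SARS-CoV-2 RNA concentration to a daily load per
# person equivalent (PE). The concentration and flow below are hypothetical
# placeholders; only the population served is taken from the abstract.

def load_per_pe(conc_per_l: float, flow_l_per_day: float,
                population: int) -> float:
    """Viral fragments shed per person equivalent per day."""
    return conc_per_l * flow_l_per_day / population

conc = 3.9e6        # fragments per litre (hypothetical)
flow = 9.0e6        # litres per day through the plant (hypothetical)
pop = 39_000        # inhabitants served by WWTP Bendern (from the abstract)

print(f"{load_per_pe(conc, flow, pop):.1e} fragments PE^-1 d^-1")
```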

PLoS ONE ◽  
2021 ◽  
Vol 16 (3) ◽  
pp. e0248783 ◽  
Author(s):  
Gregory D. Lyng ◽  
Natalie E. Sheils ◽  
Caleb J. Kennedy ◽  
Daniel O. Griffin ◽  
Ethan M. Berke

Background: COVID-19 test sensitivity and specificity have been widely examined and discussed, yet optimal use of these tests will depend on the goals of testing, the population or setting, and the anticipated underlying disease prevalence. We model various combinations of key variables to identify and compare a range of effective and practical surveillance strategies for schools and businesses. Methods: We coupled a simulated data set incorporating actual community prevalence and test performance characteristics to a susceptible-infectious-removed (SIR) compartmental model, modeling the impact of base and tunable variables including test sensitivity, testing frequency, results lag, sample pooling, disease prevalence, externally acquired infections, symptom checking, and test cost on outcomes including case reduction and false positives. Findings: Increasing testing frequency was associated with a non-linear positive effect on cases averted over 100 days. While the precise reduction in the cumulative number of infections depended on community disease prevalence, testing every 3 days versus every 14 days (even with a lower-sensitivity test) reduced the disease burden substantially. Pooling provided cost savings and made a high-frequency approach practical; one high-performing strategy, testing every 3 days, yielded per-person per-day costs as low as $1.32. Interpretation: A range of practically viable testing strategies emerged for schools and businesses. Key characteristics of these strategies include high-frequency testing with a moderate- or high-sensitivity test and minimal results delay. Sample pooling allowed for operational efficiency and cost savings with minimal loss of model performance.
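The study's central mechanism, periodic screening that moves detected infectious individuals out of circulation after a results lag, can be illustrated with a minimal discrete-time SIR sketch. All parameter values below are hypothetical placeholders, not those fitted in the paper:

```python
# Minimal discrete-time SIR with periodic screening and a results lag.
# Parameters are hypothetical, chosen only to illustrate the mechanism.

def sir_with_testing(days=100, n=1000, beta=0.3, gamma=0.1,
                     test_interval=3, sensitivity=0.85, results_lag=1):
    s, i, r = n - 10.0, 10.0, 0.0
    pending = [0.0] * results_lag        # positives awaiting results
    cumulative = 10.0
    for day in range(days):
        new_inf = beta * s * i / n
        recovered = gamma * i
        s -= new_inf
        i += new_inf - recovered
        r += recovered
        cumulative += new_inf
        # on screening days a fraction `sensitivity` of the infectious
        # test positive; they are isolated once the results lag elapses
        screened = test_interval > 0 and day % test_interval == 0
        pending.append(sensitivity * i if screened else 0.0)
        isolated = min(pending.pop(0), i)
        i -= isolated
        r += isolated
    return cumulative

base = sir_with_testing(test_interval=0)          # no screening
for k in (3, 14):
    averted = base - sir_with_testing(test_interval=k)
    print(f"screen every {k:2d} days: {averted:6.0f} cases averted")
```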


2021 ◽  
Author(s):  
Andrea Torneri ◽  
Lander Willem ◽  
Vittoria Colizza ◽  
Cecile Kremer ◽  
Christelle Meuris ◽  
...  

SARS-CoV-2 remains a worldwide emergency. While vaccines have been approved and are widely administered, in Europe they are only available to adults and adolescents. Therefore, to mitigate the spread of more transmissible SARS-CoV-2 variants among children, the use of non-pharmaceutical interventions is still warranted. We investigate the impact of different testing strategies on SARS-CoV-2 infection dynamics in a primary school environment, using an individual-based modelling approach. Specifically, we consider three testing strategies: 1) symptomatic isolation, where symptomatic individuals are tested and isolated when they test positive; 2) reactive screening, where a class is screened once a symptomatic individual is identified; and 3) repetitive screening, where the school in its entirety is screened at regular intervals. Through this analysis, we demonstrate that repetitive testing strategies can significantly reduce the attack rate in schools, in contrast to a reactive screening approach. Furthermore, we investigate the impact of these testing strategies on the average number of school days lost per child.
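A toy individual-based sketch of two of the three strategies (reactive screening is omitted for brevity) shows how repetitive screening lowers the attack rate relative to symptomatic isolation alone. All rates and probabilities are hypothetical placeholders, not values calibrated in the study:

```python
# Toy individual-based school model: symptomatic isolation vs. repetitive
# whole-school screening. All parameters are hypothetical placeholders.
import random

def attack_rate(strategy, n=200, days=60, beta=0.3, p_sympt=0.3,
                sens=0.9, screen_every=7, seed=42):
    random.seed(seed)
    infected_on = {0: 0}                  # child id -> day infected
    susceptible = set(range(1, n))
    for day in range(days):
        # each infectious child infects each susceptible classmate
        # independently with a small daily probability
        for child in list(infected_on):
            for s in [x for x in susceptible if random.random() < beta / n]:
                susceptible.discard(s)
                infected_on[s] = day
        for child, d0 in list(infected_on.items()):
            sick_days = day - d0
            screened = (strategy == "repetitive" and day % screen_every == 0
                        and random.random() < sens)
            symptomatic = sick_days >= 2 and random.random() < p_sympt
            if screened or symptomatic or sick_days >= 8:  # isolate/recover
                del infected_on[child]
    return (n - len(susceptible)) / n

for strategy in ("symptomatic", "repetitive"):
    print(f"{strategy:12s} attack rate: {attack_rate(strategy):.1%}")
```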


2021 ◽  
Author(s):  
Andrea D. George ◽  
Devrim Kaya ◽  
Blythe A. Layton ◽  
Kestrel Bailey ◽  
Christine Kelly ◽  
...  

With the rapid onset of the COVID-19 pandemic, wastewater-based epidemiology (WBE) sampling methodologies for SARS-CoV-2 were often implemented quickly and may not have taken the unique characteristics of each drainage catchment into account. One question of debate is the relevance of grab versus composite samples when surveying for SARS-CoV-2 at various catchment scales. This study assessed the impact of grab versus composite sampling on the detection and quantification of SARS-CoV-2 in catchment basins with flow rates ranging from high (wastewater treatment plant influent) to medium (neighborhood-scale micro-sewershed), low (city block-scale micro-sewershed), and ultra-low (building scale). At the high-flow site, grab samples were reasonably comparable to 24-h composite samples, with the same non-detect rate (0%) and SARS-CoV-2 concentrations that differed by 32% on the log10 scale. However, as flow rates decreased, the percentage of false-negative grab samples increased to as much as 44%, and grab sample SARS-CoV-2 concentrations varied by up to 1-2 orders of magnitude from their respective composite sample concentrations. At the ultra-low-flow site, increasing the sub-sampling frequency to every 5 min yielded composite samples with higher fidelity to the SARS-CoV-2 load. Thus, composite sampling is superior to grab sampling, especially as flow decreases.
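The grab-versus-composite contrast reduces to sampling statistics over a time-varying concentration signal: a grab sample is a single draw, while a composite is the mean of many sub-samples. A sketch over a synthetic, spiky low-flow signal (shape, noise, and detection limit are all hypothetical) illustrates why grab samples at low flow produce false negatives:

```python
# Grab vs. composite sampling over a synthetic diurnal signal mimicking an
# intermittent building-scale flow. Signal shape, noise level, and the
# detection limit are all hypothetical.
import math
import random

random.seed(0)
times = range(0, 24 * 60, 5)                     # one sub-sample per 5 min
signal = [max(0.0, 50 * math.sin(2 * math.pi * t / 1440) ** 8
              + random.gauss(0, 2)) for t in times]

composite = sum(signal) / len(signal)            # 24-h composite result
grabs = [random.choice(signal) for _ in range(100)]
limit = 1.0                                      # hypothetical detection limit
false_neg = sum(g < limit for g in grabs) / len(grabs)

print(f"composite concentration: {composite:.1f} (arbitrary units)")
print(f"grab false-negative rate at limit {limit}: {false_neg:.0%}")
```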


2020 ◽  
Vol 11 (SPL1) ◽  
pp. 796-806
Author(s):  
Sana M Kamal ◽  
Ali Al-Samydai ◽  
Rudaina Othman Yousif ◽  
Talal Aburjai

The COVID-19 pandemic, caused by a relative of the severe acute respiratory syndrome (SARS) virus with possible animal-to-human transmission, has spread across the world, affecting both health and the economy. Several preventative strategies and non-pharmaceutical interventions have been used to slow the spread of COVID-19. A questionnaire containing 36 questions on the impact of COVID-19 quarantine on children's behavior and language was distributed online (Google Forms). Data were collected by asking parents about their children's behavior during quarantine; among the survey completers (n=469), 42.3% of the children were female and 57.7% were male. Results showed that quarantine had an impact on children's behavior and language, with stress and isolation having the largest effect, while social relations showed no impact. The majority of respondents (75.0%) were confident that community pharmacies can play an important role in helping families protect their children's behavior and language, as pharmacists were the professionals families contacted most during quarantine. Based on this simple random sample, one of the main recommendations for helping parents protect and improve their children's behavior and language under quarantine conditions is to expand the role of community pharmacies in patient counseling, particularly counseling directed at children, after training pharmacists in child psychology and behavior. This could help families protect their children from changes in behavior and language if the world faces a similar situation in the future.


2005 ◽  
Vol 40 (4) ◽  
pp. 491-499 ◽  
Author(s):  
Jeremy T. Kraemer ◽  
David M. Bagley

Abstract Upgrading conventional single-stage mesophilic anaerobic digestion to an advanced digestion technology can increase sludge stability, reduce pathogen content, and increase biogas production, but it also increases the ammonia concentrations recycled back to the liquid treatment train. Limited information is available to assess whether the higher ammonia recycle loads from an anaerobic sludge digestion upgrade would lead to higher effluent ammonia concentrations at discharge. BioWin, a commercially available wastewater treatment plant simulation package, was used to predict the effects of anaerobic digestion upgrades on liquid train performance, especially effluent ammonia concentrations. A factorial analysis indicated that the influent total Kjeldahl nitrogen (TKN) and influent alkalinity each had a 50-fold larger influence on the effluent NH3 concentration than the ambient temperature, liquid train SRT, or anaerobic digestion efficiency. Dynamic simulations indicated that the diurnal variation in effluent NH3 concentration was 9 times larger than the increase due to higher digester volatile solids reduction (VSR). Higher recycle NH3 loads caused by upgrades to advanced digestion techniques can likely be managed adequately by scheduling dewatering to coincide with periods of low influent TKN load and by ensuring sufficient alkalinity for nitrification.
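In a two-level factorial analysis of the kind described, the main effect of each factor is the mean response at its high level minus the mean response at its low level. A sketch with hypothetical effluent NH3 responses standing in for BioWin output (the numbers are illustrative, not from the paper):

```python
# Main-effect calculation for a two-level factorial design. The response
# values are hypothetical stand-ins for simulated effluent NH3 (mg N/L),
# not results from the study.
factors = ["TKN", "alkalinity", "temperature"]
# hypothetical effluent NH3 for each (-1/+1) factor combination
response = {
    (-1, -1, -1): 0.4, (+1, -1, -1): 5.1, (-1, +1, -1): 0.2,
    (+1, +1, -1): 1.0, (-1, -1, +1): 0.5, (+1, -1, +1): 5.2,
    (-1, +1, +1): 0.3, (+1, +1, +1): 1.1,
}

for i, name in enumerate(factors):
    hi = [y for levels, y in response.items() if levels[i] == +1]
    lo = [y for levels, y in response.items() if levels[i] == -1]
    effect = sum(hi) / len(hi) - sum(lo) / len(lo)
    print(f"main effect of {name}: {effect:+.2f} mg N/L")
```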


1988 ◽  
Vol 20 (11-12) ◽  
pp. 131-136 ◽  
Author(s):  
A. D. Wong ◽  
C. D. Goldsmith

The effect of discharging specific oil-degrading bacteria from a chemostat into a refinery activated sludge process was determined biokinetically. Plant data for the kinetic evaluation of the waste treatment plant were collected before and during treatment. During treatment, the 500-gallon chemostatic growth chamber was operated at an eight-hour hydraulic retention time and neutral pH, and was fed a mixture of refinery wastewater and simple sugars. The biokinetic constants k (day⁻¹), Ks (mg/L), and K (L/mg·day) were determined before and after treatment from Monod and Lineweaver-Burk plots. Solids discharged and effluent organic concentrations were also evaluated against the mean cell retention time (MCRT). The maximum utilization rate, k, was found to increase from 0.47 to 0.95 day⁻¹ during the operation of the chemostat. Ks likewise increased, from 141 to 556 mg/L. Effluent solids were shown to increase slightly with treatment; however, this was acceptable given the polishing pond and the benefit of an increased ability to accept shock loads of oily wastewater. The increased suspended solids in the effluent were most likely due to the continual addition of bacteria in exponential growth that were capable of responding to excess substrate. The effect of the chemostatic addition of specific microbial inocula to the refinery waste treatment plant has been to improve the overall organic removal capacity, along with subsequent gains in plant stability.
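A Lineweaver-Burk plot estimates the Monod constants by linearization: with U = kS/(Ks + S), one has 1/U = (Ks/k)(1/S) + 1/k, so k and Ks fall out of a straight-line fit of 1/U against 1/S. A sketch using synthetic rate data generated from the post-treatment constants quoted above:

```python
# Lineweaver-Burk estimation of the Monod constants k and Ks. The rate data
# are synthetic, generated from the post-treatment constants reported above
# (k = 0.95 day^-1, Ks = 556 mg/L), so the fit recovers them exactly.
import numpy as np

k_true, Ks_true = 0.95, 556.0
S = np.array([100.0, 200.0, 400.0, 800.0, 1600.0])   # substrate, mg/L
U = k_true * S / (Ks_true + S)                       # Monod utilization rates

# straight-line fit of 1/U against 1/S: slope = Ks/k, intercept = 1/k
slope, intercept = np.polyfit(1.0 / S, 1.0 / U, 1)
k_est = 1.0 / intercept
Ks_est = slope * k_est
print(f"k = {k_est:.2f} day^-1, Ks = {Ks_est:.0f} mg/L")
```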


1992 ◽  
Vol 26 (5-6) ◽  
pp. 1255-1264
Author(s):  
K. L. Martins

During treatment of groundwater, radon is often coincidentally removed by processes typically used to remove volatile organic compounds (VOCs), such as liquid-phase granular activated carbon (LGAC) adsorption and air stripping with vapor-phase granular activated carbon (VGAC). The removal of radon from drinking water benefits the water user; however, the accumulation of radon on activated carbon may pose radiologic hazards for water treatment plant operators, and the spent carbon may be considered a low-level radioactive waste. To date, most literature on radon removal by water treatment processes has been based on bench- or residential-scale systems. This paper addresses the impact of radon on municipal and industrial-scale applications. Available data have been used to develop graphical methods for estimating the radioactivity exposure rates to facility operators and determining the fate of spent carbon. This paper will allow the reader to determine the potential impact of radon on system design and operation as follows:
- Estimate the percent removal of radon from water by LGAC adsorbers and packed-tower air strippers; a method to estimate the percent removal of radon by VGAC used for air stripper off-gas is also provided.
- Estimate whether local radon levels are such that the safety guideline suggested by the USEPA (United States Environmental Protection Agency) of 25 mR/yr (0.1 mR/day) for radioactivity exposure may be exceeded.
- Estimate the disposal requirements for the waste carbon from LGAC systems and from VGAC air stripper off-gas systems.
Options for dealing with high radon levels are also presented.
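One piece of arithmetic behind such exposure estimates: radon-222 decays with a 3.82-day half-life, so the activity accumulated on a carbon bed plateaus where decay balances capture, at a steady-state value equal to the capture rate times the radon mean lifetime (1/λ, about 5.5 days). A sketch with hypothetical influent concentration, flow, and removal efficiency (the site-specific conversion to an operator dose rate in mR/yr is omitted):

```python
# Steady-state radon activity on a GAC bed: radon-222 (half-life 3.82 d)
# decays as it accumulates, so bed activity approaches A_ss = R / lambda,
# where R is the capture rate. Influent concentration, flow, and removal
# efficiency below are hypothetical.
import math

HALF_LIFE_D = 3.82                        # radon-222 half-life, days
lam = math.log(2) / HALF_LIFE_D           # decay constant, ~0.181 d^-1

conc_pci_per_l = 2000.0                   # influent radon (hypothetical)
flow_l_per_d = 500_000.0                  # plant flow (hypothetical)
removal = 0.95                            # fraction captured (hypothetical)

capture_rate = conc_pci_per_l * flow_l_per_d * removal   # pCi captured/day
steady_activity = capture_rate / lam      # A_ss = R / lambda, in pCi

t = 14.0                                  # days of operation
activity_t = steady_activity * (1 - math.exp(-lam * t))
print(f"steady-state bed activity: {steady_activity:.2e} pCi")
print(f"after {t:.0f} d: {activity_t:.2e} pCi "
      f"({activity_t / steady_activity:.0%} of steady state)")
```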


1998 ◽  
Vol 37 (1) ◽  
pp. 347-354 ◽  
Author(s):  
Ole Mark ◽  
Claes Hernebring ◽  
Peter Magnusson

The present paper describes the Helsingborg Pilot Project, part of the Technology Validation Project "Integrated Wastewater" (TVP) under the EU Innovation Programme. The objective of the Helsingborg Pilot Project is to demonstrate the implementation of integrated tools for the simulation of the sewer system and the wastewater treatment plant (WWTP), in both the analysis and the operational phases. The paper deals with the programme for investigating the impact of real time control (RTC) on the performance of the sewer system and the wastewater treatment plant. As the project is still at a very early stage, this paper focuses on the modelling of pollutant transport and on evaluating how implementing real time control in the sewer system affects the sediment deposition pattern.

