Inferring SARS-CoV-2 RNA shedding into wastewater relative to time of infection

Author(s):  
Sean M. Cavany ◽  
Aaron Bivins ◽  
Zhenyu Wu ◽  
Devin North ◽  
Kyle Bibby ◽  
...  

Since the start of the COVID-19 pandemic, there has been interest in using wastewater monitoring as an approach for disease surveillance. A significant uncertainty that would improve interpretation of wastewater monitoring data is the intensity and timing with which individuals shed RNA from severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) into wastewater. By combining wastewater and case surveillance data sets from a university campus during a period of heightened surveillance, we inferred that individual shedding of RNA into wastewater peaks on average six days (50% uncertainty interval (UI): 6–7; 95% UI: 4–8) following infection, and is highly overdispersed (negative binomial dispersion parameter, k = 0.39 (95% credible interval: 0.32–0.48)). This limits the utility of wastewater surveillance as a leading indicator of secular trends in SARS-CoV-2 transmission during an epidemic, and implies that it could be most useful as an early warning of rising transmission in areas where transmission is low or clinical testing is delayed or of limited capacity.
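
As an illustration of the overdispersion reported here, the sketch below (not the authors' inference model) draws per-individual RNA contributions from a negative binomial parametrized by a mean mu and dispersion k, with k = 0.39 taken from the abstract; the mean value of 100 is a purely hypothetical placeholder.

```python
# Minimal sketch, not the paper's model: negative binomial shedding
# contributions with mean `mu` (hypothetical) and dispersion k = 0.39
# (value reported in the abstract); variance = mu + mu**2 / k.
import numpy as np

rng = np.random.default_rng(1)

def nb_sample(mu, k, size, rng):
    """Draw negative binomial counts with mean mu and dispersion k."""
    p = k / (k + mu)                       # numpy's (n, p) parametrization
    return rng.negative_binomial(k, p, size=size)

samples = nb_sample(mu=100.0, k=0.39, size=10_000, rng=rng)
print(samples.mean(), samples.var())       # variance far exceeds the mean
```

With a dispersion this small, a handful of simulated individuals account for most of the total signal, which is the practical sense in which overdispersion limits the usefulness of the aggregate signal as a leading indicator.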

Author(s):  
Hussein Ahmad Abdulsalam ◽  
Sule Omeiza Bashiru ◽  
Alhaji Modu Isa ◽  
Yunusa Adavi Ojirobe

The Gompertz Rayleigh (GomR) distribution was introduced in an earlier study, with few statistical properties derived and parameters estimated using only the most common traditional method, Maximum Likelihood Estimation (MLE). This paper aimed to derive more statistical properties of the GomR distribution, to estimate its three unknown parameters via a competing method, Maximum Product of Spacing (MPS), and to evaluate goodness of fit using rainfall data sets from Nigeria, Malaysia and Argentina. Statistical properties of the GomR distribution, including the distributions of the smallest and largest order statistics, the cumulative (integrated) hazard function, the odds function, the rth non-central moments, the moment generating function, the mean, the variance and entropy measures, were derived explicitly. The fitted data sets reveal the flexibility of the GomR distribution relative to the other distributions considered. A simulation study was used to evaluate the consistency, accuracy and unbiasedness of the GomR parameter estimates obtained by MPS. The study found that the GomR distribution could not provide a better fit for the Argentine rainfall data, but it was the best-fitting distribution for the rainfall data sets from Nigeria and Malaysia in comparison with the Generalized Weibull Rayleigh (GWR), Exponentiated Weibull Rayleigh (EWR), Type II Topp-Leone Generalized Inverse Rayleigh (TIITLGIR), Kumaraswamy Exponential Inverse Rayleigh (KEIR), Negative Binomial Marshall-Olkin Rayleigh (NBMOR) and Exponentiated Weibull (EW) distributions. Furthermore, the estimates from MPS were consistent as the sample size increased, but not as efficient as those from MLE.
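
For readers unfamiliar with MPS estimation, the sketch below illustrates the principle on a plain one-parameter Rayleigh distribution (the GomR CDF is not given in this abstract, so the distribution is a stand-in): the estimate maximizes the sum of the logarithms of the CDF spacings evaluated at the ordered sample.

```python
# Minimal MPS sketch on a Rayleigh scale parameter (stand-in for GomR).
import numpy as np
from scipy.stats import rayleigh
from scipy.optimize import minimize_scalar

def neg_log_spacing(scale, x_sorted):
    # CDF values at the ordered sample, padded with 0 and 1
    u = np.concatenate(([0.0], rayleigh.cdf(x_sorted, scale=scale), [1.0]))
    spacings = np.clip(np.diff(u), 1e-12, None)   # guard against zero spacings
    return -np.sum(np.log(spacings))

x = np.sort(rayleigh.rvs(scale=2.0, size=200, random_state=0))
res = minimize_scalar(neg_log_spacing, args=(x,),
                      bounds=(0.1, 10.0), method="bounded")
print("MPS estimate of the Rayleigh scale:", res.x)
```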


Author(s):  
Julia Nowack ◽  
Christopher Turbill

Maintaining a high and stable body temperature, as observed in endothermic mammals and birds, is energetically costly. Thus, it is not surprising that more and more heterothermic species are being discovered that can reduce their energetic needs during energetic bottlenecks through the use of torpor. However, not all heterothermic animals use torpor on a regular basis. Torpor may also be important to an individual's probability of survival, and hence fitness, when used infrequently. We here report the observation of a single, ~5.5 h long hypothermic bout with a decrease in body temperature of 12 °C in the native Australian bush rat (Rattus fuscipes). Our data suggest that bush rats are able to rewarm from a body temperature of 24 °C, albeit at a rewarming rate lower than that expected on the basis of their body mass. Heterothermy, i.e. the ability to withstand and overcome periods of reduced body temperature, is assumed to be an evolutionarily ancestral (plesiomorphic) trait. We thus argue that such rare hypothermic events in species that otherwise appear to be strictly homeothermic could be heterothermic rudiments, i.e. a less derived form of torpor with limited capacity for rewarming. Importantly, observations of rare and extreme thermoregulatory responses by wild animals are more likely to be discovered with long-term data sets, and may not only provide valuable insight into the physiological capability of a population, but can also help us to understand the constraints and evolutionary pathways of different phenologies.


Author(s):  
Robin Flowerdew

Most statistical analysis is based on the assumption that error is normally distributed, but many data sets are based on discrete data (the number of migrants from one place to another must be a whole number). Recent developments in statistics have often involved generalising methods so that they can be properly applied to non-normal data. For example, Nelder and Wedderburn (1972) developed the theory of generalised linear modelling, where the dependent or response variable can take a variety of different probability distributions linked in one of several possible ways to a linear predictor, based on a combination of independent or explanatory variables. Several common statistical techniques are special cases of the generalised linear model, including Ordinary Least Squares regression (the usual form of regression analysis) and binomial logit modelling. Another important special case is Poisson regression, which has a Poisson-distributed dependent variable linked logarithmically to a linear combination of independent variables. Poisson regression may be an appropriate method when the dependent variable is constrained to be a non-negative integer, usually a count of the number of events in certain categories. It assumes that each event is independent of the others, though the probability of an event may be linked to available explanatory variables. This chapter illustrates how Poisson regression can be carried out using the Stata package, and then discusses various problems and issues which may arise in the use of the method. The number of migrants from area i to area j must be a non-negative integer and is likely to vary according to zone population, distance and economic variables. The availability of high-quality migration data through the WICID facility permits detailed analysis at levels from the region down to the output area. A vast range of possible explanatory variables can also be derived from the 2001 Census data. Model results are discussed in terms of the significant explanatory variables, the overall goodness of fit and the large residuals. Comparisons are drawn with other analytic techniques such as OLS regression. The relationship to Wilson's entropy-maximising methods is described, and variants on the method are explained. These include negative binomial regression and zero-censored and zero-truncated models.
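
A minimal Poisson-regression specification of the kind described, written here in Python with statsmodels rather than Stata; the covariate names (log population, log distance) and the simulated counts are illustrative stand-ins for the migration data discussed.

```python
# Minimal sketch: Poisson GLM with a log link for simulated migration counts.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
log_pop = rng.normal(10, 1, n)           # log destination population (illustrative)
log_dist = rng.normal(4, 0.5, n)         # log distance between zones (illustrative)
lam = np.exp(-3 + 0.8 * log_pop - 0.6 * log_dist)
migrants = rng.poisson(lam)              # simulated counts of migrants

X = sm.add_constant(np.column_stack([log_pop, log_dist]))
model = sm.GLM(migrants, X, family=sm.families.Poisson())   # log link is the default
print(model.fit().summary())
```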


2019 ◽  
Vol 9 (22) ◽  
pp. 4818
Author(s):  
Usman Akhtar ◽  
Anita Sant’Anna ◽  
Sungyoung Lee

Vast amounts of data, especially in biomedical research, are being published as Linked Data. Being able to analyze these data sets is essential for creating new knowledge and better decision support solutions. Many of the current analytics solutions require continuous access to these data sets. However, accessing Linked Data at query time is prohibitive due to high latency in searching the content and the limited capacity of current tools to connect to these databases. To reduce this overhead cost, modern database systems maintain a cache of previously searched content. The challenge with Linked Data is that databases are constantly evolving and cached content quickly becomes outdated. To overcome this challenge, we propose a Change-Aware Maintenance Policy (CAMP) for updating cached content. We propose a Change Metric that quantifies the evolution of a Linked Dataset and determines when to update cached content. We evaluate our approach on two datasets and show that CAMP can reduce maintenance costs, improve maintenance quality and increase cache hit rates compared to standard approaches.
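
The abstract does not define CAMP's Change Metric, so the sketch below only illustrates the general policy with a hypothetical metric: the fraction of triples added to or removed from the source since the last refresh, compared against an arbitrary threshold.

```python
# Hedged sketch only: hypothetical change metric and refresh rule,
# not the CAMP metric defined in the paper.
def change_metric(cached_triples: set, current_triples: set) -> float:
    """Fraction of the cached content affected by additions and removals."""
    added = len(current_triples - cached_triples)
    removed = len(cached_triples - current_triples)
    return (added + removed) / max(len(cached_triples), 1)

def should_refresh(cached_triples, current_triples, threshold=0.05):
    """Refresh the cache only when the dataset has changed enough."""
    return change_metric(cached_triples, current_triples) >= threshold

cache = {("s1", "p", "o1"), ("s2", "p", "o2")}
live = {("s1", "p", "o1"), ("s3", "p", "o3")}
print(should_refresh(cache, live))   # True: half of the cached content is stale
```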


1984 ◽  
Vol 15 (3) ◽  
pp. 155-161
Author(s):  
C. Firer

In this article the concept of never-buyers of consumer non-durables is discussed. The traditional Negative Binomial Distribution approach of Ehrenberg to the question is presented. Previously unpublished work carried out at the Graduate School of Business Administration, University of the Witwatersrand, is reviewed, and hypotheses are put forward that the observed large zero cell in the purchase frequency distributions may be caused by the existence of a group of never-buyers of the product, or by the superimposition of at least two distinct buying populations, previously identified as brand-loyal and multibrand/brand-switching households. The results of the research aimed at testing the first hypothesis are presented here. Two carefully monitored data sets were modelled using zero-augmented Negative Binomial and Sichel distributions. The data had previously been shown to exhibit the necessary stationarity in mean household purchase/consumption. Individual brands in one data set (purchases of toilet soap) were shown to follow the predictions of the traditional theory, with the proportion of non-buyers decreasing with time. In the second data set (consumption of packaged soup), the proportion of non-consumers of the brands fell towards zero as the length of the time period studied was increased, but at a rate faster than that predicted by the theory. The hypothesis of the existence of never-buyers/users of individual brands in these two product classes was therefore rejected.
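
A minimal sketch of fitting the first of the two models mentioned, a zero-augmented (zero-inflated) negative binomial, to purchase-frequency counts; the simulated counts are illustrative, not the panel data analysed in the article, and the Sichel fit is omitted.

```python
# Minimal sketch: maximum likelihood fit of a zero-augmented negative binomial.
import numpy as np
from scipy.stats import nbinom
from scipy.optimize import minimize

def neg_loglik(params, counts):
    logit_pi, log_r, logit_p = params
    pi = 1 / (1 + np.exp(-logit_pi))       # extra-zero (never-buyer) weight
    r = np.exp(log_r)                       # NB size parameter
    p = 1 / (1 + np.exp(-logit_p))          # NB success probability
    pmf = nbinom.pmf(counts, r, p)
    lik = np.where(counts == 0, pi + (1 - pi) * pmf, (1 - pi) * pmf)
    return -np.sum(np.log(np.clip(lik, 1e-300, None)))

rng = np.random.default_rng(3)
counts = np.where(rng.random(2000) < 0.3, 0,
                  rng.negative_binomial(1.2, 0.4, 2000))   # illustrative data
fit = minimize(neg_loglik, x0=[0.0, 0.0, 0.0], args=(counts,), method="Nelder-Mead")
print("estimated never-buyer proportion:", 1 / (1 + np.exp(-fit.x[0])))
```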


Parasitology ◽  
1998 ◽  
Vol 117 (6) ◽  
pp. 597-610 ◽  
Author(s):  
D. J. SHAW ◽  
B. T. GRENFELL ◽  
A. P. DOBSON

Frequency distributions from 49 published wildlife host–macroparasite systems were analysed by maximum likelihood for goodness of fit to the negative binomial distribution. In 45 of the 49 (92%) data-sets, the negative binomial distribution provided a statistically satisfactory fit. In the other 4 data-sets the negative binomial distribution still provided a better fit than the Poisson distribution, and only 1 of the data-sets fitted the Poisson distribution. The degree of aggregation was large, with 43 of the 49 data-sets having an estimated k of less than 1. From these 49 data-sets, 22 subsets of host data were available (i.e. the host data could be divided by host sex, age, or where or when hosts were sampled). In 11 of these 22 subsets there was significant variation in the degree of aggregation between host subsets of the same host–parasite system. A common k estimate was always larger than that obtained with all the host data considered together. These results indicate that lumping host data can hide important variations in aggregation between hosts and can exaggerate the true degree of aggregation. Wherever possible, common k estimates should be used to estimate the degree of aggregation. In addition, significant differences in the degree of aggregation between subgroups of host data were generally associated with significant differences in both mean parasite burdens and the prevalence of infection.
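
The fitting step described can be sketched as follows: maximum likelihood estimation of the negative binomial mean and aggregation parameter k from a vector of per-host parasite counts (the counts below are simulated, not one of the 49 published data-sets).

```python
# Minimal sketch: ML fit of the negative binomial (mu, k) to per-host counts.
import numpy as np
from scipy.stats import nbinom
from scipy.optimize import minimize

def neg_loglik(params, counts):
    log_k, log_mu = params
    k, mu = np.exp(log_k), np.exp(log_mu)
    p = k / (k + mu)                       # convert (k, mu) to scipy's (n, p)
    return -np.sum(nbinom.logpmf(counts, k, p))

rng = np.random.default_rng(0)
true_k, true_mu = 0.5, 8.0
counts = rng.negative_binomial(true_k, true_k / (true_k + true_mu), size=300)
fit = minimize(neg_loglik, x0=[0.0, 1.0], args=(counts,), method="Nelder-Mead")
k_hat, mu_hat = np.exp(fit.x)
print(f"k = {k_hat:.2f}, mean burden = {mu_hat:.2f}")   # k < 1 indicates strong aggregation
```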


Parasitology ◽  
1986 ◽  
Vol 92 (1) ◽  
pp. 227-243 ◽  
Author(s):  
E. L. Adjei ◽  
A. Barnes ◽  
R. J. G. Lester

The frequency distribution of parasites in hosts commonly follows a negative binomial or similar distribution. Under certain conditions the magnitude of parasite-associated host mortality can be estimated by comparing the tail of the observed distribution with that of the distribution predicted from the first few points of the data. For the technique to work, the following assumptions need to be met: mortality in lightly infected fish must be rare; infection and consequent mortality occur only in fish younger than those sampled; and the frequency distribution of the parasite at the time of infection should conform to a known probability distribution. The method was applied to frequency distributions of blastocysts of Callitetrarhynchus gracilis in 898 Saurida tumbil (Bloch) and 5013 S. undosquamis (Richardson). Parasite-associated mortality in S. tumbil was calculated to be at least 11% in males and 2% in females. For S. undosquamis, estimated mortality was about 5% in males and 3% in females. The numbers of parasites estimated to produce a 0.5 probability of death, the parasitological equivalent of an LD50, were 3.4 and 5.7 for S. tumbil males and females, and 18 and 3 for S. undosquamis males and females, respectively.
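
A hedged sketch of the tail-comparison idea, not the authors' exact procedure: a negative binomial is fitted to the lightly infected classes (assumed unaffected by mortality), the expected numbers of heavily infected hosts are extrapolated, and the shortfall in the observed tail is read as parasite-associated mortality. The frequencies and the three-class cut-off below are illustrative assumptions.

```python
# Hedged sketch: extrapolate a tail from the light-infection classes and
# read the observed shortfall as parasite-associated mortality.
import numpy as np
from scipy.stats import nbinom
from scipy.optimize import minimize

observed = np.array([520, 180, 90, 45, 20, 8, 2, 1])   # hosts with 0, 1, 2, ... parasites
light = slice(0, 3)                                     # classes assumed free of mortality

def neg_loglik(params):
    log_k, log_mu = params
    k, mu = np.exp(log_k), np.exp(log_mu)
    p = nbinom.pmf(np.arange(3), k, k / (k + mu))
    # multinomial-style fit restricted to the light-infection classes
    return -np.sum(observed[light] * np.log(np.clip(p / p.sum(), 1e-300, None)))

fit = minimize(neg_loglik, x0=[0.0, 1.0], method="Nelder-Mead")
k, mu = np.exp(fit.x)
prob = k / (k + mu)
expected = (observed[light].sum() / nbinom.cdf(2, k, prob)
            * nbinom.pmf(np.arange(len(observed)), k, prob))
deficit = np.clip(expected[3:] - observed[3:], 0, None).sum()
print("estimated hosts lost to parasite-associated mortality:", round(deficit))
```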


2016 ◽  
Vol 14 (06) ◽  
pp. 1650034 ◽  
Author(s):  
Naim Al Mahi ◽  
Munni Begum

One of the primary objectives of a ribonucleic acid (RNA) sequencing or RNA-Seq experiment is to identify differentially expressed (DE) genes in two or more treatment conditions. It is common practice to assume that all read counts from RNA-Seq data follow an overdispersed (OD) Poisson or negative binomial (NB) distribution, which is sometimes misleading because, within each condition, some genes may have unvarying transcription levels with no overdispersion. In such a case, it is more appropriate and logical to consider two sets of genes: OD and non-overdispersed (NOD). We propose a new two-step integrated approach to distinguish DE genes in RNA-Seq data using standard Poisson and NB models for NOD and OD genes, respectively. This is an integrated approach because the method can be merged with any other NB-based method for detecting DE genes. We design a simulation study and analyze two real RNA-Seq data sets to evaluate the proposed strategy. We compare the performance of this new method, combined with the three R software packages edgeR, DESeq2, and DSS, with their default settings. For both the simulated and real data sets, the integrated approaches perform better than, or at least as well as, the regular methods embedded in these R packages.
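
A hedged sketch of the two-step idea, not the authors' implementation: each gene is first flagged as overdispersed or not (a crude index-of-dispersion test stands in for whatever criterion the paper uses), and the group effect is then tested with an NB GLM for OD genes and a plain Poisson GLM for NOD genes. Counts and group labels are illustrative.

```python
# Hedged sketch: choose Poisson vs NB per gene, then test the group effect.
import numpy as np
import statsmodels.api as sm
from scipy.stats import chi2

def is_overdispersed(counts, alpha=0.05):
    """Crude index-of-dispersion test of the Poisson assumption (variance == mean)."""
    mean = counts.mean()
    stat = np.sum((counts - mean) ** 2) / max(mean, 1e-9)
    return chi2.sf(stat, df=len(counts) - 1) < alpha

def de_pvalue(counts, group):
    """Wald p-value for the group effect under Poisson or NB, as appropriate."""
    X = sm.add_constant(group.astype(float))
    family = (sm.families.NegativeBinomial() if is_overdispersed(counts)
              else sm.families.Poisson())
    res = sm.GLM(counts, X, family=family).fit()
    return res.pvalues[1]

rng = np.random.default_rng(7)
group = np.repeat([0, 1], 5)                 # two conditions, 5 samples each
counts = rng.negative_binomial(2, 0.1, 10)   # one gene's simulated counts
print(de_pvalue(counts, group))
```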


2013 ◽  
Vol 2013 ◽  
pp. 1-8 ◽  
Author(s):  
Panagiotis T. Nastos ◽  
Andreas Matzarakis

The objective of this work is the assessment of human thermal bioclimatic conditions on the Athens University Campus (AUC), which comprises the Faculties and their respective Departments of the largest state institution of higher learning in Greece and one of the largest universities in Europe. The analysis of the bioclimate was carried out using the physiologically equivalent temperature (PET), which is based on the energy balance model of the human body. The meteorological data required for the calculation of PET comprise hourly values of air temperature, relative humidity, wind speed and total solar radiation for the period 1999–2007. The recorded data sets were obtained from the meteorological station of the Laboratory of Climatology and Atmospheric Environment of the University of Athens. The results revealed the hours of the day in which thermal comfort or stress prevails, as well as the trends and variability of PET over the studied period. Finally, the intense heat waves that occurred during summer 2007, along with the extreme cold conditions during December 2003–February 2004, were analyzed in terms of PET classes and compared with the respective average bioclimatic conditions of the study period.

