Debiasing Covid-19 prevalence estimates

Author(s): Sotiris Georganas, Alina Velias, Sotiris Vandoros

Abstract: Timely, accurate epidemic figures are necessary for informed policy. In the Covid-19 pandemic, mismeasurement can lead to tremendous waste, in health or in economic output. “Random” testing is commonly used to estimate virus prevalence, reporting daily positivity rates. However, since testing is necessarily voluntary, all “random” tests done in the field suffer from selection bias. This bias, unlike standard polling biases, goes beyond demographic representativeness and cannot be corrected by oversampling (i.e. selecting people without symptoms to test). Using controlled, incentivized experiments on a sample of all ages, we show that people who feel symptoms are up to 33 times more likely to seek testing. The bias in testing propensities leads to sizable prevalence bias: test positivity is inflated by up to five times, even if testing is costless. This effect varies greatly across time and age groups, making comparisons over time and across countries misleading. We validate our results using the REACT study in the UK and find that positivity figures indeed carry a very large and time-varying bias. We present calculations to debias positivity rates but, more importantly, suggest a parsimonious way to sample the population that bypasses the bias altogether. Our estimate is both available in real time and consistently close to true values. These results are relevant for all epidemics, not just Covid-19, whenever carriers have informative beliefs about their own status.
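As a rough illustration of the debiasing logic (our own sketch, not the paper's published calculation), suppose test counts are observed separately for people with and without symptoms, and the propensity ratio k (how many times more likely symptomatic people are to seek testing) has been measured experimentally. Down-weighting symptomatic tests by 1/k then recovers an approximately unbiased prevalence estimate:

```python
def debias_positivity(pos_symp, neg_symp, pos_asymp, neg_asymp, k):
    """Inverse-propensity-weighted prevalence estimate.

    pos_symp / neg_symp: positive and negative tests among people
    reporting symptoms; pos_asymp / neg_asymp: the same for people
    without symptoms. k: testing-propensity ratio (the paper measures
    values of up to 33).
    Down-weighting symptomatic tests by 1/k makes each tested person
    stand in for the untested people with the same symptom status.
    """
    w_symp = 1.0 / k
    weighted_pos = w_symp * pos_symp + pos_asymp
    weighted_total = w_symp * (pos_symp + neg_symp) + (pos_asymp + neg_asymp)
    return weighted_pos / weighted_total

# Illustrative numbers: raw positivity is 98/600 = 16.3%, while the
# debiased estimate is about 2.6%, a roughly six-fold inflation.
raw = (90 + 8) / 600
debiased = debias_positivity(90, 110, 8, 392, k=33)
```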

2021
Author(s): James A Ackland, Graeme J Ackland, David J Wallace

Objective: To track the statistical case fatality rate (CFR) in the second wave of the UK coronavirus outbreak, and to understand its variations over time. Design: Publicly available UK government data and clinical evidence on the time between first positive PCR test and death are used to determine the relationships between reported cases and deaths, according to age group and across regions in England. Main Outcome Measures: Estimates of case fatality rates and their variations over time. Results: Throughout October and November 2020, deaths in England can be broadly understood in terms of CFRs that are approximately constant over time. The same CFRs prove a poor predictor of deaths when applied back to September, when prevalence of the virus was comparatively low, suggesting that the potential effect of false positive tests needs to be taken into account. Similarly, increasing CFRs are needed to match cases to deaths when projecting the model forwards into December. The growth of the S gene dropout variant of concern (VOC) in December occurs too late to explain this increase in CFR alone, but at 33% increased mortality it can explain the peak in deaths in January. On our analysis, if other factors were responsible for the higher CFRs in December and January, 33% would be an upper bound for the increased mortality of the VOC. From the second half of January, the CFRs for older age groups show a marked decline. Since the fraction of the VOC has not decreased, this decline is likely the result of the rollout of vaccination. However, because raw case numbers are falling rapidly in this period (likely due to a combination of vaccination and lockdown), any imprecision in the time-to-death distribution is greatly exacerbated, rendering estimates of the vaccination effect imprecise. Conclusions: The relationship between cases and deaths, even when controlling for age, is not static through the second wave of coronavirus in England. An apparently anomalous low case fatality ratio in September can be accounted for by a modest 0.4% false-positive fraction. The large jump in CFR in December can be understood in terms of a more deadly new variant, B.1.1.7, while a decline in January correlates with vaccine roll-out, suggesting that vaccines reduce the severity of infection as well as the risk of infection.
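A minimal sketch of the cases-to-deaths relationship described (our own simplification using a single constant CFR; the paper fits age-group and period-specific values): expected deaths are the CFR-scaled convolution of past reported cases with the distribution of time from first positive test to death.

```python
import numpy as np

def expected_deaths(daily_cases, cfr, time_to_death_pmf):
    """Project daily deaths from daily reported cases.

    daily_cases: reported positive tests per day.
    cfr: assumed case fatality rate.
    time_to_death_pmf: probability that a fatal case dies d days after
    the first positive PCR test, for d = 0, 1, 2, ... (sums to 1).
    """
    cases = np.asarray(daily_cases, dtype=float)
    pmf = np.asarray(time_to_death_pmf, dtype=float)
    # deaths[t] = cfr * sum over d of cases[t - d] * pmf[d]
    return (cfr * np.convolve(cases, pmf))[:len(cases)]
```

The September anomaly can be probed in the same framework by first subtracting an assumed false-positive fraction of tests performed from the reported case counts before applying the CFR.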


Heart, 2018, Vol 105 (1), pp. 27-33
Author(s): Nicola Jaime Adderley, Ronan Ryan, Krishnarajah Nirantharakumar, Tom Marshall

Objective: Atrial fibrillation (AF) is the most common cardiac arrhythmia and an important risk factor for stroke. Treatment with anticoagulants substantially reduces risk of stroke. Current prevalence and treatment rates of AF in the UK as well as changes in recent years are not known. The aim of this analysis was to determine trends in age–sex specific prevalence and treatment of AF in the UK from 2000 to 2016. Methods: 17 sequential cross-sectional analyses were carried out between 2000 and 2016 using a large database of electronic primary care records of patients registered with UK general practitioners. These determined the prevalence of patients diagnosed with AF, the stroke risk of those with AF and the proportion of AF patients currently receiving anticoagulants. Stroke risk was assessed using the CHA2DS2-VASc score. Results: Age–sex standardised AF prevalence increased from 2.14% (95% CI 2.11% to 2.17%) in 2000 to 3.29% (95% CI 3.27% to 3.32%) in 2016. Between 2000 and 2016, the proportion of patients with AF prescribed anticoagulants increased from 35.4% (95% CI 34.7% to 36.1%) to 75.5% (95% CI 75.1% to 75.8%) in those with high stroke risk (p for change over time <0.001) and from 32.8% (95% CI 30.5% to 35.2%) to 47.1% (95% CI 45.4% to 48.7%) in those with moderate stroke risk (p<0.001). In patients with low risk of stroke, the proportion decreased from 19.9% (95% CI 17.8% to 22.2%) to 9.7% (95% CI 8.4% to 11.1%) (p<0.001). Anticoagulant prescribing performance varied between practices; in 2016, the proportion of eligible patients treated was 82.9% (95% CI 82.2% to 83.7%) and 62.0% (95% CI 61.0% to 63.0%) in the highest-performing and lowest-performing practice quintiles, respectively. There was poor agreement in individual practice performance over time from 2006 to 2016: linear-weighted κ=0.10 (95% CI 0.02 to 0.19). Conclusions: From 2000 to 2016, the prevalence of recorded AF has increased in all age groups and both sexes. Anticoagulant treatment of eligible patients with AF has more than doubled, with marked improvements since 2011, alongside a reduction in the use of anticoagulants in ineligible patients with AF.
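For readers unfamiliar with the method, a minimal sketch of direct age–sex standardisation (the strata and weights below are illustrative, not the study's): each stratum's crude prevalence is weighted by that stratum's share of a fixed standard population, so estimates remain comparable across years even as the registered population's age–sex mix shifts.

```python
def standardised_prevalence(stratum_prevalence, standard_weights):
    """Directly standardised prevalence.

    stratum_prevalence: crude prevalence in each age-sex stratum.
    standard_weights: each stratum's share of a fixed standard
    population (must sum to 1).
    """
    return sum(p * w for p, w in zip(stratum_prevalence, standard_weights))

# Two illustrative age strata (weights 0.8 young, 0.2 old): crude
# prevalences of 0.5% and 9% give a standardised rate of 2.2%.
rate = standardised_prevalence([0.005, 0.09], [0.8, 0.2])
```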


Author(s): Louise E. Smith, Henry W. W. Potts, Richard Amlot, Nicola T. Fear, Susan Michie, ...

Objectives: To investigate rates of adherence to the UK's test, trace and isolate system over time. Design: Time series of cross-sectional online surveys. Setting: Data were collected between 2 March and 5 August 2020. Participants: 42,127 responses from 31,787 people living in the UK, aged 16 years or over, are presented (21 survey waves, n≈2,000 per wave). Main outcome measures: Identification of the key symptoms of COVID-19 (cough, high temperature/fever, and loss of sense of smell or taste); self-reported adherence to self-isolation if symptomatic; requesting an antigen test if symptomatic; intention to share details of close contacts; and self-reported adherence to quarantine if alerted to having been in contact with a confirmed COVID-19 case. Results: Only 48.9% of participants (95% CI 48.2% to 49.7%) identified the key symptoms of COVID-19. Self-reported adherence to test, trace and isolate behaviours was low (self-isolation 18.2%, 95% CI 16.4% to 19.9%; requesting an antigen test 11.9%, 95% CI 10.1% to 13.8%; intention to share details of close contacts 76.1%, 95% CI 75.4% to 76.8%; quarantining 10.9%, 95% CI 7.8% to 13.9%) and largely stable over time. By contrast, intention to adhere to protective measures was much higher. Non-adherence was associated with being male, younger age, having a dependent child in the household, lower socio-economic grade, greater hardship during the pandemic, and working in a key sector. Conclusions: Practical support and financial reimbursement are likely to improve adherence. Targeting messaging and policies at men, younger age groups, and key workers may also be necessary.


Author(s): José Novoa, Jorge Wuth, Juan Pablo Escudero, Josué Fredes, Rodrigo Mahu, ...

Author(s): Christopher Hood, Rozana Himaz

This chapter draws on historical statistics reporting financial outcomes for spending, taxation, debt, and deficit for the UK over a century to (a) identify quantitatively and compare the main fiscal squeeze episodes (i.e. major revenue increases, spending cuts, or both) in terms of type (soft squeezes and hard squeezes, spending squeezes and revenue squeezes), depth, and length; (b) compare these periods of austerity against measures of fiscal consolidation in terms of deficit reduction; and (c) identify economic and financial conditions before and after the various squeezes. It explores the extent to which the identification and classification of squeeze episodes is sensitive to the thresholds set and the data sources used. The chapter identifies major changes over time in the depth and types of squeeze that emerge from this analysis.
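As a toy illustration of threshold-based episode identification (the rule and threshold below are our own, not the authors' definitions), a year can be flagged by comparing real changes in spending and revenue against a cutoff, making the sensitivity to that cutoff explicit:

```python
def classify_year(d_spending, d_revenue, threshold=1.0):
    """Toy squeeze classifier.

    d_spending / d_revenue: year-on-year change in real spending and
    revenue (e.g. as % of GDP). threshold: the cutoff whose choice the
    chapter shows the classification is sensitive to.
    """
    spending_squeeze = d_spending < -threshold
    revenue_squeeze = d_revenue > threshold
    if spending_squeeze and revenue_squeeze:
        return "spending and revenue squeeze"
    if spending_squeeze:
        return "spending squeeze"
    if revenue_squeeze:
        return "revenue squeeze"
    return "no squeeze"
```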


2018, Vol 5 (3), pp. 1322-1334
Author(s): Philip E. Pare, Carolyn L. Beck, Angelia Nedic

2021, pp. 1-60
Author(s): J.L. Buttriss, S.A. Lanham-New, S. Steenson, L. Levy, G.E. Swan, ...

Abstract: A multi-disciplinary expert group met to discuss vitamin D deficiency in the UK, and strategies for improving population intakes and status. Changes to UK Government advice since the 1st Rank Forum on Vitamin D (2009) were discussed, including the rationale for setting an RNI (10 µg/day; 400 IU/day) for adults and children (4+ years). Current UK data show inadequate intakes among all age groups and a high prevalence of low vitamin D status among specific groups (e.g. pregnant women and adolescents). Evidence of widespread deficiency within some minority ethnic groups, resulting in nutritional rickets (particularly among Black and South Asian infants), raised particular concern. It is too early to establish whether population vitamin D status has altered since Government recommendations changed in 2016. Vitamin D food fortification was discussed as a potential strategy to increase population intakes. Data from dose-response and dietary modelling studies indicate dairy products, bread, hens' eggs and some meats as potential fortification vehicles. Vitamin D3 appears more effective than vitamin D2 for raising serum 25-hydroxyvitamin D concentration, which has implications for the choice of fortificant. Other considerations for successful fortification strategies include: i) the need for 'real-world' cost information for use in modelling work; ii) supportive food legislation; iii) improved consumer and health professional understanding of vitamin D's importance; iv) awareness of the clinical consequences of inadequate vitamin D status; and v) consistent communication of Government advice across health/social care professions and via the food industry. These areas urgently require further research to enable universal improvement in vitamin D intakes and status in the UK population.


2021, Vol 9 (3), pp. 311
Author(s): Ben R. Evans, Iris Möller, Tom Spencer

Salt marshes are important coastal environments and provide multiple benefits to society. They are considered to be declining in extent globally, including on the UK east coast. The dynamics and characteristics of the interior parts of salt marsh systems are spatially variable and can fundamentally affect biotic distributions and the way in which the landscape delivers ecosystem services. It is therefore important to understand, and be able to predict, how these landscape configurations may evolve over time and where the greatest dynamism will occur. This study estimates morphodynamic changes in salt marsh areas for a regional domain over a multi-decadal timescale. We demonstrate at a landscape scale that relationships exist between the topology and morphology of a salt marsh and changes in its condition over time. We present an inherently scalable, satellite-derived measure of change in marsh platform integrity that allows the monitoring of changes in marsh condition, and we describe sub-pixel-scale marsh morphometry by applying a morphological segmentation algorithm to 25 cm resolution maps of the vegetated marsh surface. We then demonstrate that easily derived geospatial and morphometric parameters can be used to determine the probability of marsh degradation. Comparing our findings with previous work on the east coast of the USA, we find that marsh responses differ between the two regions according to a marsh's position within the wider coastal system, but are relatively consistent with respect to within-marsh position. We also find strong relationships between morphometric indices and change in marsh platform integrity, which allow the inference of past dynamism and suggest that current morphology may be predictive of future change. We thus provide insight into the factors governing marsh degradation that will assist in anticipating adverse changes to the attributes and functions of these critical coastal environments and inform ongoing ecogeomorphic modelling developments.
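As an illustration of what a raster-based platform-integrity measure can look like (a toy index of our own, not the study's metric), assume a binary vegetation mask classified from satellite imagery; the index rewards vegetated cover and penalises the boundary density that rises as a platform fragments:

```python
import numpy as np

def platform_integrity(veg_mask):
    """Toy marsh-platform integrity index.

    veg_mask: 2-D boolean raster, True = vegetated marsh platform,
    False = channel, pond or bare mud.
    Returns the vegetated fraction discounted by 4-neighbour edge
    density, so a fragmenting platform scores lower than a compact
    one of equal area.
    """
    veg = np.asarray(veg_mask, dtype=bool)
    veg_fraction = veg.mean()
    # boundaries between vegetated and unvegetated cells
    horizontal = np.count_nonzero(veg[:, 1:] != veg[:, :-1])
    vertical = np.count_nonzero(veg[1:, :] != veg[:-1, :])
    edge_density = (horizontal + vertical) / veg.size
    return veg_fraction * (1.0 - edge_density)
```

Tracking such an index across image dates would give the kind of change-in-integrity signal that the geospatial and morphometric parameters are then used to predict.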

