Timing of Breakthrough Infection Risk After Vaccination Against SARS-CoV-2

Author(s):  
Ashleigh Tuite ◽  
Nelson Lee ◽  
David Fisman

Background: Provision of safe and effective vaccines has been a remarkable public health achievement during the SARS-CoV-2 pandemic. The effectiveness and durability of protection of the first two doses of SARS-CoV-2 vaccines are important areas for study, as are questions related to optimal dose combinations and dosing intervals.

Methods: We performed a case-cohort study to generate real-world evidence on the efficacy of first and second doses of SARS-CoV-2 vaccines, using a population-based case line list and vaccination database for the province of Ontario, Canada, between December 2020 and October 2021. Risk of infection after vaccination was evaluated in all laboratory-confirmed vaccinated SARS-CoV-2 cases and a 2% sample of vaccinated controls, using survival analytic methods, including construction of Cox proportional hazards models. Vaccination status was treated as a time-varying covariate.

Results: First and second doses of SARS-CoV-2 vaccine markedly reduced the risk of infection (first-dose efficacy 68%, 95% CI 67% to 69%; second-dose efficacy 88%, 95% CI 87% to 88%). In multivariable models, extended dosing intervals were associated with the lowest risk of breakthrough infection (HR for redosing 0.64, 95% CI 0.61 to 0.67, at 6-8 weeks). Heterologous vaccine schedules that mixed viral vector vaccine first doses with mRNA second doses were significantly more effective than mRNA-only schedules. Risk of infection largely vanished during the period 4-6 months after the second vaccine dose but rose markedly thereafter.

Interpretation: A case-cohort design provided an efficient means to identify the strong protective effects associated with SARS-CoV-2 vaccination, particularly after the second dose of vaccine. However, this effect appeared to wane once more than 6 months had elapsed since vaccination. Heterologous vaccination and extended dosing intervals improved the durability of the immune response.
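The efficacy figures above are the standard transformation of fitted hazard ratios, VE = (1 − HR) × 100. A minimal sketch of that conversion follows; the HR of 0.12 and its CI are illustrative back-calculations consistent with the reported ~88% second-dose efficacy, not values taken from the paper:

```python
def vaccine_efficacy(hr, ci_low, ci_high):
    """Vaccine efficacy (%) from a hazard ratio: VE = (1 - HR) * 100.

    The CI bounds swap because 1 - HR is a decreasing transformation:
    the upper HR bound gives the lower efficacy bound, and vice versa.
    """
    return (1 - hr) * 100, (1 - ci_high) * 100, (1 - ci_low) * 100

# Illustrative second-dose HR of 0.12 (CI 0.12-0.13), consistent with
# the ~88% efficacy reported above.
ve, ve_lo, ve_hi = vaccine_efficacy(0.12, 0.12, 0.13)
```

Note that reversing the CI bounds is easy to get wrong when reporting efficacy intervals by hand, which is why the helper does it explicitly.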

2021 ◽  
Author(s):  
Madhumita Shrotri ◽  
Maria Krutikov ◽  
Tom Palmer ◽  
Rebecca Giddings ◽  
Borscha Azmi ◽  
...  

Background: The effectiveness of SARS-CoV-2 vaccines in frail older adults living in Long-Term Care Facilities (LTCFs) is uncertain. We estimated the protective effects of the first dose of the ChAdOx1 and BNT162b2 vaccines against infection in this population.

Methods: Cohort study comparing vaccinated and unvaccinated LTCF residents in England undergoing routine asymptomatic testing (8 December 2020 - 15 March 2021). We estimated the relative hazard of PCR-positive infection using Cox proportional hazards regression, adjusting for age, sex, prior infection, local SARS-CoV-2 incidence, LTCF bed capacity, and clustering by LTCF.

Results: Of 10,412 residents (median age 86 years) from 310 LTCFs, 9,160 were vaccinated with either ChAdOx1 (6,138; 67%) or BNT162b2 (3,022; 33%). A total of 670,628 person-days and 1,335 PCR-positive infections were included. Adjusted hazard ratios (aHRs) for PCR-positive infection relative to unvaccinated residents declined from 28 days after the first vaccine dose, reaching 0.44 (0.24, 0.81) at 28-34 days and 0.38 (0.19, 0.77) at 35-48 days. Similar effect sizes were seen for the ChAdOx1 (aHR 0.32 [0.15, 0.66]) and BNT162b2 (aHR 0.35 [0.17, 0.71]) vaccines at 35-48 days. Mean PCR cycle threshold values were higher, implying lower infectivity, for infections 28 or more days post-vaccination compared with those prior to vaccination (31.3 vs 26.6, p<0.001).

Interpretation: The first dose of the BNT162b2 and ChAdOx1 vaccines was associated with substantially reduced SARS-CoV-2 infection risk in LTCF residents from 4 weeks to at least 7 weeks after vaccination.

Funding: UK Government Department of Health and Social Care.
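The person-time figures above imply a crude infection rate that can be checked directly. A minimal sketch, using the reported counts; the per-100,000 person-day scaling is a conventional choice, not one stated in the abstract:

```python
def rate_per_100k(events, person_days):
    """Crude incidence rate per 100,000 person-days of follow-up."""
    return events / person_days * 100_000

# Figures reported above: 1,335 PCR-positive infections
# over 670,628 person-days of follow-up.
crude_rate = rate_per_100k(1335, 670_628)  # roughly 199 per 100,000 person-days
```

This crude rate pools vaccinated and unvaccinated time; the adjusted hazard ratios in the abstract come from the Cox model, not from this arithmetic.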


2021 ◽  
Author(s):  
Diane Uschner ◽  
Matthew Bott ◽  
Michele Santacatterina ◽  
Mihili P Gunaratne ◽  
Lida Fette ◽  
...  

Importance: Real-world data are needed to assess the incidence of, and factors associated with, breakthrough SARS-CoV-2 infections following vaccination.

Objective: Estimate the incidence of breakthrough infections and assess associations with risk factors using self-reported data from a large North Carolina population sample.

Design: Prospective observational cohort study using daily online survey data to capture information about COVID-19 symptoms, testing, and vaccination status.

Setting: Six health care systems in North Carolina, with data collected between January 15, 2021 and September 24, 2021.

Participants: Adult study participants who reported full vaccination with a COVID-19 mRNA or J&J non-replicating viral vector vaccine (n = 16,020).

Exposures: Potential community exposure to SARS-CoV-2.

Main Outcomes and Measures: Self-reported breakthrough infection.

Results: SARS-CoV-2 infection after vaccination was self-reported by 1.9% of participants, with an incidence rate of 7.3 per 100,000 person-years. Older age was associated with lower risk of breakthrough infection (45-64 vs. 18-44: HR (95% CI) = 0.65 (0.51-0.82); 65+ vs. 18-44: HR (95% CI) = 0.59 (0.39-0.90)), and vaccination with J&J Ad26.COV2.S was associated with higher risk compared with Pfizer BNT162b2 (Ad26.COV2.S vs. BNT162b2: HR (95% CI) = 2.23 (1.40-3.56)). Participants vaccinated with mRNA-1273 (mRNA-1273 vs. BNT162b2: HR (95% CI) = 0.69 (0.50-0.96)) and those residing in urban counties experienced a lower rate of SARS-CoV-2 breakthrough infection than those from suburban (HR (95% CI) = 1.39 (1.01-1.90)) or rural (HR (95% CI) = 1.57 (1.16-2.11)) counties. There was no significant association between breakthrough infection and participant sex, race, healthcare worker status, prior COVID-19 infection, routine mask use, or overall vaccination rate in the county of residence.

Conclusions and Relevance: In this North Carolina community-based observational study, the incidence of self-reported breakthrough SARS-CoV-2 infection was 7.3 events per 100,000 person-years. Younger adults, those vaccinated with J&J Ad26.COV2.S, and those residing in suburban or rural counties were at higher risk of breakthrough infection and should be targeted for additional risk mitigation strategies to decrease community transmission.


2021 ◽  
Vol 15 (7) ◽  
pp. e0009635
Author(s):  
Selma Regina Penha Silva Cerqueira ◽  
Patrícia Duarte Deps ◽  
Débora Vilela Cunha ◽  
Natanael Victor Furtunato Bezerra ◽  
Daniel Holanda Barroso ◽  
...  

Background: Protective effects of Bacillus Calmette–Guérin (BCG) vaccination and of clofazimine and dapsone treatment against severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) infection have been reported. Patients at risk for leprosy represent an interesting model for assessing the effects of these therapies on the occurrence and severity of coronavirus disease 2019 (COVID-19). We assessed the influence of leprosy-related variables on the occurrence and severity of COVID-19.

Methodology/Principal findings: We performed a 14-month prospective real-world cohort study in which the main risk factor was two previous vaccinations with BCG and the main outcome was COVID-19 detection by reverse transcription polymerase chain reaction (RT-PCR). A Cox proportional hazards model was used. Among the 406 included patients, 113 were diagnosed with leprosy. During follow-up, 69 (16.99%) patients contracted COVID-19. Survival analysis showed that leprosy was associated with COVID-19 (p<0.001), but multivariate analysis showed that only COVID-19-positive household contacts (hazard ratio (HR) = 8.04; 95% CI = 4.93–13.11) and diabetes mellitus (HR = 2.06; 95% CI = 1.04–4.06) were significant risk factors for COVID-19.

Conclusions/Significance: Leprosy patients are vulnerable to COVID-19 because they have more frequent contact with SARS-CoV-2-infected patients, possibly due to social and economic limitations. Our model showed that the use of corticosteroids, thalidomide, pentoxifylline, clofazimine, or dapsone, or BCG vaccination, did not affect the occurrence or severity of COVID-19.


Author(s):  
Natalia S Gavrilova ◽  
Leonid A Gavrilov

Abstract: It is known that biological relatives of long-lived individuals demonstrate lower mortality and longer lifespans than relatives of shorter-lived individuals, and at least part of this advantage is likely to be genetic. Less information, however, is available about the effects of familial longevity on age-specific mortality trajectories. We compared mortality patterns after age 50 years for 10,045 siblings of U.S. centenarians and 12,308 siblings of shorter-lived individuals (who died at age 65 years). Similar comparisons were made for sons and daughters of longer-lived parents (both parents lived to 80 years or more) and shorter-lived parents (both parents lived less than 80 years) within each group of siblings. Although relatives of longer-lived individuals have lower mortality at younger ages than relatives of shorter-lived individuals, this mortality advantage practically disappears by age 100 years. To validate this observation further, we analyzed the survival of 3,408 U.S. centenarians born in 1890-97 with known information on maternal and paternal lifespan. Using the Cox proportional hazards model, we found that neither maternal nor paternal longevity (lifespan 80+ years) was significantly associated with survival after age 100 years. The results are compatible with the predictions of the reliability theory of aging, which suggests higher initial levels of system redundancy (reserves) in individuals with a protective familial/genetic background and hence lower initial mortality. The heterogeneity hypothesis is another possible explanation for the observed phenomenon.


2021 ◽  
Vol 21 (1) ◽  
Author(s):  
Adrien Guilloteau ◽  
Michal Abrahamowicz ◽  
Olayide Boussari ◽  
Valérie Jooste ◽  
Thomas Aparicio ◽  
...  

Abstract

Background: As cancer treatments, biotherapies can be as effective as chemotherapy while reducing the risk of secondary effects, so they can be taken over longer periods than conventional chemotherapy. Some trials have therefore aimed to assess the benefit of maintaining biotherapies during chemotherapy-free intervals (CFI). For example, the recent PRODIGE9 trial assessed the effect of maintaining bevacizumab during CFI in metastatic colorectal cancer (mCRC) patients. However, its analysis was hindered by a small difference in treatment exposure between the randomized groups and by a large proportion of early dropouts, leading to a potentially unbalanced distribution of confounding factors among the trial completers. To address these limitations, we re-analyzed the PRODIGE9 data to assess the effects of different exposure metrics on all-cause mortality of patients with mCRC, using methods originally developed for observational studies.

Methods: To account for the actual patterns of drug use by individual patients and for possible cumulative effects, we used five alternative time-varying exposure metrics: (i) cumulative dose, (ii) quantiles of the cumulative dose, (iii) standardized cumulative dose, (iv) Theoretical Blood Concentration (TBC), and (v) Weighted Cumulative Exposure (WCE). The last two metrics account for the timing of drug use. Treatment effects were estimated using adjusted hazard ratios from multivariable Cox proportional hazards models.

Results: After excluding 112 patients who died during the induction period, we analyzed data on 382 patients, among whom 320 (83.8%) died. All time-varying exposures substantially improved the model's fit to the data relative to using only the time-invariant randomization group. All exposures indicated a protective effect of higher cumulative bevacizumab doses. The best-fitting WCE and TBC models accounted for both the cumulative effects and the differing impact of doses taken at different times.

Conclusions: All time-varying analyses, regardless of the exposure metric used, consistently suggested protective effects of higher cumulative bevacizumab doses. However, the results may partly reflect confounding bias. Complementing the main intention-to-treat analysis of maintenance trials with an analysis of the potential cumulative effects of treatment actually taken can provide new insights, but the results must be interpreted with caution because they do not benefit from randomization.

Trial registration: clinicaltrials.gov, NCT00952029. Registered 8 August 2009.
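The two timing-aware exposure metrics named above can be sketched in a few lines. In this sketch the TBC follows a one-compartment exponential-decay model and the WCE weights each past dose by a caller-supplied function of time since dose; the 20-day half-life, dose sizes, and unit weight function are illustrative assumptions, not values estimated in the study:

```python
import math

def tbc(dose_history, t, half_life=20.0):
    """Theoretical blood concentration at day t: each past dose decays
    exponentially with the given half-life (one-compartment model)."""
    k = math.log(2) / half_life  # elimination rate constant
    return sum(d * math.exp(-k * (t - td))
               for td, d in dose_history if td <= t)

def wce(dose_history, t, weight):
    """Weighted cumulative exposure at day t: each past dose is weighted
    by a function of time since that dose."""
    return sum(d * weight(t - td) for td, d in dose_history if td <= t)

# Illustrative history: two 5 mg/kg doses given 14 days apart, evaluated at day 28.
doses = [(0, 5.0), (14, 5.0)]
conc = tbc(doses, 28.0)
# With a constant weight of 1, WCE reduces to the plain cumulative dose.
cumulative = wce(doses, 28.0, lambda u: 1.0)
```

Recomputing such a metric at every event time is what makes the exposure a time-varying covariate in the Cox model.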


Nutrients ◽  
2018 ◽  
Vol 11 (1) ◽  
pp. 31 ◽  
Author(s):  
Jimin Jeon ◽  
Jiyoung Jang ◽  
Kyong Park

The effect of calcium consumption on the prevention of type 2 diabetes mellitus (T2DM) remains controversial and may depend on the dietary source of the calcium. This prospective study aimed to evaluate the association between calcium-rich food consumption and T2DM incidence among Korean adults. We analyzed data from 8574 adults aged 40–69 years, without a history of T2DM, cardiovascular disease, or cancer at baseline, from the Korean Genome and Epidemiology Study. The consumption of calcium-rich foods was assessed using a validated semi-quantitative food frequency questionnaire. T2DM-related data were collected using biennial questionnaires, health examinations, and clinical tests. Hazard ratios (HRs) and 95% confidence intervals (CIs) were calculated using Cox proportional hazards regression models. In the multivariate-adjusted model, yogurt intake was inversely associated with T2DM risk (HR: 0.73; 95% CI: 0.61–0.88 in the fourth quartile compared with the first quartile). However, intakes of other calcium-rich foods, including milk and anchovies, were not significantly associated with T2DM risk. Yogurt may provide protective effects against T2DM in Korean adults, owing to the beneficial effects of probiotics. Further large-scale prospective cohort studies should be conducted to validate these findings.
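The hazard ratios throughout these studies come from maximizing the Cox partial likelihood. A minimal single-covariate sketch on synthetic data follows (Breslow formulation, Newton's method); real analyses use vetted packages that also handle ties, censoring patterns, and multiple adjusted covariates:

```python
import math

def cox_binary_hr(times, events, x, iters=25):
    """Estimate the hazard ratio for one binary covariate by maximising
    the Breslow partial likelihood with Newton's method."""
    n = len(times)
    beta = 0.0
    for _ in range(iters):
        score, info = 0.0, 0.0  # first and (negative) second derivatives
        for i in range(n):
            if not events[i]:
                continue  # censored subjects contribute only to risk sets
            # Risk set: everyone still under observation at this event time.
            risk = [j for j in range(n) if times[j] >= times[i]]
            s0 = sum(math.exp(beta * x[j]) for j in risk)
            s1 = sum(x[j] * math.exp(beta * x[j]) for j in risk)
            s2 = sum(x[j] ** 2 * math.exp(beta * x[j]) for j in risk)
            score += x[i] - s1 / s0
            info += s2 / s0 - (s1 / s0) ** 2
        beta += score / info  # Newton step toward the maximum
    return math.exp(beta)

# Synthetic data: exposed subjects (x=1) tend to fail earlier,
# so the estimated hazard ratio should exceed 1.
hr = cox_binary_hr(times=[1, 2, 3, 4, 5, 6, 7, 8],
                   events=[1] * 8,
                   x=[1, 0, 1, 0, 1, 0, 1, 0])
```

The adjusted HRs in these abstracts are the same quantity with additional covariate columns in the linear predictor.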


Nutrients ◽  
2019 ◽  
Vol 11 (7) ◽  
pp. 1504 ◽  
Author(s):  
Shi ◽  
Lv ◽  
Mao ◽  
Yuan ◽  
Yin ◽  
...  

In vitro and in vivo experimental studies have suggested that garlic has protective effects against the aging process; however, there has been no evidence on whether garlic consumption is associated with all-cause mortality among the oldest-old (≥80 years). From 1998 to 2011, 27,437 oldest-old participants (mean age: 92.9 years) were recruited from 23 provinces in China. The frequency of garlic consumption at baseline and at age 60 was collected. Cox proportional hazards models adjusted for potential covariates were constructed to estimate hazard ratios (HRs) relating garlic consumption to all-cause mortality. Over 92,505 person-years of follow-up from baseline to September 1, 2014, 22,321 participants died. Participants who often (≥5 times/week) or occasionally (1–4 times/week) consumed garlic survived longer than those who rarely (less than once/week) consumed it (p < 0.001). Participants who consumed garlic occasionally or often also had a lower risk of mortality than those who rarely consumed garlic at baseline; the adjusted HRs for mortality were 0.92 (0.89–0.94) and 0.89 (0.85–0.92), respectively. These inverse associations between garlic consumption and all-cause mortality were robust in sensitivity and subgroup analyses. In this study, habitual consumption of garlic was associated with lower all-cause mortality risk; this warrants further investigation of garlic consumption for promoting longevity.


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Eunhae Shin ◽  
Dong Hui Lim ◽  
Tae-Young Chung ◽  
Gyule Han ◽  
Jung Eun Yoo ◽  
...  

Abstract: This study aimed to elucidate the associations between female reproductive factors and pterygium. A total of 1,339,969 postmenopausal women aged 40 years and above in 2009, drawn from a retrospective cohort of Korean National Health Insurance Service data, were included. Cox proportional hazards regression was conducted to assess hazard ratios (HRs) for pterygium according to reproductive factors. Late menarche, early menopause, a short reproductive period, higher parity (≥ 2 children), breastfeeding (≥ 6 months), and no use of hormone replacement therapy (HRT) or oral contraceptives (OC) were significantly associated with the risk of pterygium. In multivariate analysis, the HR for pterygium was 1.764 (95% confidence interval [CI], 1.529–2.035) for menarche at age ≥ 17 years (reference: menarche at age < 12 years). The HR for menopause at age ≥ 55 years was 0.782 (95% CI, 0.724–0.845) (reference: menopause at age < 40 years). The HR for parity ≥ 2 was 1.261 (95% CI, 1.148–1.385) (reference: nulliparity). The HR for breastfeeding ≥ 1 year was 1.663 (95% CI, 1.564–1.768) (reference: no breastfeeding). The HRs for HRT and OC use of any duration were lower than those for the non-user groups (reference). Reproductive factors that increase estrogen exposure appear to have protective effects against pterygium in females.


2016 ◽  
Vol 43 (8) ◽  
pp. 1503-1509 ◽  
Author(s):  
Lisa J. Herrinton ◽  
Liyan Liu ◽  
Robert Goldfien ◽  
M. Alex Michaels ◽  
Trung N. Tran

Objective: To compare serious infection risk for systemic lupus erythematosus (SLE) patients starting glucocorticoids (GC), antimalarials (AM), or their combination.

Methods: We conducted a new-user, historical cohort study at Kaiser Permanente Northern California, 1997–2013. Cox proportional hazards analysis was used to calculate adjusted HRs and 95% CIs.

Results: The study included 3030 patients with SLE followed for an average of 4 years. Compared with patients starting AM without GC (9 infections/1461 patient-yrs), the HR for the risk of infection was 3.9 (95% CI 1.7–9.2) for those starting GC ≤ 15 mg/day without AM (14 infections/252 patient-yrs), while it was 0.0 (0 infections/128 patient-yrs) for those starting the combination. We split the 14 patients with a serious infection and GC ≤ 15 mg/day into 2 groups: < 7.5 and ≥ 7.5 to 15 mg/day. The HR for < 7.5 mg/day was 4.6 (95% CI 1.8–11.4) and for ≥ 7.5 to 15 mg/day, 3.1 (95% CI 1.0–9.7). For patients starting GC > 15 mg/day (reflecting more severe SLE), the risk of infection was nearly the same for the combination of GC and AM (9 infections/135 patient-yrs) as for GC alone (41 infections/460 patient-yrs), but the combination users had evidence of more severe disease. Patients with SLE had a 6- to 7-fold greater risk of serious infection than the general population.

Conclusion: Our findings suggest that the benefits of AM treatment for SLE may extend to preventing serious infections. Although the study included > 3000 patients, the statistical power to examine GC dosages ≤ 15 mg/day was poor.


2021 ◽  
Author(s):  
Yahya Pasdar ◽  
Behrooz Hamzeh ◽  
Shima Moradi ◽  
Ehsan Mohammadi ◽  
Mitra Darbandi ◽  
...  

Abstract

Background: Since hypertension (HTN) is responsible for more than half of all deaths from cardiovascular disease, it is important to identify nutritional factors that reduce its risk, yet little is known about such factors in the Kurdish population. This study aimed to evaluate the Healthy Eating Index (HEI) 2015 and major dietary patterns in relation to incident HTN.

Methods: This case-cohort study was designed using data from the Ravansar Non-Communicable Diseases (RaNCD) cohort study (294 participants with incident HTN and a representative random sub-cohort of 1295 participants). HEI 2015 scores and major dietary patterns were derived from dietary intake data, and three major dietary patterns were identified: plant-based, high-protein, and unhealthy. Cox proportional hazards regression models were applied to analyze the association of HEI 2015 and the major dietary patterns with incident HTN.

Results: There was a significant positive correlation between HEI 2015 and the plant-based diet (r = 0.492). Participants in the highest quartile of HEI 2015 had a 39% lower risk of incident HTN than those in the first quartile in the crude model (HR: 0.61; 95% CI: 0.46–0.82) and a 30% lower risk in the adjusted model (HR: 0.70; 95% CI: 0.51–0.97). Furthermore, participants in the highest tertile of the plant-based dietary pattern had a lower risk of incident HTN in both the crude (HR: 0.69; 95% CI: 0.54–0.90) and adjusted (HR: 0.70; 95% CI: 0.53–0.94) models. The other two identified dietary patterns had no significant association with incident HTN.

Conclusions: We found evidence that higher adherence to the HEI 2015 and to a plant-based diet had protective effects against incident HTN. The HEI 2015 emphasizes limited sodium intake and adequate intake of vegetables and fruits.

