Long-term effectiveness of 3-dose primary course and 4-year booster dose of pertussis vaccine in Australia

2021, Vol 50 (Supplement_1)
Author(s):
Duleepa Jayasundara
Sarah Sheridan
Deborah Randall
Patricia Campbell
Karen Edmond
...  

Abstract Background: Australia’s National Immunisation Program recommended a 3-dose primary Diphtheria-Tetanus-Pertussis (DTP) vaccination course at 2, 4 and 6 months and a booster dose at 4 years during 2003-2015. We examined vaccine effectiveness by time since doses 3 and 4, as studies to date have shown conflicting results. Methods: Perinatal, immunisation, pertussis notification and death data were linked for 1,086,319 infants born in two Australian states in 2003-2012. Administration of DTP doses 3 and 4 at 5.5-7 months and 47-53 months, respectively, was considered age-appropriate. Adjusted Cox proportional hazards models with time-varying vaccination status were used to estimate vaccine effectiveness (VE = 1–hazard ratio) against notified pertussis after age-appropriate doses 3 and 4 compared to unvaccinated children, and the additional benefit of dose 4 compared to receipt of the primary course alone. Results: Dose 3 VE declined from 79% (CI 75%-83%) at 0-6 months to 64% (CI 60%-67%) at 6-36 months and 45% (CI 31%-56%) at 36-42 months post-vaccination. Compared to unvaccinated children, VE after dose 4 declined from 83% (CI 80%-86%) at 0-12 months to 67% (CI 60%-72%) and 55% (CI 46%-63%) in the following two 12-month periods post-vaccination. Compared to dose 3 alone, the relative VE of dose 4 was 58% (CI 51%-64%) at 0-18 months post-vaccination. Conclusion and Key messages: Our study adds to previous Australian evidence of substantial waning of vaccine-induced immunity against pertussis over the 3 years following dose 3. VE was significantly higher in the 18 months following dose 4 than after receipt of the primary course alone.

2021
Author(s):
Ana Florea
Lina S. Sy
Yi Luo
Lei Qian
Katia J. Bruxvoort
...  

Background: We conducted a prospective cohort study at Kaiser Permanente Southern California to study the vaccine effectiveness (VE) of mRNA-1273 over time and during the emergence of the Delta variant. Methods: The cohort for this planned interim analysis consisted of individuals aged ≥18 years receiving 2 doses of mRNA-1273 through June 2021, matched 1:1 to randomly selected unvaccinated individuals by age, sex, and race/ethnicity, with follow-up through September 2021. Outcomes were SARS-CoV-2 infection, COVID-19 hospitalization, and COVID-19 hospital death. Cox proportional hazards models were used to estimate adjusted hazard ratios (aHRs) with 95% confidence intervals (CIs) comparing outcomes in the vaccinated and unvaccinated groups. Adjusted VE (%) was calculated as (1-aHR)x100. HRs and VEs were also estimated for SARS-CoV-2 infection by age, sex, and race/ethnicity, and during the Delta period (June-September 2021). VE against SARS-CoV-2 infection and COVID-19 hospitalization was estimated at 0-<2, 2-<4, 4-<6, and 6-<8 months post-vaccination. Results: 927,004 recipients of 2 doses of mRNA-1273 were matched to 927,004 unvaccinated individuals. VE (95% CI) was 82.8% (82.2-83.3%) against SARS-CoV-2 infection, 96.1% (95.5-96.6%) against COVID-19 hospitalization, and 97.2% (94.8-98.4%) against COVID-19 hospital death. VE against SARS-CoV-2 infection was similar by age, sex, and race/ethnicity, and was 86.5% (84.8-88.0%) during the Delta period. VE against SARS-CoV-2 infection decreased from 88.0% at 0-<2 months to 75.5% at 6-<8 months. Conclusions: These interim results provide continued evidence of protection by 2 doses of mRNA-1273 against SARS-CoV-2 infection over 8 months post-vaccination and during the Delta period, and against COVID-19 hospitalization and hospital death.
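Since adjusted VE is defined as (1-aHR)x100, the confidence bounds transform the same way, with the upper HR limit mapping to the lower VE limit. A minimal sketch; the HR values below are back-calculated from the reported VE of 82.8% (82.2-83.3%) for illustration, not taken from the paper:

```python
def ve_from_hr(hr: float, hr_lo: float, hr_hi: float):
    """Convert an adjusted hazard ratio and its 95% CI into
    vaccine effectiveness (%). Note the bounds swap: the upper
    HR limit gives the lower VE limit."""
    return (1 - hr) * 100, (1 - hr_hi) * 100, (1 - hr_lo) * 100

# aHR 0.172 (95% CI 0.167-0.178) corresponds to VE 82.8% (82.2-83.3%)
ve, ve_lo, ve_hi = ve_from_hr(0.172, 0.167, 0.178)
print(f"VE {ve:.1f}% (95% CI {ve_lo:.1f}-{ve_hi:.1f}%)")
```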


JAMIA Open, 2020
Author(s):
Spiros Denaxas
Anoop D Shah
Bilal A Mateen
Valerie Kuan
Jennifer K Quint

Abstract Objectives: The UK Biobank (UKB) is making primary care electronic health records (EHRs) for 500 000 participants available for COVID-19-related research. Data are extracted from four sources, recorded using five clinical terminologies, and stored in different schemas. The aims of our research were to (a) develop a semi-supervised approach for bootstrapping EHR phenotyping algorithms in UKB EHR, and (b) evaluate our approach by implementing and evaluating phenotypes for 31 common biomarkers. Materials and Methods: We describe an algorithmic approach to phenotyping biomarkers in primary care EHR involving (a) bootstrapping definitions using existing phenotypes, (b) excluding generic, rare, or semantically distant terms, (c) forward-mapping terminology terms, (d) expert review, and (e) data extraction. We evaluated the phenotypes by assessing the ability to reproduce known epidemiological associations with all-cause mortality using Cox proportional hazards models. Results: We created and evaluated phenotyping algorithms for 31 biomarkers, many of which are directly related to COVID-19 complications, for example diabetes, cardiovascular disease, and respiratory disease. Our algorithm identified 1651 Read v2 and Clinical Terms Version 3 terms and automatically excluded 1228 terms. Clinical review excluded 103 terms and included 44 terms, resulting in 364 terms for data extraction (sensitivity 0.89, specificity 0.92). We extracted 38 190 682 events and identified 220 978 participants with at least one biomarker measured. Discussion and Conclusion: Bootstrapping phenotyping algorithms from similar EHR can potentially address pre-existing methodological concerns that undermine the outputs of biomarker discovery pipelines and provide research-quality phenotyping algorithms.
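The term counts reported in the abstract reconcile exactly; a quick bookkeeping check of the review pipeline (all counts copied from the text above):

```python
# Term-review bookkeeping for the phenotyping pipeline
identified = 1651        # Read v2 + CTV3 terms found by bootstrapping
auto_excluded = 1228     # generic, rare, or semantically distant terms
review_excluded = 103    # removed on expert clinical review
review_included = 44     # re-instated by expert review

final_terms = identified - auto_excluded - review_excluded + review_included
print(final_terms)  # terms carried forward to data extraction
```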


Cancers, 2021, Vol 13 (5), pp. 1177
Author(s):
In Young Choi
Sohyun Chun
Dong Wook Shin
Kyungdo Han
Keun Hye Jeon
...  

Objective: To our knowledge, no studies have yet examined how the risk of developing breast cancer (BC) varies with changes in metabolic syndrome (MetS) status. This study aimed to investigate the association between changes in MetS status and subsequent BC occurrence. Research Design and Methods: We enrolled 930,055 postmenopausal women aged 40–74 years who participated in a biennial National Health Screening Program in 2009–2010 and 2011–2012. Participants were categorized into four groups according to change in MetS status during the two-year interval between screenings: sustained non-MetS, transition to MetS, transition to non-MetS, and sustained MetS. We calculated multivariable-adjusted hazard ratios (aHRs) and 95% confidence intervals (CIs) for BC incidence using Cox proportional hazards models. Results: At baseline, MetS was associated with a significantly increased risk of BC (aHR 1.11, 95% CI 1.06–1.17), as were all of its components. The risk of BC increased as the number of components increased (aHR 1.46, 95% CI 1.26–1.61 for women with all five components). Compared to the sustained non-MetS group, the aHR (95% CI) for BC was 1.11 (1.04–1.19) in the transition to MetS group, 1.05 (0.96–1.14) in the transition to non-MetS group, and 1.18 (1.12–1.25) in the sustained MetS group. Conclusions: Significantly increased BC risk was observed in the sustained MetS and transition to MetS groups. These findings are clinically meaningful in that efforts to recover from MetS may lead to a reduced risk of BC.
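The four exposure groups are fully determined by MetS status at the two screenings; a minimal sketch of the assignment (the function name is illustrative, not from the paper):

```python
def mets_change_group(mets_first: bool, mets_second: bool) -> str:
    """Map MetS status at two biennial screenings to the four
    exposure groups used in the study."""
    if mets_first:
        return "sustained MetS" if mets_second else "transition to non-MetS"
    return "transition to MetS" if mets_second else "sustained non-MetS"

print(mets_change_group(False, True))   # transition to MetS
print(mets_change_group(True, True))    # sustained MetS
```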


Author(s):  
Laurie Grieshober
Stefan Graw
Matt J. Barnett
Gary E. Goodman
Chu Chen
...  

Abstract Purpose: The neutrophil-to-lymphocyte ratio (NLR) is a marker of systemic inflammation that has been reported to be associated with survival after chronic disease diagnoses, including lung cancer. We hypothesized that the inflammatory profile reflected by pre-diagnosis NLR, rather than the well-studied pre-treatment NLR at diagnosis, may be associated with increased mortality after lung cancer is diagnosed in high-risk heavy smokers. Methods: We examined associations between pre-diagnosis methylation-derived NLR (mdNLR) and lung cancer-specific and all-cause mortality in 279 non-small cell lung cancer (NSCLC) and 81 small cell lung cancer (SCLC) cases from the β-Carotene and Retinol Efficacy Trial (CARET). Cox proportional hazards models were adjusted for age, sex, smoking status, pack-years, and time between blood draw and diagnosis, and stratified by stage of disease. Models were run separately by histotype. Results: Among SCLC cases, those with pre-diagnosis mdNLR in the highest quartile had 2.5-fold increased mortality compared to those in the lowest quartile. For each unit increase in pre-diagnosis mdNLR, we observed 22–23% increased mortality (SCLC-specific hazard ratio [HR] = 1.23, 95% confidence interval [CI]: 1.02, 1.48; all-cause HR = 1.22, 95% CI 1.01, 1.46). SCLC associations were strongest for current smokers at blood draw (interaction Ps = 0.03). Increasing mdNLR was not associated with mortality among NSCLC cases overall, nor within the adenocarcinoma (N = 148) or squamous cell carcinoma (N = 115) case groups. Conclusion: Our findings suggest that increased mdNLR, representing a systemic inflammatory profile on average 4.5 years before an SCLC diagnosis, may be associated with mortality in heavy smokers who go on to develop SCLC but not NSCLC.
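The quartile contrast (highest vs lowest pre-diagnosis mdNLR) can be reproduced with stdlib quantiles; a sketch with hypothetical mdNLR values, not CARET data:

```python
from statistics import quantiles

# Hypothetical pre-diagnosis mdNLR values (illustrative only)
mdnlr = [1.2, 1.5, 1.8, 2.0, 2.3, 2.7, 3.1, 3.9]
q1, q2, q3 = quantiles(mdnlr, n=4)  # quartile cutpoints

def quartile(x: float) -> int:
    """Assign a value to quartile 1-4 relative to the cutpoints."""
    return 1 + (x >= q1) + (x >= q2) + (x >= q3)

# The study's main contrast compares quartile 4 to quartile 1
print(quartile(1.2), quartile(3.9))
```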


2020, pp. 073346482096720
Author(s):  
Woojung Lee ◽  
Shelly L. Gray ◽  
Douglas Barthold ◽  
Donovan T. Maust ◽  
Zachary A. Marcum

Informants’ reports can be useful in screening patients for future risk of dementia. We aimed to determine whether informant-reported sleep disturbance is associated with incident dementia, whether this association varies by baseline cognitive level, and whether the severity of informant-reported sleep disturbance is associated with incident dementia among those with sleep disturbance. A longitudinal retrospective cohort study was conducted using the uniform data set collected by the National Alzheimer’s Coordinating Center. Older adults without dementia at baseline who were living with informants were included in the analysis. Cox proportional hazards models showed that participants with an informant-reported sleep disturbance were more likely to develop dementia, although this association may be specific to older adults with normal cognition. In addition, older adults with more severe sleep disturbance had a higher risk of incident dementia than those with mild sleep disturbance. Informant-reported information on sleep quality may be useful for prompting cognitive screening.


2021, pp. 000348942110081
Author(s):  
Sara Behbahani ◽  
Gregory L. Barinsky ◽  
David Wassef ◽  
Boris Paskhover ◽  
Rachel Kaye

Objective: Primary tracheal malignancies are relatively rare cancers, representing 0.1% to 0.4% of all malignancies. Adenoid cystic carcinoma (ACC) is the second most common histology of primary tracheal malignancy, after squamous cell carcinoma. This study aims to analyze demographic characteristics and potential factors influencing survival of tracheal ACC (TACC). Methods: This was a retrospective cohort study utilizing the National Cancer Database (NCDB). The NCDB was queried for all cases of TACC diagnosed from 2004 to 2016 (n = 394). Kaplan-Meier (KM) and Cox proportional-hazards models were used to determine clinicopathological and treatment factors associated with survival outcomes. Results: Median age at diagnosis was 56 years (IQR: 44.75-66.00). Females were affected slightly more often than males (53.8% vs 46.2%). The most prevalent tumor diameter range was 20 to 39 mm (34.8%), followed by greater than 40 mm (17.8%). Median overall survival (OS) was 9.72 years, with 5- and 10-year OS of 70% and 47.5%, respectively. Localized disease was not associated with a survival benefit over invasive disease (P = .388). The most common intervention was surgery combined with radiation therapy (RT) at 46.2%, followed by surgery alone (16.8%) and standalone RT (8.9%). When adjusting for confounders, surgical resection was independently associated with improved OS (HR 0.461, 95% CI 0.225-0.946). Tumor size greater than 40 mm was independently associated with worse OS (HR 2.808; 95% CI 1.096-7.194). Conclusion: Our data suggest that surgical resection, possibly in conjunction with radiation therapy, is associated with improved survival, and that tumors larger than 40 mm are associated with worse survival.
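The Kaplan-Meier estimator behind the reported 5- and 10-year OS figures can be sketched in a few lines; a minimal pure-Python product-limit estimate on toy data (not NCDB records):

```python
def kaplan_meier(times, events):
    """Product-limit survival estimate.

    times  : follow-up time per patient
    events : 1 = death observed, 0 = censored
    Returns [(t, S(t))] at each distinct death time.
    """
    data = sorted(zip(times, events))
    s, curve, seen = 1.0, [], set()
    for t, _ in data:
        if t in seen:
            continue
        seen.add(t)
        deaths = sum(e for tt, e in data if tt == t)
        at_risk = sum(1 for tt, _ in data if tt >= t)
        if deaths:
            s *= 1 - deaths / at_risk  # conditional survival at time t
            curve.append((t, s))
    return curve

# Four toy patients: deaths at t=1 and t=3, censoring at t=2 and t=4
print(kaplan_meier([1, 2, 3, 4], [1, 0, 1, 0]))  # [(1, 0.75), (3, 0.375)]
```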


Author(s):  
Anne Høye
Bjarne K. Jacobsen
Jørgen G. Bramness
Ragnar Nesvåg
Ted Reichborn-Kjennerud
...  

Abstract Purpose: To investigate mortality in both in- and outpatients with personality disorders (PD), and to explore the association between mortality and comorbid substance use disorder (SUD) or severe mental illness (SMI). Methods: All residents admitted to Norwegian in- and outpatient specialist health care services during 2009–2015 with a PD diagnosis were included. Standardized mortality ratios (SMRs) with 95% confidence intervals (CIs) were estimated in patients with PD only and in patients with PD and comorbid SMI or SUD. Cox proportional hazards models were used to estimate adjusted hazard ratios (HRs) with 95% CIs in patients with PD and comorbid SMI or SUD compared to patients with PD only. Results: Mortality was increased in both in- and outpatients with PD. The overall SMR was 3.8 (95% CI 3.6–4.0). The highest SMR was estimated for unnatural causes of death (11.0, 95% CI 10.0–12.0), but the SMR was also increased for natural causes of death (2.2, 95% CI 2.0–2.5). Comorbidity was associated with higher SMRs, particularly due to poisoning and suicide. Patients with comorbid PD & SUD had an all-cause mortality HR almost four times that of patients with PD only; young women had the highest HR. Conclusion: The SMR was high in both in- and outpatients with PD, and particularly high in patients with comorbid PD & SUD. Young female patients with PD & SUD were at highest risk. The higher mortality in patients with PD cannot, however, be fully accounted for by comorbidity.
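An SMR is observed deaths divided by the deaths expected from population rates, with an approximate CI obtained on the log scale (variance roughly 1/observed). A sketch; the death counts below are illustrative values chosen to reproduce the reported overall SMR of 3.8, not the study's actual counts:

```python
import math

def smr_with_ci(observed: int, expected: float, z: float = 1.96):
    """Standardized mortality ratio with an approximate 95% CI
    (log-normal approximation; var(log SMR) ~= 1/observed)."""
    smr = observed / expected
    half_width = z / math.sqrt(observed)
    return smr, smr * math.exp(-half_width), smr * math.exp(half_width)

# Illustrative counts only: 1400 observed vs 368.4 expected deaths
smr, lo, hi = smr_with_ci(1400, 368.4)
print(f"SMR {smr:.1f} (95% CI {lo:.1f}-{hi:.1f})")
```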


Author(s):
Cilie C. van ’t Klooster
Yolanda van der Graaf
Hendrik M. Nathoe
Michiel L. Bots
...  

Abstract The purpose is to investigate the added prognostic value of coronary artery calcium (CAC), thoracic aortic calcium (TAC), and heart valve calcium scores for prediction of a combined endpoint of recurrent major cardiovascular events and cardiovascular interventions (MACE+) in patients with established cardiovascular disease (CVD). In total, 567 patients with established CVD enrolled in a substudy of the UCC-SMART cohort, entailing cardiovascular CT imaging and calcium scoring, were studied. Five Cox proportional hazards models for prediction of 4-year risk of MACE+ were developed: traditional CVD risk predictors only (model I), with addition of CAC (model II), TAC (model III), heart valve calcium (model IV), and all calcium scores (model V). Bootstrapping was performed to account for optimism. During a median follow-up of 3.43 years (IQR 2.28–4.74), 77 events (MACE+) occurred. Calibration of predicted versus observed 4-year risk for model I without calcium scores was good, and the c-statistic was 0.65 (95% CI 0.59–0.72). Calibration for models II–V was similar to model I, and c-statistics were 0.67, 0.65, 0.65, and 0.68 for models II, III, IV, and V, respectively. NRIs showed improvement in risk classification by model II (NRI 15.24%, 95% CI 0.59–29.39) and model V (NRI 20.00%, 95% CI 5.59–34.92), but no improvement for models III and IV. In patients with established CVD, addition of the CAC score improved performance of a risk prediction model with classical risk factors for prediction of the combined endpoint MACE+. Addition of the TAC or heart valve score did not improve risk predictions.
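The categorical NRI sums net correct reclassification among events and among non-events. A sketch with hypothetical reclassification counts; the study reports 77 events, but every other count below is invented for illustration:

```python
def nri_percent(up_ev, down_ev, n_ev, up_non, down_non, n_non):
    """Categorical net reclassification improvement, in %.

    Under the new model, events should move up in predicted risk
    and non-events should move down.
    """
    events_part = (up_ev - down_ev) / n_ev
    nonevents_part = (down_non - up_non) / n_non
    return 100 * (events_part + nonevents_part)

# Hypothetical: among 77 events, 15 reclassified up and 5 down;
# among 490 non-events, 40 reclassified down and 20 up.
print(round(nri_percent(15, 5, 77, 20, 40, 490), 2))
```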


Author(s):  
Katherine R Sabourin
Ibrahim Daud
Sidney Ogolla
Nazzarena Labo
Wendell Miley
...  

Abstract Background: We aimed to determine whether Plasmodium falciparum (Pf) infection affects the age of Kaposi sarcoma-associated herpesvirus (KSHV) seroconversion in Kenyan children. Methods: Kenyan children (n=144), enrolled at age one month from two sites with different levels of malaria transmission (stable/high vs. unstable/low), were followed through 24 months. Plasma was tested for KSHV antibodies using an enzyme-linked immunosorbent assay (ELISA) (K8.1 and LANA) and a multiplex bead-based assay (K8.1, K10.5, ORF38, ORF50, and LANA), and whole blood was tested for Pf DNA using quantitative PCR. Cox proportional hazards models were used to assess associations of Pf DNA detection, malaria annualized rate (Pf detections/person-years), and enrollment site (malaria-high vs. malaria-low) with time to KSHV seroconversion. Results: KSHV seroprevalence was 63% by 2 years of age when assessed by the multiplex assay. Children with Pf were at increased hazard of earlier KSHV seroconversion, and among children with malaria, the hazard of becoming KSHV seropositive increased significantly with increasing malaria annualized rate. Children from the malaria-high transmission region had no significant difference in hazard of KSHV seroconversion at 12 months but were more likely to become KSHV seropositive by 24 months of age. Discussion: Malaria exposure increases the risk of KSHV seroconversion early in life.


2020, Vol 5 (4), pp. 598-616
Author(s):
Austin C Doctor

Abstract Why do rebel organizations splinter into competing factions during civil war? To explain this outcome, I leverage variation in rebel leadership. I argue that rebel leaders draw on their pre-war experiences—i.e., their military and political experiences—to manage their organizations during conflict. These experiences bear unique patterns of rebel management and, thus, corresponding risks of fragmentation. Empirical evidence comes from a two-stage research design and original data featuring over 200 rebel leaders from 1989 to 2014. In the first stage, I estimate the probability of group fragmentation with a series of logistic regression models. In the second stage, I use Cox proportional-hazards models to estimate leadership effects on the rate of group fragmentation. Results indicate that variation in rebel leadership corresponds with unique risks of fragmentation. In particular, the results suggest that leaders with real military experience are best equipped to maintain group cohesion. This study offers insight into the processes by which rebel groups splinter into armed factions. In addition, it makes an important contribution to the broader discussion on the roles of structure and agency in shaping the dynamics of civil war.

