Covid-19 Vaccine Effectiveness in Healthcare Personnel in six Israeli Hospitals (CoVEHPI)

Author(s):  
Mark A. Katz ◽  
Efrat Bron Harlev ◽  
Bibiana Chazan ◽  
Michal Chowers ◽  
David Greenberg ◽  
...  

Background: Methodologically rigorous studies of Covid-19 vaccine effectiveness (VE) in preventing SARS-CoV-2 infection are critically needed to inform national and global policy on Covid-19 vaccine use. In Israel, healthcare personnel (HCP) were initially prioritized for Covid-19 vaccination, creating an ideal setting to evaluate real-world VE in a closely monitored population.

Methods: We conducted a prospective study among HCP in six hospitals to estimate the effectiveness of the BNT162b2 mRNA Covid-19 vaccine in preventing SARS-CoV-2 infection. Participants completed weekly symptom questionnaires, provided weekly nasal specimens, and gave three serology samples: at enrollment, 30 days, and 90 days. We estimated VE against PCR-confirmed SARS-CoV-2 infection using a Cox proportional hazards model and against a combined PCR/serology endpoint using Fisher's exact test.

Findings: Of the 1,567 HCP enrolled between December 27, 2020 and February 15, 2021, 1,250 previously uninfected participants were included in the primary analysis; 998 (79.8%) had received their first dose prior to or at enrollment, all with the Pfizer BNT162b2 mRNA vaccine. There were four PCR-positive events among vaccinated participants and nine among unvaccinated participants. Adjusted two-dose VE against any PCR-confirmed infection was 94.5% (95% CI: 82.6%-98.2%); adjusted two-dose VE against a combined endpoint of PCR and seroconversion over a 60-day follow-up period was 94.5% (95% CI: 63.0%-99.0%). Five PCR-positive samples from study participants were sequenced; all were the alpha variant.

Interpretation: Our prospective VE study of HCP in Israel, with rigorous weekly surveillance, found very high VE for two doses of the Pfizer BNT162b2 mRNA vaccine against SARS-CoV-2 during a period of predominant alpha variant circulation.
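
The VE figures above follow from the Cox model hazard ratio as VE = (1 - HR) x 100, with the confidence limits transforming the same way (the bounds swap order). A minimal sketch, using approximate HRs back-calculated from the reported two-dose estimates:

```python
def ve_from_hr(hr: float) -> float:
    """Convert a Cox-model hazard ratio into percent vaccine effectiveness."""
    return (1.0 - hr) * 100.0

# Approximate HRs implied by the reported VE of 94.5% (95% CI: 82.6%-98.2%).
hr_point, hr_lower, hr_upper = 0.055, 0.018, 0.174
print(round(ve_from_hr(hr_point), 1))  # 94.5
print(round(ve_from_hr(hr_upper), 1))  # 82.6 (lower VE bound, from upper HR)
print(round(ve_from_hr(hr_lower), 1))  # 98.2 (upper VE bound, from lower HR)
```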

BMC Cancer ◽  
2021 ◽  
Vol 21 (1) ◽  
Author(s):  
Lisa M. Hess ◽  
Yimei Han ◽  
Yajun Emily Zhu ◽  
Naleen Raj Bhandari ◽  
Anthony Sireci

Abstract

Background: Contradictory and limited data are available on the presentation and outcomes of patients with RET-fusion-positive metastatic NSCLC compared with patients without RET fusions. This observational study, which utilized an electronic health records (EHR) database linked to genomic testing results, was designed to compare characteristics, tumor response, and progression-free (PFS) and overall survival (OS) outcomes by RET fusion status among patients with metastatic NSCLC treated with standard therapies.

Methods: Adult patients with metastatic NSCLC who had linked EHR and genomics data and received systemic anti-cancer therapy on or after January 1, 2011 were eligible. Unadjusted analyses and analyses adjusted for all available baseline covariates were conducted to compare tumor response, PFS, and OS between patients with RET-fusion-positive and RET-fusion-negative disease as detected by next-generation sequencing. Tumor response outcomes were analyzed using Fisher's exact test, and time-to-event analyses were conducted using a Cox proportional hazards model.

Results: There were 5,807 eligible patients identified (RET+ cohort, N = 46; RET- cohort, N = 5,761). Patients with RET fusions were younger, more likely to have non-squamous disease, more likely to be non-smokers, and had better performance status (all p < 0.01). In unadjusted analyses, there were no significant differences in tumor response (p = 0.17) or PFS (p = 0.06), but OS differed significantly by RET status (hazard ratio, HR = 1.91, 95% CI: 1.22–3.0, p = 0.005). There were no statistically significant differences by RET fusion status in adjusted analyses of either PFS or OS (PFS HR = 1.24, 95% CI: 0.86–1.78, p = 0.25; OS HR = 1.52, 95% CI: 0.95–2.43, p = 0.08).

Conclusions: Patients with RET fusions have different baseline characteristics that contribute to favorable OS in unadjusted analysis. However, after adjusting for baseline covariates, there were no significant differences in either OS or PFS by RET status among patients treated with standard therapy prior to the availability of selective RET inhibitors.
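
The tumor response comparison above reduces to a test on a 2x2 contingency table (RET status by responder status). A sketch with scipy; the counts below are hypothetical, not the study's:

```python
from scipy.stats import fisher_exact

# Hypothetical 2x2 table:
# rows = RET+ / RET- cohorts, columns = responders / non-responders.
table = [[10, 36],
         [1500, 4261]]
odds_ratio, p_value = fisher_exact(table)
print(f"OR = {odds_ratio:.2f}, p = {p_value:.3f}")
```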


2021 ◽  
Vol 50 (Supplement_1) ◽  
Author(s):  
Yuri Yokoyama ◽  
Akihiko Kitamura ◽  
Yu Nofuji ◽  
Satoshi Seino ◽  
Hidenori Amano ◽  
...  

Abstract

Background: Although consuming a variety of foods is an internationally accepted recommendation for a healthy diet, little is known about the association between dietary variety and incident dementia. This study aimed to examine the association between dietary variety and incident disabling dementia in community-dwelling elderly Japanese adults.

Methods: We conducted a prospective study of 721 participants (age range: 65–97 years) of the 2012–2013 Kusatsu Longitudinal Study. Dietary variety was assessed with a food frequency questionnaire covering the 10 main food components of Japanese meals: meat, fish/shellfish, eggs, milk, soybean products, green/yellow vegetables, potatoes, fruit, seaweed, and fats/oils. Participants were then categorized into low (0–2 points), middle (3–5 points), and high (6–10 points) groups based on their scores. Data on incident disabling dementia were retrieved from the public Long-term Care Insurance database. A Cox proportional hazards model was used to estimate hazard ratios (HRs) with 95% confidence intervals (CIs).

Results: During a median follow-up of 6.5 years, the incidence of disabling dementia was 9.3%. After adjusting for confounders, the multivariate HR for incident disabling dementia was 0.52 (95% CI, 0.27–1.00) for participants in the highest category of the dietary variety score compared with those in the lowest category.

Conclusions: Greater dietary variety is associated with a reduced risk of incident disabling dementia in elderly Japanese adults.

Key messages: Consuming a variety of foods may contribute to dementia prevention.
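
The scoring described above (one point per food group, then three categories) can be sketched as follows; the food-group names come from the abstract, while the rule of one point per group consumed is an assumption about the questionnaire's scoring:

```python
# Food groups listed in the study's food frequency questionnaire.
FOOD_GROUPS = ["meat", "fish/shellfish", "eggs", "milk", "soybean products",
               "green/yellow vegetables", "potatoes", "fruit", "seaweed",
               "fats/oils"]

def variety_score(consumed: set) -> int:
    """One point per food group consumed (assumed scoring rule)."""
    return sum(1 for g in FOOD_GROUPS if g in consumed)

def variety_group(score: int) -> str:
    """Category cutoffs as reported: 0-2 low, 3-5 middle, 6-10 high."""
    if score <= 2:
        return "low"
    if score <= 5:
        return "middle"
    return "high"

print(variety_group(variety_score({"meat", "eggs", "fruit", "milk",
                                   "seaweed", "potatoes"})))  # high
```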


2009 ◽  
Vol 27 (15_suppl) ◽  
pp. e13529-e13529
Author(s):  
A. Bharthuar ◽  
T. Khoury ◽  
K. N. Haas ◽  
T. Mashtare ◽  
J. Black ◽  
...  

e13529 Background: EC patients face a dismal outcome despite a tri-modality management strategy. Median survival remains 15–18 months despite platinum-, fluoropyrimidine-, and irinotecan-based therapy. BCRP is an ATP-dependent efflux protein associated with resistance to chemotherapy (CT) agents such as irinotecan. The role of BCRP expression in EC and normal esophageal cells is not known. We examined the expression of this protein and correlated it with overall survival (OS) in patients receiving irinotecan-based CT.

Methods: With IRB approval, 40 cases of EC diagnosed between 2004 and 2008 were stained for BCRP expression by IHC and scored by a pathologist blinded to clinical data. Baseline demographics, therapy given, and OS data were collected and correlated with BCRP expression. A BCRP score (membrane or cytoplasm) ≥30, calculated by multiplying BCRP intensity by the percentage of cells staining, was considered positive. Fisher's exact test was used to determine associations between BCRP expression and demographics; a Cox proportional hazards model was used to assess the association between BCRP and OS.

Results: Baseline patient and tumor characteristics: Gender: M 35, F 5; Histology: 37 adenocarcinoma and 3 SCC; Stage I–III 27, Stage IV 10, unknown 3; CT: cisplatin + irinotecan (n=16), oxaliplatin + fluoropyrimidine (n=8), other (n=16); IHC: 30 of 40 cancers (75%) expressed BCRP [strong (n=28) and intermediate (n=3); membranous (n=17), cytoplasmic (n=27), and both (n=14)]. Down-regulation of BCRP expression in tumor compared with normal cells was seen in 40% of patients. Median OS was 19 months, with no difference in OS between BCRP-positive and BCRP-negative patients (p=0.13). The estimated hazard ratio (HR) of death for BCRP-positive patients was 2.29 (95% CI 0.79–6.64). There was no association between BCRP expression and stage, age, gender, or histology. For patients who received cisplatin and irinotecan as first-line CT, there was no difference in OS (p=0.39) between BCRP-negative and BCRP-positive patients.

Conclusions: BCRP expression is seen in a majority of EC and normal esophageal mucosa. Responses to irinotecan-based therapies are seen in 30–40% of EC; whether the 40% of patients with low tumor BCRP constitute a majority of the responders needs to be prospectively validated in a larger dataset, which should include markers that predict response to 5-FU- and platinum-based CT to allow individualized therapy for this cancer. No significant financial relationships to disclose.
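
The composite IHC score above (intensity multiplied by percent staining, positive at >= 30) can be sketched directly; the 0–3 intensity scale is an assumption, as the abstract does not state the scale:

```python
def bcrp_score(intensity: int, percent_staining: float) -> float:
    """Composite score: staining intensity (assumed 0-3) x percent of cells staining."""
    return intensity * percent_staining

def bcrp_positive(intensity: int, percent_staining: float) -> bool:
    """Score >= 30 is called positive, per the study's cutoff."""
    return bcrp_score(intensity, percent_staining) >= 30

print(bcrp_positive(3, 15))  # True: 3 * 15 = 45 >= 30
print(bcrp_positive(1, 20))  # False: 1 * 20 = 20 < 30
```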


2021 ◽  
pp. jech-2020-214821
Author(s):  
Yun Chen ◽  
Na Wang ◽  
Xiaolian Dong ◽  
Xuecai Wang ◽  
Jianfu Zhu ◽  
...  

Background: To assess the associations of body mass index (BMI) with all-cause and cause-specific mortality among rural Chinese adults.

Methods: A prospective study of 28,895 individuals was conducted from 2006 to 2014 in rural Deqing, China. Height and weight were measured. The association of BMI with mortality was assessed using a Cox proportional hazards model and restricted cubic spline regression.

Results: There were a total of 2,062 deaths during an average follow-up of 7 years. Compared with those with a BMI of 22.0–24.9 kg/m2, an increased risk of all-cause mortality was found for both underweight men (BMI <18.5 kg/m2; adjusted HR (aHR): 1.45, 95% CI: 1.18 to 1.79) and low-normal-weight men (BMI 18.5–21.9 kg/m2; aHR: 1.20, 95% CI: 1.03 to 1.38). A J-shaped association was observed between BMI and all-cause mortality in men. Underweight was also associated with increased cardiovascular disease and cancer mortality in men. The association of underweight with all-cause mortality was more pronounced in ever smokers and older men (60+ years). The results held after excluding participants who were followed up for less than 1 year.

Conclusion: The present study suggests that underweight is an important predictor of mortality, especially for elderly men in rural communities of China.
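
A Cox fit for a single binary covariate (e.g. underweight vs. reference BMI) amounts to maximizing the partial likelihood. A self-contained sketch on synthetic data (not the study's), with Breslow handling of ties:

```python
import numpy as np
from scipy.optimize import minimize_scalar

def neg_partial_loglik(beta, time, event, x):
    """Negative Cox partial log-likelihood for one covariate (Breslow ties)."""
    order = np.argsort(time)
    time, event, x = time[order], event[order], x[order]
    ll = 0.0
    for i in np.where(event == 1)[0]:
        at_risk = time >= time[i]  # subjects still at risk at this event time
        ll += beta * x[i] - np.log(np.sum(np.exp(beta * x[at_risk])))
    return -ll

rng = np.random.default_rng(0)
n = 200
x = rng.integers(0, 2, n)                               # 1 = underweight (synthetic)
time = rng.exponential(1.0 / (0.1 * np.exp(0.4 * x)))   # true log-HR = 0.4
event = np.ones(n, dtype=int)                           # no censoring, for simplicity

res = minimize_scalar(neg_partial_loglik, args=(time, event, x),
                      bounds=(-3, 3), method="bounded")
print(f"estimated HR: {np.exp(res.x):.2f}")  # should be near exp(0.4) ~ 1.49
```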


2018 ◽  
Vol 36 (6_suppl) ◽  
pp. 447-447
Author(s):  
Prateek Mendiratta ◽  
Andrea Loehr ◽  
Andrew Simmons ◽  
Pedro C. Barata ◽  
Stefan Klek ◽  
...  

447 Background: Deleterious GA in genes of the HRR pathway and tumor mutational load (TML; mutations/Mb) have been shown to predict response to PBT and ICI; further validation can be informative. We assessed the predictive role of such GA in mUC.

Methods: Tissue from mUC pts treated with PBT or ICI in the 1st-line setting underwent genomic profiling (GP) via FoundationOne. Pts were analyzed in 2 groups based on the presence of potentially function-impairing GA (using classification criteria) in any of 15 pre-selected HRR genes. Exploratory assessment of overall response rate (ORR; RECIST v1.1) and progression-free and overall survival (PFS, OS) based on the presence of relevant GA was performed using a Cox proportional hazards model, Kaplan–Meier estimates, and Fisher's exact test.

Results: GA were noted in 22% of 88 identified mUC pts with available GP from 2012 to 2017. The most common deleterious GA were in BRCA1/2 (n=6), ATM (n=6), CDK12 (n=2), BRIP1 (n=2), BARD1 (n=1), RAD51 (n=1), and CHEK2 (n=1). Of 88 pts, 62 were treated in the 1st-line setting (median age 69; 27% women; 42% never smokers). Of these 62 pts, 42 received PBT and 20 received ICI. Deleterious GA were noted in ≥1 HRR gene in 24% and 10% of pts in each group, respectively. The ORR was 40% and 43% in PBT pts with and without GA in any HRR gene, respectively. Analysis showed median OS of 10.6 vs 14.3 months (p=0.11) and median PFS of 6.1 vs 7.9 months (p=0.05), and no difference in the rate of responders vs non-responders (p=0.53), for PBT pts with vs without GA in HRR genes. Analysis of ICI-treated pts was not feasible (only 2 had GA in HRR genes). Median TML was 8 and 10 in pts with available data treated with PBT and ICI, respectively. There was no correlation between TML and response to either 1st-line therapy (analysis underpowered). Of pts with GA in HRR genes, the one with the longest OS had 2 GA (CDK12; FANCA).

Conclusions: Deleterious GA in genes of the HRR pathway are frequent in mUC, consistent with TCGA and other datasets, but did not confer sensitivity to 1st-line PBT in our relatively small cohort. Further biomarker validation combined with LOH assessment can inform decision making and clinical trial designs.


Crisis ◽  
2018 ◽  
Vol 39 (1) ◽  
pp. 27-36 ◽  
Author(s):  
Kuan-Ying Lee ◽  
Chung-Yi Li ◽  
Kun-Chia Chang ◽  
Tsung-Hsueh Lu ◽  
Ying-Yeh Chen

Abstract.

Background: We investigated the age at exposure to parental suicide and the risk of subsequent suicide completion in young people. The impact of parental and offspring sex was also examined.

Method: Using a cohort study design, we linked Taiwan's Birth Registry (1978–1997) with Taiwan's Death Registry (1985–2009) and identified 40,249 children who had experienced maternal suicide (n = 14,431), paternal suicide (n = 26,887), or the suicide of both parents (n = 281). Each exposed child was matched to 10 children of the same sex and birth year whose parents were still alive, yielding a total of 398,081 children for the non-exposed cohort. A Cox proportional hazards model was used to compare the suicide risk of the exposed and non-exposed groups.

Results: Compared with the non-exposed group, offspring who were exposed to parental suicide were 3.91 times (95% confidence interval [CI] = 3.10–4.92) more likely to die by suicide after adjusting for baseline characteristics. The risk of suicide seemed to be lower in older male offspring (HR = 3.94, 95% CI = 2.57–6.06) but higher in older female offspring (HR = 5.30, 95% CI = 3.05–9.22). Stratified analyses based on parental sex revealed patterns similar to the combined analysis.

Limitations: As only register-based data were used, we were not able to explore the impact of variables not contained in the data set, such as the role of mental illness.

Conclusion: Our findings suggest a prominent elevation in the risk of suicide among offspring who lost their parents to suicide. The risk elevation differed according to the sex of the afflicted offspring as well as their age at exposure.
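
The 10:1 matching described in the Method can be sketched as below; the record structure and field names are hypothetical, for illustration only:

```python
import random

def match_controls(exposed, pool, k=10, seed=1):
    """For each exposed child, sample k unexposed children of the
    same sex and birth year (hypothetical record layout)."""
    rng = random.Random(seed)
    matched = {}
    for case in exposed:
        candidates = [p for p in pool
                      if p["sex"] == case["sex"]
                      and p["birth_year"] == case["birth_year"]]
        matched[case["id"]] = rng.sample(candidates, k)
    return matched

exposed = [{"id": 1, "sex": "F", "birth_year": 1990}]
pool = [{"id": i, "sex": "F", "birth_year": 1990} for i in range(2, 30)]
print(len(match_controls(exposed, pool)[1]))  # 10
```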


2020 ◽  
Vol 132 (4) ◽  
pp. 998-1005 ◽  
Author(s):  
Haihui Jiang ◽  
Yong Cui ◽  
Xiang Liu ◽  
Xiaohui Ren ◽  
Mingxiao Li ◽  
...  

OBJECTIVE: The aim of this study was to investigate the relationship between extent of resection (EOR) and survival in terms of clinical, molecular, and radiological factors in high-grade astrocytoma (HGA).

METHODS: Clinical and radiological data from 585 cases of molecularly defined HGA were reviewed. In each case, the EOR was evaluated twice: once according to contrast-enhanced T1-weighted images (CE-T1WI) and once according to fluid-attenuated inversion recovery (FLAIR) images. The ratio of the volume of the region of abnormality in FLAIR images to that in CE-T1WI (VFLAIR/VCE-T1WI) was calculated, and a receiver operating characteristic curve was used to determine the optimal cutoff value for that ratio. Univariate and multivariate analyses were performed to identify the prognostic value of each factor.

RESULTS: Both the EOR evaluated from CE-T1WI and the EOR evaluated from FLAIR divided the whole cohort into 4 subgroups with different survival outcomes (p < 0.001). Cases were stratified into 2 subtypes based on VFLAIR/VCE-T1WI with a cutoff of 10: a proliferation-dominant subtype and a diffusion-dominant subtype. Kaplan–Meier analysis showed a significant survival advantage for the proliferation-dominant subtype (p < 0.0001). The prognostic implication was further confirmed in a Cox proportional hazards model (HR 1.105, 95% CI 1.078–1.134, p < 0.0001). The survival of patients with proliferation-dominant HGA was significantly prolonged in association with extensive resection of the FLAIR abnormality region beyond the contrast-enhancing tumor (p = 0.03), while no survival benefit was observed in association with extensive resection in the diffusion-dominant subtype (p = 0.86).

CONCLUSIONS: VFLAIR/VCE-T1WI is an important classifier that divides HGA into 2 subtypes with distinct invasive features. Patients with proliferation-dominant HGA can benefit from extensive resection of the FLAIR abnormality region, which provides a theoretical basis for a personalized resection strategy.
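
ROC-based cutoff selection, as used above for the volume ratio, is commonly done by maximizing the Youden index (TPR - FPR); the paper does not state which criterion it used, so this is one plausible sketch on synthetic scores:

```python
import numpy as np

def youden_cutoff(scores, labels):
    """Return the threshold maximizing Youden's J = TPR - FPR."""
    best_t, best_j = None, -1.0
    for t in np.unique(scores):
        pred = scores >= t
        tpr = np.mean(pred[labels == 1])
        fpr = np.mean(pred[labels == 0])
        if tpr - fpr > best_j:
            best_t, best_j = t, tpr - fpr
    return best_t

# Synthetic volume ratios and subtype labels, for illustration only.
scores = np.array([2, 4, 6, 8, 9, 11, 12, 15, 20, 25], dtype=float)
labels = np.array([0, 0, 0, 0, 0, 1, 1, 1, 1, 1])
print(youden_cutoff(scores, labels))  # 11.0 (perfectly separating threshold)
```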


Risks ◽  
2021 ◽  
Vol 9 (6) ◽  
pp. 103
Author(s):  
Morne Joubert ◽  
Tanja Verster ◽  
Helgard Raubenheimer ◽  
Willem D. Schutte

Survival analysis is one technique that can be used to predict loss given default (LGD) for regulatory capital (Basel) purposes. When survival analysis is used to model LGD, one proposed methodology is the default weighted survival analysis (DWSA) method. This paper aims to adapt the DWSA method (used to model Basel LGD) to estimate LGD for International Financial Reporting Standard (IFRS) 9 impairment requirements. The DWSA methodology allows for over-recoveries, default weighting, and negative cashflows. For IFRS 9, the methodology must be adapted, as the estimated LGD is an input to the expected credit losses (ECL). Our proposed IFRS 9 LGD methodology uses survival analysis to estimate the LGD. The Cox proportional hazards model allows a baseline survival curve to be adjusted to produce survival curves for different segments of the portfolio. The forward-looking LGD values are adjusted for different macro-economic scenarios, and the ECL is calculated for each scenario. These ECL values are probability-weighted to produce a final ECL estimate. We illustrate our proposed IFRS 9 LGD methodology and ECL estimation on a dataset from a retail portfolio of a South African bank.
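
The final step above, probability-weighting per-scenario ECL values into one estimate, is simple arithmetic; the scenario names, probabilities, and amounts below are illustrative only:

```python
# Per-scenario ECL values and their assigned probabilities (illustrative).
scenarios = {
    "base":     (0.60, 1_000_000.0),   # (probability, ECL under scenario)
    "upside":   (0.15,   700_000.0),
    "downside": (0.25, 1_800_000.0),
}

# Probability-weighted final ECL estimate.
weighted_ecl = sum(p * ecl for p, ecl in scenarios.values())
print(round(weighted_ecl, 2))  # 1155000.0
```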


2021 ◽  
Vol 21 (1) ◽  
Author(s):  
Maryam Farhadian ◽  
Sahar Dehdar Karsidani ◽  
Azadeh Mozayanimonfared ◽  
Hossein Mahjub

Abstract

Background: Due to the limited number of studies with long-term follow-up of patients undergoing Percutaneous Coronary Intervention (PCI), we investigated the occurrence of Major Adverse Cardiac and Cerebrovascular Events (MACCE) during 10 years of follow-up after coronary angioplasty using Random Survival Forest (RSF) and Cox proportional hazards models.

Methods: This retrospective cohort study was performed on 220 patients (69 women and 151 men) who underwent coronary angioplasty from March 2009 to March 2012 at the Farchshian Medical Center in Hamadan, Iran. Survival time (months), the response variable, was measured from the date of angioplasty to the main endpoint or the end of the follow-up period (September 2019). To identify the factors influencing the occurrence of MACCE, the performance of the Cox and RSF models was compared in terms of the C index, Integrated Brier Score (IBS), and prediction error.

Results: Ninety-six patients (43.7%) experienced MACCE by the end of the follow-up period, and the median survival time was estimated to be 98 months. Survival decreased from 99% during the first year to 39% at 10 years of follow-up. The Cox model identified the following predictors: age (HR = 1.03, 95% CI 1.01–1.05), diabetes (HR = 2.17, 95% CI 1.29–3.66), smoking (HR = 2.41, 95% CI 1.46–3.98), and stent length (HR = 1.74, 95% CI 1.11–2.75). Predictive performance was slightly better for the RSF model (IBS of 0.124 vs. 0.135, C index of 0.648 vs. 0.626, and out-of-bag error rate of 0.352 vs. 0.374 for RSF). In addition to age, diabetes, smoking, and stent length, RSF also identified coronary artery disease (acute or chronic) and hyperlipidemia among the most important variables.

Conclusion: Machine-learning prediction models such as RSF showed better performance than the Cox proportional hazards model for the prediction of MACCE during long-term follow-up after PCI.
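
The C index used above to compare the Cox and RSF models is Harrell's concordance: the fraction of usable patient pairs in which the model assigns higher risk to the patient who fails earlier. A pure-Python sketch on synthetic data:

```python
import itertools

def c_index(times, events, risk_scores):
    """Harrell's concordance index for right-censored survival data."""
    concordant, permissible = 0.0, 0
    for i, j in itertools.combinations(range(len(times)), 2):
        if times[i] == times[j]:
            continue  # tied times skipped in this minimal sketch
        first = i if times[i] < times[j] else j
        if not events[first]:
            continue  # earlier time censored: pair not usable
        permissible += 1
        other = j if first == i else i
        if risk_scores[first] > risk_scores[other]:
            concordant += 1.0
        elif risk_scores[first] == risk_scores[other]:
            concordant += 0.5
    return concordant / permissible

times = [5, 10, 15, 20]        # synthetic follow-up times (months)
events = [1, 1, 0, 1]          # 1 = MACCE observed, 0 = censored
risks = [0.9, 0.7, 0.3, 0.1]   # model risk scores
print(c_index(times, events, risks))  # 1.0 (perfectly concordant)
```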


Author(s):  
Yuko Yamaguchi ◽  
Marta Zampino ◽  
Toshiko Tanaka ◽  
Stefania Bandinelli ◽  
Yusuke Osawa ◽  
...  

Abstract

Background: Anemia is common in older adults and associated with greater morbidity and mortality. The causes of anemia in older adults have not been completely characterized. Although elevated circulating growth and differentiation factor 15 (GDF-15) has been associated with anemia in older adults, it is not known whether elevated GDF-15 predicts the development of anemia.

Methods: We examined the relationship between plasma GDF-15 concentrations at baseline and incident anemia during 15 years of follow-up in 708 non-anemic adults, aged 60 years and older, participating in the Invecchiare in Chianti (InCHIANTI) Study.

Results: During follow-up, 179 (25.3%) participants developed anemia. The proportion of participants who developed anemia from the lowest to the highest quartile of plasma GDF-15 was 12.9%, 20.1%, 21.2%, and 45.8%, respectively. Adults in the highest quartile of plasma GDF-15 had an increased risk of developing anemia (hazard ratio 1.15, 95% confidence interval 1.09–1.21, P < .0001) compared with those in the lower three quartiles, in a multivariable Cox proportional hazards model adjusting for age, sex, serum iron, soluble transferrin receptor, ferritin, vitamin B12, congestive heart failure, diabetes mellitus, and cancer.

Conclusions: Circulating GDF-15 is an independent predictor of the development of anemia in older adults.
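
The exposure contrast above (top quartile vs. the lower three) starts with a quartile split of the baseline biomarker. A sketch on synthetic GDF-15 values (the distribution parameters are assumptions, not the study's data):

```python
import numpy as np

# Synthetic right-skewed biomarker values for 708 participants.
rng = np.random.default_rng(42)
gdf15 = rng.lognormal(mean=7.0, sigma=0.5, size=708)

# Flag the top quartile as the high-risk exposure group.
q75 = np.quantile(gdf15, 0.75)
top_quartile = gdf15 >= q75
print(round(top_quartile.mean(), 2))  # 0.25
```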

