Impact of radiotherapy duration on outcomes in patients with esophageal cancer treated with definitive concurrent radiotherapy and chemotherapy on RTOG trials 8501 and 9405.

2015 · Vol 33 (3_suppl) · pp. 119-119
Author(s): Christopher Leigh Hallemeier, Jennifer Moughan, Michael G. Haddock, Arnold M. Herskovic, Bruce D. Minsky, et al.

Background: Radiotherapy (RT) interruptions have a negative impact on outcomes in many epithelial malignancies treated with definitive RT. The purpose of this study was to analyze the impact of RT duration on outcomes in patients (pts) with esophageal cancer treated with definitive chemoradiotherapy (CRT). Methods: Pts treated with definitive CRT on RTOG trials 8501 and 9405 were included. Separate analyses were performed in pts receiving standard-dose (SD-CRT; 50 Gy + 5FU + cisplatin) and high-dose (HD-CRT; 64.8 Gy + 5FU + cisplatin) CRT. Local (LF) and regional (RF) failure were estimated by the cumulative incidence method. Disease-free (DFS) and overall (OS) survival were estimated by the Kaplan-Meier method. Univariate (UVA) and multivariate (MVA) Cox proportional hazards models were used to examine the association of RT duration (< vs. ≥ median) with LF, RF, DFS, and OS. Results: In the SD-CRT cohort (n=235), 96 pts (41%) had ≥ 1 RT interruption, for a median of 3 (IQR 1-6) days. The median RT duration was 39 (IQR 37-43) days. In UVA and MVA, RT duration was not associated with LF, RF, DFS, or OS. Estimated outcome rates are in the table. In the HD-CRT cohort (n=107), 64 pts (60%) had ≥ 1 RT interruption, for a median of 3.5 (IQR 2-7.5) days. The median RT duration was 52 (IQR 50-57) days. In UVA, RT duration ≥ 52 days was associated with a 33% reduction in the risk of DFS failure (HR=0.66, 95% CI [0.44-0.98], p=0.039) and a 29% reduction in the risk of death (HR=0.71, 95% CI [0.48-1.06], p=0.09). When the 25 pts with RT dose < 64.8 Gy were excluded, RT duration was not associated with DFS or OS. Conclusions: In pts with esophageal cancer receiving definitive SD-CRT, no association between RT duration and outcomes was observed. In pts receiving HD-CRT, longer RT duration was associated with improved DFS, which may have been driven by a substantial number of deaths among pts receiving an RT dose < 64.8 Gy. Supported by NCI U10 grants CA21661, CA180868, CA180822, CA37422. Clinical trial information: NCT00002631. [Table: see text]
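
The time-to-event machinery named here (cumulative incidence for LF/RF, Kaplan-Meier for DFS/OS, Cox models on RT duration dichotomized at the median) is standard; below is a minimal sketch using the Python lifelines library, not the trial's actual code. The file name and all column names (rt_days, lf_time, lf_event, os_time, os_event) are hypothetical.

```python
# Minimal sketch of the abstract's analysis pattern; all names are hypothetical.
import pandas as pd
from lifelines import AalenJohansenFitter, CoxPHFitter

df = pd.read_csv("sd_crt_cohort.csv")  # hypothetical per-patient table

# Cumulative incidence of local failure, treating death as a competing risk
# (event code: 1 = local failure, 2 = death without local failure, 0 = censored).
ajf = AalenJohansenFitter()
ajf.fit(df["lf_time"], df["lf_event"], event_of_interest=1)
print(ajf.cumulative_density_.tail())

# Dichotomize RT duration at the cohort median, then fit a Cox model for OS.
df["long_rt"] = (df["rt_days"] >= df["rt_days"].median()).astype(int)
cph = CoxPHFitter()
cph.fit(df[["os_time", "os_event", "long_rt"]],
        duration_col="os_time", event_col="os_event")
cph.print_summary()  # HR for long_rt with 95% CI and p-value
```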

Medicina · 2010 · Vol 46 (8) · pp. 516
Author(s): Asta Stankuvienė, Edita Žiginskienė, Vytautas Kuzminskis, Inga Bumblytė

Introduction. The question of how to target the dialysis dose has remained controversial since the beginning of the long-term dialysis treatment era, and it is still uncertain whether a higher dialysis dose is better. The aim of our study was to investigate dialysis dosing in Lithuania during the period of 1998–2005 and to determine associations between hemodialysis dose and survival of patients on chronic hemodialysis. Material and methods. We analyzed data of all patients who started hemodialysis due to end-stage renal disease in Lithuania between January 1, 1998, and December 31, 2005. Information about hemodialysis frequency, duration, and adequacy (according to Kt/V) was obtained from medical documentation. The overall survival rate was estimated using the Kaplan-Meier method. Survival comparisons were made using the log-rank or Breslow tests. Univariate Cox proportional hazards analysis was used to select variables significantly associated with the risk of death; these variables were then included in multivariate Cox proportional hazards models. Results. Of the 2428 patients who started chronic hemodialysis during the study period, 58.5% started hemodialysis three times a week. More than one-third (36.2%) were dialyzed twice weekly, and 5.3% started hemodialysis once weekly. Survival analysis revealed that patients dialyzed less than three times per week had shorter survival than patients receiving a higher dialysis dose. A total duration of HD sessions of ≤8 hours per week was an independent risk factor for mortality. A higher mean Kt/V was associated with better survival of patients on chronic hemodialysis. Conclusions. Dialysis frequency and weekly duration of HD sessions depended on HD accessibility in Lithuania during the period of 1998–2005. Better survival of patients on chronic hemodialysis was associated with a higher hemodialysis dose.
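
The univariate-screen-then-multivariable workflow described above is easy to reproduce; here is a minimal sketch with lifelines, assuming a hypothetical registry table with columns time, death, kt_v, hd_freq, weekly_hours, and age.

```python
# Sketch of the two-step (univariate then multivariable) Cox workflow;
# the DataFrame and its columns are hypothetical stand-ins for the registry data.
import pandas as pd
from lifelines import CoxPHFitter
from lifelines.statistics import logrank_test

df = pd.read_csv("hd_registry.csv")

# Log-rank comparison: <3 vs >=3 sessions per week.
lo = df[df["hd_freq"] < 3]
hi = df[df["hd_freq"] >= 3]
res = logrank_test(lo["time"], hi["time"],
                   event_observed_A=lo["death"], event_observed_B=hi["death"])
print("log-rank p =", res.p_value)
# Breslow (Gehan-Wilcoxon) variant, in recent lifelines versions:
# logrank_test(..., weightings="wilcoxon")

# Univariate screening: keep covariates with p < 0.05, then fit them jointly.
candidates = ["kt_v", "hd_freq", "weekly_hours", "age"]
selected = []
for var in candidates:
    cph = CoxPHFitter().fit(df[["time", "death", var]],
                            duration_col="time", event_col="death")
    if cph.summary.loc[var, "p"] < 0.05:
        selected.append(var)

mva = CoxPHFitter().fit(df[["time", "death"] + selected],
                        duration_col="time", event_col="death")
mva.print_summary()
```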


2020
Author(s): Jianbiao Xiao, Yi Ding, Lanwei Xu, Wei Wang, Fen Chen, et al.

Abstract Background The purpose of this study was to analyze the impact of post-operative radiotherapy on outcomes in patients with intracranial hemangiopericytoma (HPC). Materials and methods We retrospectively reviewed 66 intracranial HPC patients treated between 1999 and 2019, including 29 with surgery followed by radiotherapy (11 with intensity-modulated radiotherapy (IMRT) and 18 with stereotactic radiosurgery (SRS)) and 37 with surgery alone. The chi-squared test was used to compare clinical characteristics between the groups. The Kaplan-Meier method was used to analyze overall survival (OS) and recurrence-free survival (RFS). Multivariate Cox proportional hazards models were used to examine prognostic factors for survival. Results The crude local control rates were 58.6% in the surgery plus post-operative radiotherapy (PORT) group and 67.6% in the surgery-alone group (p = 0.453). In the subgroup analysis of the PORT patients, local control rates were 72.7% in the IMRT group and 50% in the SRS group (p = 0.228). The median OS was 122 months in the PORT group and 98 months in the surgery-alone group (p = 0.169). The median RFS was 96 months in the PORT group and 72 months in the surgery-alone group (p = 0.714). The median OS and RFS of the SRS group were not significantly better than those of the IMRT group (p = 0.256 and 0.960, respectively). The median RFS was 112 months for pathology grade II patients and 72 months for grade III patients (p = 0.001). Conclusion PORT improved neither local control nor survival. The local control rates after IMRT and SRS were similar even though the IMRT technique delivered a much higher biological dose than the SRS technique. PORT is not indicated for intracranial HPC patients with a complete resection margin.
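
For the two analysis steps named above (chi-squared comparison of group characteristics, Kaplan-Meier curves by pathology grade), a minimal sketch follows; the 2x2 counts and the columns grade, rfs_months, and recurrence are illustrative stand-ins, not the study's data.

```python
# Illustrative sketch: chi-squared test on a categorical characteristic, then
# Kaplan-Meier RFS by pathology grade. All counts and columns are hypothetical.
import pandas as pd
from scipy.stats import chi2_contingency
from lifelines import KaplanMeierFitter

# 2x2 table: rows = PORT vs surgery alone, columns = grade II vs grade III.
table = [[18, 11],
         [25, 12]]
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.3f}")

# RFS curves by grade from a hypothetical per-patient table.
df = pd.read_csv("hpc_cohort.csv")
kmf = KaplanMeierFitter()
for grade, sub in df.groupby("grade"):
    kmf.fit(sub["rfs_months"], sub["recurrence"], label=f"grade {grade}")
    print(f"grade {grade}: median RFS =", kmf.median_survival_time_)
```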


Circulation · 2014 · Vol 130 (suppl_2)
Author(s): Neil J. Wimmer, Kyle Smoot, Kelly Cho, Galina Sokolovskaya, David Gagnon, et al.

Introduction: Clopidogrel for up to 12 months after an acute coronary syndrome (ACS) lowers the risk of major adverse coronary events. Guidelines recommend clopidogrel for 12 months after percutaneous coronary intervention (PCI) with bare-metal (BMS) or drug-eluting (DES) stents. The value of prolonging clopidogrel beyond 12 months in PCI patients with ACS is unknown. Hypothesis: We hypothesized that prolonged clopidogrel differentially reduces adverse events in patients receiving DES compared with those receiving BMS for an ACS. Methods: We linked all PCIs in the national Veterans Affairs (VA) health system from 2002-2006 with the VA pharmacy database. VA and non-VA clinical outcomes were identified by ICD-9 codes from the VA and CMS/Medicare databases. All patients who were event-free at 12 months were followed for death, myocardial infarction (MI), and recurrent target vessel revascularization (TVR). The hazard ratio for prolonged clopidogrel use (more than 12 months) versus clopidogrel for 12 months or less was assessed for each outcome within each stent type using Cox proportional hazards models. Multivariable models and propensity models were also used to adjust for confounding related to the propensity for prolonged clopidogrel treatment. Results: After exclusions, 28,507 patients had PCI and were event-free at 12 months. Of these, 18,279 (64%) had an ACS at their index PCI, including 9,336 (51%) who received a DES and 8,447 (46%) who received prolonged clopidogrel. Events occurring up to 4 years after PCI in these ACS patients included 1,358 deaths, 1,885 death-or-MI events, and 1,335 TVRs. Prolonged clopidogrel was associated with lower risk of death (interaction p=0.04) and death or MI (interaction p=0.001) only in patients receiving DES (Table). Similar results were seen in multivariable and propensity models. Conclusions: Prolonging clopidogrel more than 12 months after PCI for ACS may lower the risk of death or MI in patients receiving DES but not in patients receiving BMS.
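
The interaction p-values reported above come from testing effect modification by stent type; a minimal sketch of such a model in lifelines follows, assuming hypothetical columns (time_after_landmark, death_or_mi, prolonged_clop, des, age, diabetes) and a lifelines version recent enough to accept a formula argument.

```python
# Sketch of a stent-by-treatment interaction test; names are hypothetical and
# the formula interface requires a recent lifelines release.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("pci_cohort.csv")  # one row per patient, landmarked at 12 months

cph = CoxPHFitter()
cph.fit(df, duration_col="time_after_landmark", event_col="death_or_mi",
        formula="prolonged_clop * des + age + diabetes")
cph.print_summary()  # the prolonged_clop:des row tests effect modification

# Within-stratum HRs can also be obtained by fitting separately per stent type.
for stent, sub in df.groupby("des"):
    m = CoxPHFitter().fit(sub, duration_col="time_after_landmark",
                          event_col="death_or_mi",
                          formula="prolonged_clop + age + diabetes")
    print("DES" if stent else "BMS", m.hazard_ratios_["prolonged_clop"])
```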


2020
Author(s): Guillermo Suarez-Cuartin, Merce Gasa, Guadalupe Bermudo, Yolanda Ruiz-Albert, Marta Hernandez-Argudo, et al.

Abstract Background: Many severe COVID-19 patients require respiratory support and monitoring. An intermediate respiratory care unit (IMCU) may be a valuable element for optimizing patient care and the management of limited health-care resources. We aimed to assess the impact of an IMCU on the management of severe COVID-19. Methods: Observational, retrospective study including patients admitted to the IMCU for COVID-19 pneumonia during March and April 2020. Patients were stratified based on their requirement of transfer to the intensive care unit (ICU) and on survival status at the end of follow-up. A multivariable Cox proportional hazards method was used to assess risk factors associated with mortality. Results: A total of 253 patients were included. Of them, 68% were male, and the median age was 65 years (IQR 18 years). Ninety-two patients (36.4%) required ICU transfer. Patients transferred to the ICU had a higher mortality rate (44.6% vs 24.2%; p<0.001). The multivariable proportional hazards model showed that age ≥65 years (HR 4.14; 95% CI 2.31-7.42; p<0.001), chronic respiratory conditions (HR 2.34; 95% CI 1.38-3.99; p=0.002), and chronic kidney disease (HR 2.96; 95% CI 1.61-5.43; p<0.001) were independently associated with mortality. High-dose systemic corticosteroids followed by progressive dose tapering were associated with a lower risk of death (HR 0.15; 95% CI 0.06-0.40; p<0.001). Conclusions: An IMCU allows severe COVID-19 patients requiring respiratory support and non-invasive monitoring to be managed safely and effectively, thereby reducing the ICU burden. Older age and chronic respiratory or renal conditions are associated with worse clinical outcomes, while treatment with systemic corticosteroids may have a protective effect on mortality.
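
Reporting adjusted HRs with 95% CIs, as above, amounts to exponentiating the Cox coefficients; a minimal sketch follows, with hypothetical columns (time, death, age_ge_65, chronic_resp, ckd, steroid_taper) and summary column labels as in recent lifelines releases.

```python
# Sketch of the multivariable Cox model and the HR (95% CI) read-out;
# all column names are hypothetical.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("imcu_cohort.csv")
covars = ["age_ge_65", "chronic_resp", "ckd", "steroid_taper"]

cph = CoxPHFitter()
cph.fit(df[["time", "death"] + covars], duration_col="time", event_col="death")

# Adjusted hazard ratios with 95% confidence intervals, one row per covariate.
out = cph.summary[["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%", "p"]]
print(out.round(3))
```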


2021 · pp. 1-9
Author(s): Leonard Naymagon, Douglas Tremblay, John Mascarenhas

Data supporting the use of etoposide-based therapy in hemophagocytic lymphohistiocytosis (HLH) arise largely from pediatric studies. There is a lack of comparable data among adult patients with secondary HLH. We conducted a retrospective study to assess the impact of etoposide-based therapy on outcomes in adult secondary HLH. The primary outcome was overall survival. The log-rank test was used to compare Kaplan-Meier distributions of time-to-event outcomes. Multivariable Cox proportional hazards modeling was used to estimate adjusted hazard ratios (HRs) with 95% confidence intervals (CIs). Ninety adults with secondary HLH seen between January 1, 2009, and January 6, 2020, were included. Forty-two patients (47%) received etoposide-based therapy, while 48 (53%) received treatment only for their inciting proinflammatory condition. Thirty-three patients in the etoposide group (72%) and 32 in the no-etoposide group (67%) died during follow-up. Median survival in the etoposide and no-etoposide groups was 1.04 and 1.39 months, respectively. There was no significant difference in survival between the etoposide and no-etoposide groups (log-rank p = 0.4146). On multivariable analysis, there was no association between treatment with etoposide and survival (HR for death with etoposide = 1.067, 95% CI: 0.633–1.799, p = 0.8084). Use of etoposide-based therapy was not associated with improvement in outcomes in this large cohort of adult secondary HLH patients.
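
A minimal sketch of the Kaplan-Meier comparison above, including medians with confidence bounds, using lifelines; the table and its columns (months, died, etoposide) are hypothetical.

```python
# Sketch of a two-group KM comparison with median survival and log-rank test;
# all names are hypothetical, with etoposide coded 0/1.
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test
from lifelines.utils import median_survival_times

df = pd.read_csv("hlh_cohort.csv")
groups = {name: sub for name, sub in df.groupby("etoposide")}

kmf = KaplanMeierFitter()
for name, sub in groups.items():
    kmf.fit(sub["months"], sub["died"], label=f"etoposide={name}")
    print(name, kmf.median_survival_time_,
          median_survival_times(kmf.confidence_interval_))

res = logrank_test(groups[1]["months"], groups[0]["months"],
                   event_observed_A=groups[1]["died"],
                   event_observed_B=groups[0]["died"])
print("log-rank p =", res.p_value)
```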


2020 · Vol 41 (Supplement_2)
Author(s): S. Kochav, R.C. Chen, J.M.D. Dizon, J.A.R. Reiffel

Abstract Background Theoretical concern exists regarding AV block (AVB) with class I antiarrhythmic drugs (AADs) when bundle branch block (BBB) is present. Whether this is substantiated in real-world populations is unknown. Purpose To determine the relationship between type of AAD and incidence of AVB in patients with preexisting BBB. Methods We retrospectively studied all patients with BBB who received class I and III AADs between 1997 and 2019 to compare the incidence of AVB. We defined index time as first exposure to either drug class and excluded patients with prior AVB or exposure to both classes. The time-at-risk window ended at the first outcome occurrence or when patients were no longer observed in the database. We estimated hazard ratios for incident AVB using Cox proportional hazards models with propensity score stratification, adjusting for over 32,000 covariates from the electronic health record. Kaplan-Meier methods were used to determine treatment effects over time. Results Of 40,120 individuals with BBB, 148 were exposed to a class I AAD and 2,401 to a class III AAD. Over nearly 4,200 person-years of follow-up, there were 22 and 620 outcome events in the class I and class III cohorts, respectively (Figure). In adjusted analyses, AVB risk was markedly lower in patients exposed to class I AADs compared with class III (HR 0.48 [95% CI 0.30–0.75]). Conclusion Among patients with BBB, exposure to class III AADs was strongly associated with a greater risk of incident AVB. This likely reflects differences in the natural history of patients receiving class I vs class III AADs rather than adverse class III effects; however, the lack of acutely worse outcomes with class I AADs suggests that they may be safer in BBB than suspected. Funding: none.
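
Propensity score stratification, as used above (though at a toy scale rather than 32,000 covariates), can be sketched as fitting a propensity model, binning the score into quintiles, and passing the bin as a stratum to the Cox fit; all column names below are hypothetical.

```python
# Sketch of propensity-score-stratified Cox regression; names are hypothetical.
import pandas as pd
from lifelines import CoxPHFitter
from sklearn.linear_model import LogisticRegression

df = pd.read_csv("bbb_cohort.csv")
features = ["age", "hf", "ckd", "beta_blocker"]  # stand-ins for EHR covariates

# 1. Propensity of receiving a class I (vs class III) AAD.
ps_model = LogisticRegression(max_iter=1000).fit(df[features], df["class_i"])
df["ps"] = ps_model.predict_proba(df[features])[:, 1]

# 2. Stratify on propensity quintiles and fit a Cox model within strata.
df["ps_stratum"] = pd.qcut(df["ps"], 5, labels=False)
cph = CoxPHFitter()
cph.fit(df[["time_to_avb", "avb", "class_i", "ps_stratum"]],
        duration_col="time_to_avb", event_col="avb", strata=["ps_stratum"])
cph.print_summary()  # HR for class_i, adjusted via stratification
```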


2021 · Vol 39 (15_suppl) · pp. 4142-4142
Author(s): Lucy Xiaolu Ma, Gun Ho Jang, Amy Zhang, Robert Edward Denroche, Anna Dodd, et al.

Background: KRAS mutations (KRASm) are present in over 90% of pancreatic adenocarcinomas (PDAC), with a predominance of G12 substitutions. KRAS wild-type (WT) PDAC relies on alternate oncogenic drivers, and the prognostic impact of these remains unknown. We evaluated alterations in WT PDAC and explored the impact of specific KRASm and WT status on survival. Methods: WGS and RNAseq were performed on 570 patients (pts) ascertained through our translational research program from 2012-2021, of whom 443 were included in overall survival (OS) analyses. This included 176 pts with resected and 267 pts with advanced PDAC enrolled on the COMPASS trial (NCT02750657). The latter cohort underwent biopsies prior to treatment with first-line gemcitabine-nab-paclitaxel or mFOLFIRINOX as per physician choice. The Kaplan-Meier and Cox proportional hazards methods were used to estimate OS. Results: KRAS WT PDAC (n = 52) represented 9% of pts, and these cases tended to be younger than pts with KRASm (median age 61 vs 65 years, p = 0.1). In resected cases, the most common alterations in WT PDAC (n = 23) included GNASm (n = 6) and BRAFm/fusions (n = 5). In advanced WT PDAC (n = 27), alterations in BRAF (n = 11) and ERBB2/3/4 (n = 6) were most prevalent. Oncogenic fusions (NTRK, NRG1, BRAF/RAF, ROS1, others) were identified in 9 pts. The BRAF in-frame deletion p.486_491del was the most common single variant in WT PDAC, with organoid profiling revealing sensitivity to both 3rd-generation BRAF inhibitors and MEK inhibition. In resected PDAC, multivariable analyses identified higher stage (p = 0.043), lack of adjuvant chemotherapy (p < 0.001), and the KRAS G12D variant (p = 0.004) as poor prognostic variables. In advanced disease, neither WT PDAC nor specific KRAS alleles had an impact on prognosis (median OS: WT = 8.5 months, G12D = 8.2, G12V = 10.0, G12R = 12.0, others = 9.2; p = 0.73); the basal-like RNA subtype conferred inferior OS (p < 0.001). A targeted therapeutic approach following first-line chemotherapy was undertaken in 10% of pts with advanced PDAC: MMRd (n = 1), homologous recombination deficiency (HRD) (n = 19), KRAS G12C (n = 1), CDK4/6 amplification (n = 3), ERBB family alterations (n = 2), and BRAF variants (n = 2). OS in this group was superior (14.7 vs 8.8 months, p = 0.04), mainly driven by HRD-PDAC, where KRASm were present in 89%. Conclusions: In our dataset, KRAS G12D was associated with inferior OS in resected PDAC; however, KRAS mutational status was not prognostic in advanced disease. This suggests that improved OS in the WT PDAC population can only be achieved if there is accelerated access to targeted drugs for pts.
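
The allele-level OS comparison above is a k-group log-rank test plus per-group Kaplan-Meier medians; a minimal sketch, with hypothetical columns os_months, death, and allele:

```python
# Sketch of a multi-group OS comparison across KRAS alleles; names hypothetical.
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import multivariate_logrank_test

df = pd.read_csv("pdac_advanced.csv")

res = multivariate_logrank_test(df["os_months"], df["allele"], df["death"])
print("k-group log-rank p =", res.p_value)

kmf = KaplanMeierFitter()
for allele, sub in df.groupby("allele"):  # e.g. WT, G12D, G12V, G12R, other
    kmf.fit(sub["os_months"], sub["death"], label=allele)
    print(allele, "median OS:", kmf.median_survival_time_)
```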


2021 · Vol 21 (1)
Author(s): Judy Tung, Musarrat Nahid, Mangala Rajan, Lia Logio

Abstract Background Academic medical centers invest considerably in faculty development efforts to support the career success and promotion of their faculty and to minimize faculty attrition. This study evaluated the impact of a faculty development program called the Leadership in Academic Medicine Program (LAMP) on participants' (1) self-ratings of efficacy, (2) promotion in academic rank, and (3) institutional retention. Method Participants from the 2013–2020 LAMP cohorts were surveyed pre- and post-program to assess their level of agreement with statements spanning the domains of self-awareness, self-efficacy, and satisfaction with work and work environment. Pre- and post-program responses were compared using McNemar's tests. Changes in scores across gender were compared using Wilcoxon rank-sum (Mann-Whitney) tests. LAMP participants were matched to nonparticipant controls by gender, rank, department, and time of hire to compare promotions in academic rank and departures from the organization. Kaplan-Meier curves and Cox proportional hazards models were used to examine differences. Results There were significant improvements in almost all self-ratings on program surveys (p < 0.05). The greatest improvements were seen in "understand the promotions process" (36% vs. 94%), "comfortable negotiating" (35% vs. 74%), and "time management" (55% vs. 92%). There were no statistically significant differences in improvements by gender; however, women faculty rated themselves lower than men on all pre-program items. A significant difference was found in time to next promotion (p = 0.003) between LAMP participants and controls. Kaplan-Meier analysis demonstrated that LAMP faculty achieved their next promotion more often and faster than controls. Cox proportional hazards analyses found that LAMP faculty were 61% more likely to be promoted than controls (hazard ratio [HR] 1.61, 95% confidence interval [CI] 1.16–2.23, p = 0.004). A significant difference was also found in time to departure (p < 0.0001), with LAMP faculty retained more often and for longer periods: LAMP faculty were 77% less likely to leave than controls (HR 0.23, 95% CI 0.16–0.34, p < 0.0001). Conclusions LAMP is an effective faculty development program as measured subjectively by participant self-ratings and objectively through comparative improvements in academic promotions and institutional retention.
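
McNemar's test, used above for the paired pre/post agreement items, compares the discordant cells of a paired 2x2 table; a minimal sketch with illustrative counts (not the study's data):

```python
# Sketch of a paired pre/post comparison via McNemar's test; counts are
# illustrative only.
from statsmodels.stats.contingency_tables import mcnemar

# Rows: pre-program agree/disagree; columns: post-program agree/disagree.
table = [[30, 2],
         [40, 28]]  # 40 participants moved from disagree to agree
result = mcnemar(table, exact=True)  # exact binomial test on discordant pairs
print(result.statistic, result.pvalue)
```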


2017 · Vol 117 (06) · pp. 1072-1082
Author(s): Xiaoyan Li, Steve Deitelzweig, Allison Keshishian, Melissa Hamilton, Ruslan Horblyuk, et al.

Summary: The ARISTOTLE trial showed a risk reduction of stroke/systemic embolism (SE) and major bleeding in non-valvular atrial fibrillation (NVAF) patients treated with apixaban compared to warfarin. This retrospective study used four large US claims databases (MarketScan, PharMetrics, Optum, and Humana) of NVAF patients newly initiating apixaban or warfarin from January 1, 2013 to September 30, 2015. After 1:1 warfarin-apixaban propensity score matching (PSM) within each database, the resulting patient records were pooled. Kaplan-Meier curves and Cox proportional hazards models were used to estimate the cumulative incidence and hazard ratios (HRs) of stroke/SE and major bleeding (identified using the first listed diagnosis of inpatient claims) within one year of therapy initiation. The study included a total of 76,940 patients (38,470 warfarin and 38,470 apixaban). Among the 38,470 matched pairs, 14,563 were from MarketScan, 7,683 from PharMetrics, 7,894 from Optum, and 8,330 from Humana. Baseline characteristics were balanced between the two cohorts, with a mean (standard deviation [SD]) age of 71 (12) years and a mean (SD) CHA2DS2-VASc score of 3.2 (1.7). Apixaban initiators had a significantly lower risk of stroke/SE (HR: 0.67, 95% CI: 0.59–0.76) and major bleeding (HR: 0.60, 95% CI: 0.54–0.65) than warfarin initiators. The individual types of stroke/SE and major bleeding – ischaemic stroke, haemorrhagic stroke, SE, intracranial haemorrhage, gastrointestinal bleeding, and other major bleeding – were all significantly lower for apixaban compared to warfarin treatment. Subgroup analyses (by apixaban dosage, age stratum, CHA2DS2-VASc or HAS-BLED score stratum, or dataset source) all showed consistently lower risks of stroke/SE and major bleeding associated with apixaban compared to warfarin treatment. This is the largest "real-world" study on apixaban effectiveness and safety to date, showing that apixaban initiation was associated with significant risk reductions in stroke/SE and major bleeding compared to warfarin initiation after PSM. These benefits were consistent across various high-risk subgroups and both the standard- and low-dose apixaban regimens.
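
The 1:1 PSM step above is typically implemented as nearest-neighbor matching on the logit of the propensity score; a minimal sketch follows (greedy matching, with the without-replacement bookkeeping omitted for brevity), using hypothetical column names.

```python
# Sketch of 1:1 nearest-neighbor propensity matching on the logit scale;
# greedy, without-replacement bookkeeping omitted. Names are hypothetical.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

df = pd.read_csv("nvaf_claims.csv")
features = ["age", "chads_vasc", "has_bled", "prior_stroke"]

model = LogisticRegression(max_iter=1000).fit(df[features], df["apixaban"])
p = model.predict_proba(df[features])[:, 1]
df["logit_ps"] = np.log(p / (1 - p))

treated = df[df["apixaban"] == 1]
control = df[df["apixaban"] == 0]

# Match each apixaban initiator to the nearest warfarin initiator.
nn = NearestNeighbors(n_neighbors=1).fit(control[["logit_ps"]])
_, idx = nn.kneighbors(treated[["logit_ps"]])
matched = pd.concat([treated, control.iloc[idx.ravel()]])
print(len(matched) // 2, "matched pairs")
```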


2019 · Vol 2019 · pp. 1-9
Author(s): Si-wei Pan, Peng-liang Wang, Han-wei Huang, Lei Luo, Xin Wang, et al.

Background. In gastric cancer, various surveillance strategies are suggested by international guidelines. The current study was intended to evaluate these strategies and to provide more individualized follow-up proposals for personalized cancer medicine. Materials and Methods. In aggregate, 9,191 patients with gastric cancer after gastrectomy from 1998 to 2009 were selected from the Surveillance, Epidemiology, and End Results database. Disease-specific survival was analyzed by the Kaplan-Meier method and the log-rank test. Cox proportional hazards regression analyses were used to confirm the independent prognostic factors. In addition, hazard ratio (HR) curves were used to compare the risk of death over time. Conditional survival (CS) was applied to dynamically assess prognosis after each follow-up. Results. Comparisons of HR curves across stages showed that earlier stages had distinctly lower HRs than advanced stages. The curve for stage IIA was flat and closely resembled that of stage I, while the curve for stage IIB resembled that of stage III, with an obvious peak. After estimating CS at intervals of three months, six months, and 12 months in different periods, stages I and IIA had high levels of CS throughout, while there were visible differences among the CS levels of stages IIB and III. Conclusions. The frequency of follow-up for early stages, such as stages I and IIA, could be every six months or longer in the first three years and annually thereafter. Patients with unfavorable conditions, such as stages IIB and III, should be followed up more frequently and thoroughly than usual.
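
Conditional survival, as used above, is the probability of surviving an additional t months given survival to s months, CS(t | s) = S(s + t) / S(s); a minimal sketch computing it from a fitted Kaplan-Meier curve follows, with hypothetical columns months and cancer_death.

```python
# Sketch of conditional survival from a KM estimate; names are hypothetical.
import pandas as pd
from lifelines import KaplanMeierFitter

df = pd.read_csv("seer_gastric.csv")
kmf = KaplanMeierFitter().fit(df["months"], df["cancer_death"])

def conditional_survival(kmf, s, t):
    """P(alive at s + t | alive at s), from the KM step function."""
    return float(kmf.predict(s + t)) / float(kmf.predict(s))

# e.g. probability of surviving 12 more months after each completed year:
for s in (12, 24, 36):
    print(s, round(conditional_survival(kmf, s, 12), 3))
```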

