1082. Meta-Analysis of Survival Outcomes in People Who Inject Drugs After Cardiac Surgery for Infective Endocarditis

2018, Vol 5 (suppl_1), pp. S323-S324
Author(s): David Goodman-Meza, Robert E Weiss, Sebastián Gamboa, Abel Gallegos, Raphael J Landovitz, ...

Abstract

Background: The United States' opioid epidemic has led to an increase in people who inject drugs (PWID) and opioid-associated infections, including infective endocarditis (IE). Cardiac surgery is often indicated in IE to improve outcomes but is controversial in PWID because of concerns that continued injection drug use leads to reinfection and decreased survival. In response, we assessed long-term survival after cardiac valve surgery in PWID compared with people who do not inject drugs (non-PWID) in the published literature.

Methods: We performed a systematic review and meta-analysis (MA) of studies that reported survival data after surgery for IE in PWID. We searched PUBMED up to April 2018 and extracted Kaplan–Meier (KM) curves from included studies. From the KM curves, we used an algorithm to estimate individual participant data (eIPD). In a one-step approach, we fitted a Cox proportional hazards (CPH) model to the eIPD with study-level random effects. In a two-step approach, we fitted CPH models to each study separately and then ran a mixed-effects MA model on the log hazard ratios (HR) and their standard errors.

Results: We identified 11 retrospective studies. Of these, six reported comparisons of PWID vs. non-PWID, and five reported results for PWID only. Based on eIPD, we included 407 PWID and 1,877 non-PWID. Mean age was 36.7 years (95% CI 34.4–39.1) for PWID and 52.0 years (95% CI 45.3–59.4) for non-PWID. There were 144 deaths (35.3%) in PWID and 559 (29.8%) in non-PWID. We present KM curves of the eIPD by study and by group (Figures 1 and 2). In the one-step MA (all 11 studies), the HR for PWID was 1.13 (95% CI 0.92–1.39). In the two-step MA (the six comparison studies), heterogeneity was high (I² = 72%) and there was no significant between-group difference (HR 1.29, 95% CI 0.80–2.07) (Figure 3).

Conclusion: Post-surgery survival time in PWID was similar to that in non-PWID. These estimates are concerning, as PWID are on average much younger than non-PWID with IE. Future studies should explore interventions to improve outcomes in PWID after surgery, including treatment of addiction during and after the index hospitalization and provision of naloxone at discharge.

Disclosures: All authors: No reported disclosures.
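
The two-step approach described above can be sketched in code: fit a Cox model within each comparison study, then pool the per-study log hazard ratios with a DerSimonian–Laird random-effects model. This is a minimal illustration, not the authors' code; the DataFrame columns (`time`, `event`, `pwid`, `study`) are assumed names for the reconstructed eIPD.

```python
# Minimal sketch of the two-step meta-analysis: per-study Cox fits,
# then DerSimonian-Laird random-effects pooling of the log hazard ratios.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

def two_step_meta(df: pd.DataFrame) -> dict:
    """df: eIPD from the comparison studies; assumed columns time, event, pwid (0/1), study."""
    log_hrs, variances = [], []
    for _, study_df in df.groupby("study"):
        cph = CoxPHFitter()
        cph.fit(study_df[["time", "event", "pwid"]],
                duration_col="time", event_col="event")
        log_hrs.append(cph.params_["pwid"])                  # log HR, PWID vs. non-PWID
        variances.append(cph.standard_errors_["pwid"] ** 2)

    y, v = np.asarray(log_hrs), np.asarray(variances)
    w = 1.0 / v                                              # inverse-variance (fixed-effect) weights
    q = np.sum(w * (y - np.sum(w * y) / np.sum(w)) ** 2)     # Cochran's Q
    k = len(y)
    tau2 = max(0.0, (q - (k - 1)) / (np.sum(w) - np.sum(w ** 2) / np.sum(w)))
    w_re = 1.0 / (v + tau2)                                  # random-effects weights
    pooled = np.sum(w_re * y) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    i2 = max(0.0, (q - (k - 1)) / q) * 100 if q > 0 else 0.0
    return {"HR": np.exp(pooled),
            "CI": (np.exp(pooled - 1.96 * se), np.exp(pooled + 1.96 * se)),
            "I2_percent": i2}
```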

2019
Author(s): David Goodman-Meza, Robert E Weiss, Sebastian Gamboa, Abel Gallegos, Alex AT Bui, ...

Abstract

Background: In the United States, the number of infective endocarditis (IE) cases associated with injection drug use has increased. Clinical guidelines suggest deferring surgery for IE in people who inject drugs (PWID) due to a concern for worse outcomes compared with non-injectors (non-PWID). We performed a systematic review and meta-analysis of long-term outcomes in PWID who underwent cardiac surgery and compared these outcomes to non-PWID.

Methods: We systematically searched for studies reported between 1965 and 2018. We used an algorithm to estimate individual patient data (eIPD) from Kaplan-Meier (KM) curves and combined it with published individual patient data (IPD) to analyze long-term outcomes after cardiac surgery for IE in PWID. Our primary outcome was survival. Secondary outcomes were reoperation and mortality at 30 days, one, five, and 10 years. Random-effects Cox regression was used to estimate survival.

Results: We included 27 studies in the systematic review, of which 19 provided data (KM or IPD) for the meta-analysis. PWID were younger and more likely to have S. aureus infection than non-PWID. Survival at 30 days, one, five, and 10 years was 94.3%, 81.0%, 62.1%, and 56.6% in PWID, respectively, and 96.4%, 85.0%, 70.3%, and 63.4% in non-PWID. PWID had a 47% greater hazard of death (HR 1.47, 95% CI, 1.05-2.05) and more than twice the hazard of reoperation (HR 2.37, 95% CI, 1.25-4.50) compared with non-PWID.

Conclusion: PWID were younger and had shorter survival than non-PWID. Implementing evidence-based interventions and testing new modalities are urgently needed to improve outcomes in PWID after cardiac surgery.
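
A minimal sketch of how landmark survival figures such as those above (30 days, one, five, and 10 years) can be read off Kaplan–Meier fits by group, assuming the pooled eIPD/IPD sit in a DataFrame with hypothetical columns `time_years`, `death`, and `group`:

```python
# Read survival at fixed landmarks from a Kaplan-Meier fit, by group.
import pandas as pd
from lifelines import KaplanMeierFitter

def survival_at_landmarks(df: pd.DataFrame) -> pd.DataFrame:
    """df columns (assumed): time_years, death (0/1), group ('PWID'/'non-PWID')."""
    landmarks = [30 / 365.25, 1, 5, 10]
    rows = {}
    for name, g in df.groupby("group"):
        kmf = KaplanMeierFitter()
        kmf.fit(g["time_years"], event_observed=g["death"], label=name)
        rows[name] = kmf.survival_function_at_times(landmarks).values
    return pd.DataFrame(rows, index=["30 days", "1 year", "5 years", "10 years"])
```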


Rare Tumors, 2009, Vol 1 (2), pp. 159-163
Author(s): Jennifer L Beebe-Dimmer, Karynsa Cetin, Jon P Fryzek, Scott M Schuetze, Kendra Schwartz

Malignant giant cell tumor (GCT) of bone is a rare tumor with debilitating consequences. Patients with GCT of bone typically present with mechanical difficulty and pain as a result of bone destruction and are at increased risk for fracture. Because of its unusual occurrence, little is known about the epidemiology of malignant GCT of bone. This report offers the first reliable population-based estimates of incidence, patient demographics, treatment course, and survival for malignancy in GCT of bone in the United States. Using data from the National Cancer Institute's Surveillance, Epidemiology and End Results (SEER) program, we estimated the overall incidence and determinants of survival among patients diagnosed with malignant GCT of bone from 1975–2004. Cox proportional hazards regression was used to evaluate demographic and clinical determinants of survival among malignant GCT cases. Based on analyses of 117 malignant GCT cases, the estimated incidence in the United States was 1.6 per 10,000,000 persons per year. Incidence was highest among adults aged 20 to 44 years (2.4 per 10,000,000 per year), and most patients were diagnosed with localized (31.6%) or regional (29.9%) disease, compared with 16.2% presenting with distant disease. Approximately 85% of patients survived at least 5 years, with survival poorest among older patients and those with evidence of distant metastases at the time of diagnosis. The current study represents the largest systematic investigation examining the occurrence and distribution of malignancy in GCT of bone in the general U.S. population. We confirm its rare occurrence and suggest that age and stage at diagnosis are strongly associated with long-term survival.
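
As a worked check of the arithmetic behind a rate expressed per 10,000,000 person-years: with the 117 cases reported, a crude rate of about 1.6 per 10,000,000 implies a denominator on the order of 7×10⁸ person-years. The person-time figure below is illustrative only, not a SEER value.

```python
# Back-of-the-envelope incidence calculation of the kind used for registry rates.
cases = 117                      # malignant GCT cases, 1975-2004 (from the abstract)
person_years = 731_000_000       # hypothetical population-time at risk covered by the registries
rate_per_10_million = cases / person_years * 10_000_000
print(f"{rate_per_10_million:.1f} per 10,000,000 person-years")  # ~1.6
```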


2021, pp. 152692482110027
Author(s): Robert M. Shavelle, Ji Hun Kwak, Rachel Saur, Jordan C. Brooks, Philip Rosenthal

Background: Hepatocellular carcinoma (HCC) typically occurs with underlying cirrhosis; however, roughly 20% of cases arise in a non-cirrhotic liver. The literature addressing long-term survival in the narrow subgroup of these patients who receive transplantation is limited. For such patients we sought to calculate life expectancies both at the time of transplant and several years later, stratified by key risk factors, and to determine whether survival has improved in recent years. Such information can be helpful in making treatment decisions.

Methods: Data on 4,373 non-cirrhotic HCC patients who underwent liver transplantation in the MELD era (2002-2018), drawn from the United States OPTN database, were analyzed using Cox proportional hazards regression and life table methods.

Results: Demographic and past medical history factors related to survival were patient age, donor age over 20 years, and the presence of ascites or severe hepatic encephalopathy. Survival did not vary by race or sex. HCC-specific factors significantly related to survival were the total number of tumors, extrahepatic spread, lymph node involvement, satellite lesions, micro- or macrovascular invasion, tumor differentiation (grade), and pre-transplant treatment. Survival improved over the study period, at 4% per calendar year during the first 5 years post transplant and 1% per year thereafter.

Conclusions: Life expectancy in non-cirrhotic HCC transplant patients is much reduced from normal and varies according to age and tumor-related factors. Survival improved modestly over the study period.
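
A sketch of how a per-calendar-year improvement such as the reported 4% can be extracted from a Cox model with transplant year as a covariate. The column names (`years_post_tx`, `died`, `tx_year`, `age`, `donor_age`) are assumptions, not OPTN field names.

```python
# Estimate the per-calendar-year change in hazard during the first 5 post-transplant years.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

def improvement_per_year(df: pd.DataFrame) -> float:
    """df columns (assumed): years_post_tx (follow-up, censored at 5 years),
    died (0/1), tx_year (calendar year of transplant), age, donor_age."""
    cph = CoxPHFitter()
    cph.fit(df[["years_post_tx", "died", "tx_year", "age", "donor_age"]],
            duration_col="years_post_tx", event_col="died")
    beta = cph.params_["tx_year"]          # log hazard ratio per calendar year
    return (1.0 - np.exp(beta)) * 100.0    # percent lower hazard per year, e.g. ~4%
```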


Circulation, 2007, Vol 116 (suppl_16)
Author(s): Billie Jean Martin, Dimitri Kalavrouziotis, Roger Baskett

Introduction: While there are rigorous assessments of trainees' knowledge through formal examinations, objective assessments of technical skills are not available. Little is known about the safety of allowing resident trainees to perform cardiac surgical operations.

Methods: Peri-operative data were prospectively collected on all patients who underwent coronary artery bypass grafting (CABG), aortic valve replacement (AVR), or a combined procedure between 1998 and 2005. Teaching cases were identified from resident records and defined as cases in which the resident performed the operation skin to skin. Pre-operative characteristics were compared between teaching and non-teaching cases. Short-term adverse events were defined as a composite of in-hospital mortality, stroke, intra- or post-operative intra-aortic balloon pump (IABP) insertion, myocardial infarction, renal failure, wound infection, sepsis, or return to the operating room. Intermediate adverse outcomes were defined as hospital readmission for any cardiac disease or late mortality. Logistic regression and Cox proportional hazards models were used to adjust for differences in age, acuity, and medical co-morbidities. Outcomes were compared between teaching and non-teaching cases.

Results: 6,929 cases were included, 895 of which were identified as teaching cases. Teaching cases were more likely to have an EF<40%, a pre-operative IABP, or CHF, and to be combined CABG/AVR or total arterial grafting cases (all p<0.01). However, being a teaching case was not a predictor of in-hospital mortality (OR=1.02, 95% CI 0.67–1.55) or of the composite short-term outcome (OR=0.97, 95% CI 0.75–1.24). Kaplan-Meier event-free survival of staff and teaching cases was similar at 1, 3, and 5 years: 80% vs. 78%, 67% vs. 66%, and 58% vs. 55% (log-rank p=0.06). Cox proportional hazards regression modeling did not identify teaching case as a predictor of late death or re-hospitalization (HR=1.05, 95% CI 0.94–1.18).

Conclusions: Teaching cases had greater acuity and complexity than non-teaching cases. Despite this, teaching cases did no worse than staff cases in the short or intermediate term. Allowing residents to perform cardiac surgery does not appear to adversely affect patient outcomes.
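
A sketch of the two comparisons described in the Methods: an adjusted odds ratio for the composite short-term outcome via logistic regression, and a log-rank comparison of event-free survival. The column names are illustrative placeholders, not the registry's variables.

```python
# Adjusted odds ratio for the composite outcome, plus an unadjusted log-rank test.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from lifelines.statistics import logrank_test

def compare_teaching_cases(df: pd.DataFrame) -> None:
    # Logistic regression for the composite short-term outcome
    X = sm.add_constant(df[["teaching_case", "age", "ef_lt_40", "preop_iabp"]])
    fit = sm.Logit(df["composite_short_term"], X).fit(disp=0)
    or_teaching = np.exp(fit.params["teaching_case"])
    ci = np.exp(fit.conf_int().loc["teaching_case"])
    print(f"OR teaching vs. staff: {or_teaching:.2f} (95% CI {ci[0]:.2f}-{ci[1]:.2f})")

    # Log-rank comparison of event-free survival between teaching and staff cases
    t, s = df[df["teaching_case"] == 1], df[df["teaching_case"] == 0]
    res = logrank_test(t["efs_years"], s["efs_years"],
                       event_observed_A=t["event"], event_observed_B=s["event"])
    print(f"log-rank p = {res.p_value:.2f}")
```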


2021, Vol 39 (15_suppl), pp. 4530-4530
Author(s): Sarah Fleming, Dina Gifkins, Waleed Shalaby, Jianjun Gao, Philip Rosenberg, ...

4530 Background: FGFR alterations (FGFRa) appear in approximately 15% of cases of metastatic urothelial carcinoma (mUC). Data on whether FGFRa in mUC have a prognostic impact or a predictive benefit for particular treatments have been limited by small sample sizes. The objective of this study was to evaluate the association between tumor FGFRa and clinical outcomes of patients with advanced UC or mUC regardless of therapy type and status.

Methods: A convenience sample of oncologists and urologists across the United States provided patient-level data on 400 patients with stage IIIb or IV UC via a standardized questionnaire over a 1-month period (August 17, 2020 – September 20, 2020). The study design enriched for FGFRa by requiring physicians to provide ≥1 FGFRa patient record. The questionnaire included physician characteristics, patient demographic information, FGFR status, therapy given, response, and clinical and radiographic measures of progression. Patient records were eligible for inclusion if the patient was identified and treated between July 1, 2017, and June 30, 2019. Cox proportional hazards models were used to estimate the adjusted risk of disease progression by FGFR status.

Results: A total of 104 physicians (58.7% medical oncologists, 31.7% hematologic oncologists, and 9.6% urologic oncologists) contributed 414 patient records. Overall, 73.9% of the patients were male and the average age was 64.5 years (SD ±10.6). Median follow-up was 15 months. Of the 414 patients, 218 (52.7%) had FGFRa and 196 (47.3%) had FGFR wild-type (FGFRwt) mUC. Of the 218 patients with FGFRa, 47.2% were treated with front-line chemotherapy, 27.5% with a programmed death-ligand 1 (PD-L1) inhibitor, 11.5% with chemotherapy + PD-L1, and 13.8% with other treatments. Of the 196 FGFRwt patients, 63.2% were treated with front-line chemotherapy, 21.9% with PD-L1, 12.2% with chemotherapy + PD-L1, and 2.6% with other treatments. There was no difference in response or progression status for those receiving front-line chemotherapy (HR, 1.15; 95% CI, 0.86-1.55). Among the 97 patients (55 FGFRa and 42 FGFRwt) who received PD-L1 alone as front-line therapy, those with FGFRa had an adjusted risk of progression twice as high as their FGFRwt counterparts (HR, 2.12; 95% CI, 1.13-4.00).

Conclusions: Patients with FGFRa mUC progressed earlier than FGFRwt patients when treated with front-line PD-L1 inhibitors; however, there was no difference in progression by FGFR status in patients treated with chemotherapy. This real-world study using a survey design efficiently generated a relatively large FGFRa dataset, mitigating a core limitation of other studies assessing the FGFRa patient population. Further work is warranted to validate these results and determine the optimal strategy for treating patients with FGFRa mUC. Gene expression profiling of FGFRa mUC samples from clinical trials will help determine the potential impact of subtype or other features that may be associated with benefit from therapy.
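
A sketch of the subgroup analysis reported for front-line PD-L1 monotherapy: restrict to that subgroup, then fit a Cox model for progression with FGFR status and a couple of adjusters. Column names are assumptions for illustration only.

```python
# Subgroup Cox model: risk of progression by FGFR status among PD-L1-treated patients.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

def fgfr_progression_hr(df: pd.DataFrame) -> tuple:
    """df columns (assumed): ttp_months, progressed (0/1), fgfr_altered (0/1),
    frontline ('chemo', 'pd_l1', 'chemo_pd_l1', 'other'), age, male (0/1)."""
    sub = df[df["frontline"] == "pd_l1"]
    cph = CoxPHFitter()
    cph.fit(sub[["ttp_months", "progressed", "fgfr_altered", "age", "male"]],
            duration_col="ttp_months", event_col="progressed")
    hr = np.exp(cph.params_["fgfr_altered"])
    lo, hi = np.exp(cph.confidence_intervals_.loc["fgfr_altered"])
    return hr, (lo, hi)
```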


Circulation, 2020, Vol 142 (Suppl_3)
Author(s): Samuel T Kim, Mark R Helmers, Peter Altshuler, Amit Iyengar, Jason Han, ...

Introduction: Although guidelines for heart transplantation currently recommend against donors weighing ≥30% less than the recipient, recent studies have shown that the detriment of under-sizing may not be as severe in obese recipients. Furthermore, predicted heart mass (PHM) has been shown to be more reliable for size matching than metrics such as weight and body surface area. In this study, we use PHM to characterize the effects of undersized heart transplantation (UHT) in obese vs. non-obese recipients.

Methods: Retrospective analysis of the UNOS database was performed for heart transplants from January 1995 to September 2020. Recipients were stratified into obese (BMI ≥ 30) and non-obese (18.5 ≤ BMI < 30) groups. Undersized donors were defined as having a PHM ≥ 20% less than the recipient PHM. The obese and non-obese populations separately underwent propensity score matching, and Kaplan-Meier estimates were used to graph survival. Multivariable Cox proportional hazards analyses were used to adjust for confounders and estimate the hazard ratio for death attributable to under-sizing.

Results: Overall, 50,722 heart transplants were included in the analysis. Propensity score matching resulted in 2,214 and 1,011 well-matched pairs for the non-obese and obese populations, respectively. UHT in non-obese recipients resulted in similar 30-day mortality (5.7% vs. 6.3%, p = 0.38) but worse 15-year survival (38% vs. 35%, p = 0.04). In contrast, obese recipients with UHT had similar 30-day mortality (6.4% vs. 5.5%, p = 0.45) and slightly increased 15-year survival (31% vs. 35%, p = 0.04). Multivariable Cox analysis showed that UHT carried an adjusted hazard ratio of 1.08 (95% CI 1.01-1.16) in non-obese recipients and 0.87 (95% CI 0.78-0.98) in obese recipients.

Conclusions: Non-obese patients with UHT had worse long-term survival, while obese patients with UHT had slightly better survival. These findings may warrant reevaluation of the current size criteria for obese patients awaiting a heart.
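
A minimal sketch of 1:1 nearest-neighbor propensity-score matching with a caliper, of the general kind used to pair undersized with size-matched transplants within each BMI stratum. It matches with replacement for brevity and uses placeholder column names; it is not the authors' pipeline.

```python
# Generic propensity-score matching sketch: logistic propensity model,
# then nearest-neighbor matching on the logit with a caliper.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

def match_undersized(df: pd.DataFrame, covariates: list, caliper: float = 0.2) -> pd.DataFrame:
    """df columns (assumed): undersized (0/1) plus the listed covariates."""
    ps = LogisticRegression(max_iter=1000).fit(df[covariates], df["undersized"])
    df = df.assign(logit_ps=ps.decision_function(df[covariates]))  # log-odds of being undersized
    treated = df[df["undersized"] == 1]
    control = df[df["undersized"] == 0]
    nn = NearestNeighbors(n_neighbors=1).fit(control[["logit_ps"]])
    dist, idx = nn.kneighbors(treated[["logit_ps"]])
    cal = caliper * df["logit_ps"].std()                 # caliper as a fraction of the logit SD
    keep = dist.ravel() <= cal
    # Matching with replacement for brevity; a production analysis would match without replacement.
    return pd.concat([treated[keep], control.iloc[idx.ravel()[keep]]])
```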


2018, Vol 5 (suppl_1), pp. S426-S426
Author(s): Christopher M Rubino, Lukas Stulik, Harald Rouha, Zehra Visram, Adriana Badarau, ...

Abstract

Background: ASN100 is a combination of two co-administered fully human monoclonal antibodies (mAbs), ASN-1 and ASN-2, that together neutralize the six cytotoxins critical to S. aureus pneumonia pathogenesis. ASN100 is in development for prevention of S. aureus pneumonia in mechanically ventilated patients. A pharmacometric approach to dose discrimination in humans was taken in order to bridge from dose-ranging survival studies in rabbits to anticipated human exposures, using an mPBPK model derived from data from rabbits (infected and noninfected) and noninfected humans [IDWeek 2017, Poster 1849]. Survival in rabbits was assumed to be indicative of a protective effect through ASN100 neutralization of S. aureus toxins.

Methods: Data from studies in rabbits (placebo through 20 mg/kg single doses of ASN100; four strains representing MRSA and MSSA isolates with different toxin profiles) were pooled with data from a PK and efficacy study in infected rabbits (placebo and 40 mg/kg ASN100) [IDWeek 2017, Poster 1844]. A Cox proportional hazards model was used to relate survival to both strain and mAb exposure. Monte Carlo simulation was then applied to generate ASN100 exposures for simulated patients given a range of ASN100 doses and infection with each strain (n = 500 per scenario) using the mPBPK model. Using the Cox model, the probability of full protection from toxins (i.e., predicted survival) was estimated for each simulated patient.

Results: The Cox models showed that survival in rabbits depends on both strain and ASN100 exposure in lung epithelial lining fluid (ELF). At the human doses simulated (360–10,000 mg of ASN100), full or substantial protection is expected for all four strains tested. For the most virulent strain tested in the rabbit pneumonia study (a PVL-negative MSSA, Figure 1), the clinical dose of 3,600 mg of ASN100 provides a substantially higher predicted effect relative to lower doses, while doses above 3,600 mg are not predicted to provide significant additional protection.

Conclusion: A pharmacometric approach allowed for the translation of rabbit survival data to infected patients as well as discrimination among potential clinical doses. These results support the ASN100 dose of 3,600 mg currently being evaluated in a Phase 2 S. aureus pneumonia prevention trial.

Disclosures: C. M. Rubino, Arsanis, Inc.: Research Contractor, Research support. L. Stulik, Arsanis Biosciences GmbH: Employee, Salary. H. Rouha, Arsanis Biosciences GmbH: Employee, Salary. Z. Visram, Arsanis Biosciences GmbH: Employee, Salary. A. Badarau, Arsanis Biosciences GmbH: Employee, Salary. S. A. Van Wart, Arsanis, Inc.: Research Contractor, Research support. P. G. Ambrose, Arsanis, Inc.: Research Contractor, Research support. M. M. Goodwin, Arsanis, Inc.: Employee, Salary. E. Nagy, Arsanis Biosciences GmbH: Employee, Salary.
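
A sketch of the simulation step: draw ELF exposures for simulated patients at a candidate dose and read predicted survival off a Cox model previously fitted to the rabbit data. The exposure distribution, its dose scaling, and the covariate names are assumptions; a real analysis would take exposures from the mPBPK model rather than a simple log-normal draw.

```python
# Monte Carlo over simulated exposures, scored with a fitted Cox model.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

def predicted_protection(cph: CoxPHFitter, dose_mg: float, strain_code: int,
                         n: int = 500, horizon_days: float = 7.0) -> float:
    """cph: Cox model fitted on rabbit data with covariates elf_auc and strain
    (strain encoded numerically here purely for simplicity)."""
    rng = np.random.default_rng(0)
    # Hypothetical log-normal ELF exposure scaling linearly with dose.
    elf_auc = rng.lognormal(mean=np.log(dose_mg * 0.05), sigma=0.3, size=n)
    patients = pd.DataFrame({"elf_auc": elf_auc, "strain": strain_code})
    surv = cph.predict_survival_function(patients, times=[horizon_days])
    return float(surv.loc[horizon_days].mean())   # mean predicted survival at the horizon
```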


2021, Vol 11
Author(s): Duorui Nie, Guihua Lai, Guilin An, Zhuojun Wu, Shujun Lei, ...

Background: Metastatic pancreatic cancer (mPC) is a highly lethal malignancy with poor survival, and chemotherapy alone is unable to maintain long-term survival. This study aimed to evaluate the individualized survival benefit of pancreatectomy plus chemotherapy (PCT) for mPC.

Methods: A total of 4,546 patients with mPC diagnosed from 2004 to 2015 were retrieved from the Surveillance, Epidemiology, and End Results database. Survival curves were calculated using the Kaplan-Meier method, and differences between curves were tested using log-rank tests. Cox proportional hazards regression analyses were performed to evaluate the prognostic value of the variables involved. A nomogram was constructed to predict overall survival based on the independent prognostic factors. The performance of the nomogram was measured by the concordance index, calibration plots, and the area under the receiver operating characteristic curve.

Results: Compared with pancreatectomy or chemotherapy alone, PCT significantly improved the prognosis of patients with mPC. In addition, patients with well/moderately differentiated tumors, age ≤66 years, tumor size ≤42 mm, or female sex were more likely to benefit from PCT. Multivariate analysis showed that age at diagnosis, sex, marital status, grade, tumor size, and treatment were independent prognostic factors. The established nomogram showed good discrimination and calibration.

Conclusion: PCT can prolong survival in some patients with mPC. Our nomogram can provide individualized predictions of overall survival for patients with mPC treated with pancreatectomy combined with chemotherapy.
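
A sketch of the discrimination check underlying a nomogram: fit a Cox model on the independent prognostic factors and compute a concordance index on held-out data. Column names are hypothetical, not SEER field names.

```python
# Cox model plus concordance index as a simple discrimination check.
import pandas as pd
from lifelines import CoxPHFitter
from lifelines.utils import concordance_index

def fit_and_evaluate(train: pd.DataFrame, test: pd.DataFrame) -> float:
    covars = ["age", "female", "married", "grade", "tumor_size_mm", "treatment_pct"]
    cph = CoxPHFitter()
    cph.fit(train[["months", "death"] + covars],
            duration_col="months", event_col="death")
    # Higher partial hazard means higher risk, so negate it for the C-index.
    risk = cph.predict_partial_hazard(test[covars])
    return concordance_index(test["months"], -risk, test["death"])
```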


2020
Author(s): Heng Zou, Wenhao Chen, Huan Wang, Li Xiong, Yu Wen, ...

Abstract

Overview and objective: Although evidence for the application of the albumin–bilirubin (ALBI) grading system to assess liver function in hepatocellular carcinoma (HCC) is available, less is known about whether it can be applied to determine the prognosis of single HCC with different tumor sizes. This study aimed to address this gap.

Methods: We enrolled patients who underwent hepatectomy for single HCC from 2010 to 2014. Analyses were performed to test the potential of the ALBI grading system to monitor the long-term survival of single-HCC subjects with varying tumor sizes.

Results: Overall, 265 participants were recruited. Overall survival (OS) among patients whose tumors were ≤7 cm was remarkably higher than among those whose tumors were >7 cm. The Cox proportional hazards regression model identified tumor differentiation grade, ALBI grade, and maximum tumor size as key determinants of OS. The ALBI grade could stratify patients with a single tumor ≤7 cm into two distinct groups with different prognoses. OS between ALBI grades 1 and 2 was comparable for patients with a single tumor >7 cm.

Conclusions: We show that the ALBI grading system can predict disease outcomes of single-HCC patients with tumor size ≤7 cm. However, the ALBI grade may not be able to predict the prognosis of patients with a single tumor >7 cm.
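
For context, a commonly cited formulation of the ALBI score (Johnson et al.) is linear in log bilirubin and albumin, with grade cut-offs at −2.60 and −1.39; the sketch below uses that published formulation, which the study is assumed to follow.

```python
# Hedged sketch of the ALBI calculation as commonly published:
# ALBI = 0.66 * log10(bilirubin, umol/L) - 0.085 * albumin (g/L),
# with grade 1 <= -2.60, grade 2 in (-2.60, -1.39], grade 3 > -1.39.
import math

def albi_grade(bilirubin_umol_per_l: float, albumin_g_per_l: float) -> tuple:
    score = 0.66 * math.log10(bilirubin_umol_per_l) - 0.085 * albumin_g_per_l
    if score <= -2.60:
        grade = 1
    elif score <= -1.39:
        grade = 2
    else:
        grade = 3
    return score, grade

# Example: bilirubin 15 umol/L, albumin 40 g/L -> score ~ -2.62, grade 1
```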


Author(s): David A. Baran, Justin Lansinger, Ashleigh Long, John M. Herre, Amin Yehya, ...

Background: The opioid crisis has led to an increase in available donor hearts, although questions remain about the long-term outcomes associated with the use of these organs. Prior studies have relied on historical information without examining the toxicology results at the time of organ offer. The objectives of this study were to examine the long-term survival of heart transplants in the recent era, stratified by the results of toxicological testing at the time of organ offer, and to compare the toxicology at the time of donation with variables based on reported history.

Methods: The United Network for Organ Sharing (UNOS) database was requested, along with the donor toxicology field. Between 2007 and 2017, 23,748 adult heart transplants were performed. UNOS historical variables formed a UNOS Toxicology Score and the measured toxicology results formed a Measured Toxicology Score. Survival was examined by UNOS Toxicology Score and Measured Toxicology Score, as well as with Cox proportional hazards models incorporating a variety of risk factors.

Results: The number and percentage of donors with drug use increased significantly over the study period (p<0.0001). Cox proportional hazards modeling of survival including toxicological and historical data did not demonstrate differences in post-transplant mortality. Combinations of drugs identified by toxicology were not associated with differences in survival. Lower donor age and shorter ischemic time were significantly associated with improved survival (p<0.0001).

Conclusions: Among donors accepted for transplantation, neither history nor toxicological evidence of drug use was associated with significant differences in survival. Increasing use of such donors may help alleviate the chronic donor shortage.
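
A sketch of one plausible way to build a count-based measured toxicology score from binary drug-test flags and test it in a Cox model. The drug flags and field names are placeholders, not actual UNOS variable names, and the real scores may be constructed differently.

```python
# Count-based toxicology score included as a covariate in a Cox survival model.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

DRUG_FLAGS = ["cocaine_pos", "opioid_pos", "amphetamine_pos", "cannabis_pos"]  # hypothetical

def toxicology_score_model(df: pd.DataFrame) -> float:
    df = df.assign(mts=df[DRUG_FLAGS].sum(axis=1))            # measured toxicology score
    cph = CoxPHFitter()
    cph.fit(df[["years", "died", "mts", "donor_age", "ischemic_time"]],
            duration_col="years", event_col="died")
    return float(np.exp(cph.params_["mts"]))                  # HR per additional positive test
```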

