Intoxicated Donors and Heart Transplant Outcomes: Long-Term Safety

Author(s):  
David A. Baran ◽  
Justin Lansinger ◽  
Ashleigh Long ◽  
John M. Herre ◽  
Amin Yehya ◽  
...  

Background: The opioid crisis has led to an increase in available donor hearts, although questions remain about the long-term outcomes associated with the use of these organs. Prior studies have relied on historical information without examining the toxicology results at the time of organ offer. The objectives of this study were to examine the long-term survival of heart transplants in the recent era, stratified by the results of toxicological testing at the time of organ offer, and to compare the toxicology at the time of donation with variables based on reported history. Methods: The United Network for Organ Sharing database, together with the donor toxicology field, was requested. Between 2007 and 2017, 23,748 adult heart transplants were performed. United Network for Organ Sharing historical variables formed a United Network for Organ Sharing Toxicology Score, and the measured toxicology results formed a Measured Toxicology Score. Survival was examined by the United Network for Organ Sharing Toxicology Score and the Measured Toxicology Score, as well as with Cox proportional hazards models incorporating a variety of risk factors. Results: The number and percentage of donors with drug use increased significantly over the study period (P<0.0001). Cox proportional hazards modeling of survival including toxicological and historical data did not demonstrate differences in post-transplant mortality. Combinations of drugs identified by toxicology were not associated with differences in survival. Lower donor age and shorter ischemic time were significantly associated with better survival (P<0.0001). Conclusions: Among donors accepted for transplantation, neither history nor toxicological evidence of drug use was associated with significant differences in survival. Increasing use of such donors may help alleviate the chronic donor shortage.
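The Cox proportional hazards modeling named in this abstract can be illustrated with a minimal sketch (not the authors' code) using the Python lifelines package on synthetic data; the column names donor_age, ischemic_time, and tox_score are invented stand-ins for the UNOS-derived variables.

```python
# A minimal sketch of a Cox proportional hazards fit, assuming invented
# column names and synthetic data; not the study's actual pipeline.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 1000
df = pd.DataFrame({
    "years_followed": rng.exponential(6.0, n),   # follow-up time (years)
    "died": rng.integers(0, 2, n),               # event indicator (1 = death)
    "donor_age": rng.normal(32, 10, n),          # donor age, years
    "ischemic_time": rng.normal(3.2, 0.8, n),    # ischemic time, hours
    "tox_score": rng.integers(0, 5, n),          # count of positive tox results
})

cph = CoxPHFitter()
cph.fit(df, duration_col="years_followed", event_col="died")
cph.print_summary()  # hazard ratios with 95% CIs; an HR near 1 for tox_score
                     # would mirror the null finding reported in the abstract
```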

Circulation ◽  
2007 ◽  
Vol 116 (suppl_16) ◽  
Author(s):  
Toru Aoyama ◽  
Hideki Ishii ◽  
Hiroshi Takahashi ◽  
Takanobu Toriyama ◽  
...  

Background: Rates of cardiovascular (CV) events and mortality are significantly higher in hemodialysis (HD) patients than in the general population. Although predicting CV events in long-term HD patients is of clinical concern, more powerful predictors are still being sought. We investigated whether silent brain infarction (SBI) is a predictive factor for future CV events and mortality in a large cohort of long-term HD patients. Methods: After cranial magnetic resonance imaging to detect SBI, 202 long-term HD patients (7.1 ± 5.9 years) without symptomatic stroke were prospectively followed up until the occurrence of a CV event (stroke, cardiac event, or death). We analyzed the prognostic role of SBI in CV events with the Kaplan-Meier method and Cox proportional hazards analysis. Results: The prevalence of SBI (71.8% of all patients) was considerably higher than in previous reports. During follow-up (mean 23 ± 13 months), 60 patients suffered a CV event (31 coronary artery disease, 7 congestive heart failure, 14 symptomatic stroke) and 29 patients died (16 of CV causes). In subgroup analysis by SBI status, the 4-year CV event-free survival rate was significantly lower in patients with SBI than in those without (54.6% vs. 86.7%, p=0.0003). CV and overall mortality were also significantly higher in SBI patients than in no-SBI patients (CV mortality: 20.5% vs. 4.3%; overall mortality: 29.0% vs. 9.1%; p<0.01 for both). Cox proportional hazards models showed that the presence of SBI was a significant predictor of cerebrovascular and CV events and of CV and overall mortality, even after adjustment for the other CV risk factors listed in the Table. Conclusion: SBI detected with MRI may be a powerful predictor of CV events and mortality in long-term HD patients. (Table: Hazard ratio (HR) of SBI for future events and mortality.)
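The Kaplan-Meier comparison and log-rank test used here can be sketched as follows; the group sizes roughly match the abstract (145 with SBI, 57 without), but all times and events are synthetic.

```python
# A minimal sketch of a Kaplan-Meier comparison with a log-rank test,
# on synthetic data; not the authors' code.
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(1)
t_sbi = rng.exponential(30, 145)   # months to event, SBI group
e_sbi = rng.random(145) < 0.45     # event observed (True) or censored
t_no = rng.exponential(60, 57)     # no-SBI group
e_no = rng.random(57) < 0.15

kmf = KaplanMeierFitter()
kmf.fit(t_sbi, event_observed=e_sbi, label="SBI")
ax = kmf.plot_survival_function()
kmf.fit(t_no, event_observed=e_no, label="No SBI")
kmf.plot_survival_function(ax=ax)

res = logrank_test(t_sbi, t_no, event_observed_A=e_sbi, event_observed_B=e_no)
print(res.p_value)  # analogous to the p=0.0003 comparison in the abstract
```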


Circulation ◽  
2020 ◽  
Vol 142 (Suppl_3) ◽  
Author(s):  
Samuel T Kim ◽  
Mark R Helmers ◽  
Peter Altshuler ◽  
Amit Iyengar ◽  
Jason Han ◽  
...  

Introduction: Although guidelines for heart transplant currently recommend against donors weighing ≥ 30% less than the recipient, recent studies have shown that the detriment of under-sizing may not be as severe in obese recipients. Furthermore, predicted heart mass (PHM) has been shown to be more reliable for size matching than metrics such as weight and body surface area. In this study, we use PHM to characterize the effects of undersized heart transplantation (UHT) in obese vs. non-obese recipients. Methods: Retrospective analysis of the UNOS database was performed for heart transplants from Jan. 1995 to Sep. 2020. Recipients were stratified as obese (BMI ≥ 30) or non-obese (30 > BMI ≥ 18.5). Undersized donors were defined as having a PHM ≥ 20% less than the recipient PHM. The obese and non-obese populations separately underwent propensity score matching, and Kaplan-Meier estimates were used to graph survival. Multivariable Cox proportional-hazards analyses were used to adjust for confounders and estimate the hazard ratio for death attributable to under-sizing. Results: Overall, 50,722 heart transplants were included in the analysis. Propensity-score matching yielded 2,214 and 1,011 well-matched pairs for the non-obese and obese populations, respectively. UHT in non-obese recipients resulted in similar 30-day mortality (5.7% vs. 6.3%, p = 0.38) but worse 15-year survival (38% vs. 35%, p = 0.04). In contrast, obese recipients with UHT saw similar 30-day mortality (6.4% vs. 5.5%, p = 0.45) and slightly increased 15-year survival (31% vs. 35%, p = 0.04). Multivariate Cox analysis showed that UHT carried an adjusted hazard ratio of 1.08 (95% CI 1.01 - 1.16) in non-obese recipients and 0.87 (95% CI 0.78 - 0.98) in obese recipients. Conclusions: Non-obese patients with UHT saw worse long-term survival, while obese patients with UHT saw slightly increased survival. These findings may warrant reevaluation of the current size criteria for obese patients awaiting heart transplantation.
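A rough sketch of the 1:1 nearest-neighbor propensity-score matching step described above, with synthetic data and invented covariates; real pipelines also enforce calipers and matching without replacement, which this sketch omits for brevity.

```python
# A simplified propensity-score matching sketch; synthetic data,
# illustrative covariate names, not the authors' pipeline.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(2)
n = 5000
df = pd.DataFrame({
    "undersized": rng.integers(0, 2, n),   # donor PHM >= 20% below recipient PHM
    "recipient_age": rng.normal(55, 12, n),
    "creatinine": rng.lognormal(0.1, 0.4, n),
    "ventilator": rng.integers(0, 2, n),
})
X = df[["recipient_age", "creatinine", "ventilator"]]

# 1) estimate each patient's propensity of receiving an undersized heart
ps = LogisticRegression(max_iter=1000).fit(X, df["undersized"]).predict_proba(X)[:, 1]

# 2) match each treated patient to the control with the nearest propensity
#    (with replacement here, for simplicity)
treated = df.index[df["undersized"] == 1]
control = df.index[df["undersized"] == 0]
nn = NearestNeighbors(n_neighbors=1).fit(ps[control].reshape(-1, 1))
_, idx = nn.kneighbors(ps[treated].reshape(-1, 1))
matched_controls = control[idx.ravel()]
# survival in the matched pairs would then be compared with Kaplan-Meier
# curves and a Cox model, as in the abstract
```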


2020 ◽  
Author(s):  
Heng Zou ◽  
Wenhao Chen ◽  
Huan Wang ◽  
Li Xiong ◽  
Yu Wen ◽  
...  

Abstract Overview and objective: Although evidence for the application of the albumin–bilirubin (ALBI) grading system to assess liver function in hepatocellular carcinoma (HCC) is available, less is known about whether it can be applied to determine the prognosis of single HCC with different tumor sizes. This study aimed to address this gap. Methods: We enrolled patients who underwent hepatectomy for single HCC from 2010 to 2014. Analyses were performed to test the potential of the ALBI grading system to monitor the long-term survival of single-HCC subjects with varying tumor sizes. Results: Overall, 265 participants were recruited. The overall survival (OS) of patients whose tumors were ≤ 7 cm was markedly higher than that of patients whose tumors were > 7 cm. The Cox proportional hazards regression model identified tumor differentiation grade, ALBI grade, and maximum tumor size as key determinants of OS. The ALBI grade could stratify patients with a single tumor ≤ 7 cm into two distinct groups with different prognoses. OS between ALBI grades 1 and 2 was comparable for patients with a single tumor > 7 cm. Conclusions: We show that the ALBI grading system can predict disease outcomes of single-HCC patients with a tumor size ≤ 7 cm. However, the ALBI grade may not be capable of predicting the prognosis of patients with a single tumor > 7 cm.
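For reference, the ALBI score discussed here is a simple linear function of bilirubin and albumin (as published by Johnson et al., 2015); a short sketch, assuming total bilirubin in µmol/L and albumin in g/L:

```python
# ALBI score and grade cutoffs; inputs assumed in µmol/L (bilirubin)
# and g/L (albumin), per the published formulation.
import math

def albi_score(bilirubin_umol_l: float, albumin_g_l: float) -> float:
    return 0.66 * math.log10(bilirubin_umol_l) - 0.085 * albumin_g_l

def albi_grade(score: float) -> int:
    if score <= -2.60:
        return 1   # well-preserved liver function
    if score <= -1.39:
        return 2
    return 3

# e.g. bilirubin 12 µmol/L, albumin 42 g/L -> score ~ -2.86 -> grade 1
print(albi_grade(albi_score(12, 42)))
```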


2019 ◽  
Vol 12 ◽  
pp. 1179545X1985836
Author(s):  
Masatomo Ebina ◽  
Kazunori Fujino ◽  
Akira Inoue ◽  
Koichi Ariyoshi ◽  
Yutaka Eguchi

Background: Severe sepsis is commonly associated with mortality among critically ill patients and is known to cause coagulopathy. While antithrombin is an anticoagulant used in this setting, serum albumin levels are known to influence serum antithrombin levels. Therefore, this study aimed to evaluate the outcomes of antithrombin supplementation in patients with sepsis-associated coagulopathy, as well as the relationship between serum albumin levels and the effects of antithrombin supplementation. Methods: This retrospective study evaluated patients who were >18 years of age and had been admitted to either of two intensive care units for sepsis-associated coagulopathy. The groups that did and did not receive antithrombin supplementation were compared for outcomes up to 1 year after admission. Subgroup analyses were performed for patients with serum albumin levels of <2.5 g/dL or ≥2.5 g/dL. Results: Fifty-one patients received antithrombin supplementation and 163 patients did not. The Cox proportional hazards model revealed that antithrombin supplementation was independently associated with 28-day survival (hazard ratio [HR]: 0.374, P = 0.025) but not with 1-year survival (HR: 0.915, P = 0.752). In addition, among patients with serum albumin levels of <2.5 g/dL, antithrombin supplementation was associated with a significantly lower 28-day mortality rate (9.4% vs 36.8%, P = 0.009). Conclusion: Antithrombin supplementation may improve short-term survival, but not long-term survival, among patients with sepsis-associated coagulopathy.
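One conventional way to test whether the antithrombin effect differs by albumin subgroup, as examined here, is a treatment-by-subgroup interaction term in a Cox model; a minimal sketch with synthetic data (not the study's code):

```python
# Cox model with a treatment-by-subgroup interaction via the lifelines
# formula interface; synthetic data, invented column names.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(3)
n = 214
df = pd.DataFrame({
    "days": rng.exponential(20, n).clip(max=28),  # follow-up, capped at 28 days
    "died": rng.integers(0, 2, n),
    "antithrombin": rng.integers(0, 2, n),        # received supplementation?
    "low_albumin": rng.integers(0, 2, n),         # serum albumin < 2.5 g/dL
})

cph = CoxPHFitter()
cph.fit(df, duration_col="days", event_col="died",
        formula="antithrombin * low_albumin")
cph.print_summary()  # the interaction term asks whether the benefit is
                     # concentrated in the low-albumin subgroup
```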


2021 ◽  
Vol 5 (Supplement_2) ◽  
pp. 1110-1110
Author(s):  
Dong Zhen ◽  
John Jr Richie ◽  
Xiang Gao ◽  
Biyi Shen ◽  
David Orentreich

Abstract Objectives: Increasing evidence in animal models and humans suggests that diets high in sulfur-containing amino acids (SAAs) could be associated with a greater risk for type 2 diabetes (T2D). However, data from longitudinal human studies linking dietary SAA intake with T2D are lacking. The present study aimed to examine the association between long-term dietary intake of SAAs, including total SAAs, methionine, and cysteine, and incident T2D in participants of the Framingham Heart Study (FHS). Methods: Adult participants were selected from two prospective FHS cohorts: the Offspring Cohort (followed from 1991 to 2015, n = 3799) and the Third Generation Cohort (followed from 2002 to 2011, n = 4096). Individuals identified as diabetes patients before baseline, those with missing diet or covariate data, and those reporting extreme daily energy intake were excluded. Energy-adjusted intake of dietary SAAs was calculated from responses to a 131-item food frequency questionnaire. Cox proportional hazards models were used to evaluate associations between intakes of SAAs (in quintiles) and risk of T2D in each cohort. A combined analysis pooling subjects from both cohorts was also performed. Results: Overall, we documented 471 T2D events during 9–23 years of follow-up. In both cohorts, higher SAA intake was associated with a higher risk of T2D after adjustment for demographics, traditional risk factors, and related nutrients. Comparing participants in the highest quintile of intake with those in the lowest, adjusted hazard ratios (95% CI) were 1.98 (1.15–3.41) for total intake (P for trend = 0.04) in the Offspring Cohort and 4.37 (1.40–13.67) (P for trend = 0.01) in the Third Generation Cohort. In the combined analysis of the two cohorts, adjusted hazard ratios (95% CI) were 1.98 (1.23–3.21) for total intake, 2.21 (1.38–3.53) for methionine, and 1.79 (1.12–2.87) for cysteine (P for trends < 0.03). Conclusions: Higher long-term SAA intake was associated with higher risk for T2D in humans, suggesting that dietary patterns emphasizing low SAA intake are protective against the development of T2D. Funding Sources: No funding.
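The two preprocessing steps named here, energy adjustment of a nutrient and division into intake quintiles, are commonly implemented with the residual method; a sketch under that assumption, with synthetic intake data:

```python
# Energy adjustment by the residual method, then quintile assignment;
# synthetic data, not the FHS analysis itself.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 4000
energy = rng.normal(2000, 450, n)                    # kcal/day
saa = 2.0 + 0.0012 * energy + rng.normal(0, 0.4, n)  # g/day, energy-correlated

# regress nutrient on total energy, keep residuals, re-center at the mean
fit = sm.OLS(saa, sm.add_constant(energy)).fit()
saa_adj = fit.resid + saa.mean()

quintile = pd.qcut(saa_adj, 5, labels=False)  # 0 = lowest, 4 = highest intake
# quintile would then enter a Cox model as a categorical exposure, with the
# lowest quintile as the reference group
```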


2021 ◽  
pp. 1-14 ◽  
Author(s):  
Olga Mitelman ◽  
Hoda Z. Abdel-Hamid ◽  
Barry J. Byrne ◽  
Anne M. Connolly ◽  
Peter Heydemann ◽  
...  

Background: Studies 4658-201/202 (201/202) evaluated treatment effects of eteplirsen over 4 years in patients with Duchenne muscular dystrophy and confirmed exon-51 amenable genetic mutations. Chart review Study 4658-405 (405) further followed these patients while receiving eteplirsen during usual clinical care. Objective: To compare long-term clinical outcomes of eteplirsen-treated patients from Studies 201/202/405 with those of external controls. Methods: Median total follow-up time was approximately 6 years of eteplirsen treatment. Outcomes included loss of ambulation (LOA) and percent-predicted forced vital capacity (FVC%p). Time to LOA was compared between eteplirsen-treated patients and standard of care (SOC) external controls and was measured from eteplirsen initiation in 201/202 or, in the SOC group, from the first study visit. Comparisons were conducted using univariate Kaplan-Meier analyses and log-rank tests, and multivariate Cox proportional hazards models with regression adjustment for baseline characteristics. Annual change in FVC%p was compared between eteplirsen-treated patients and natural history study patients using linear mixed models with repeated measures. Results: Data were included from all 12 patients in Studies 201/202 and the 10 patients with available data from 405. Median age at LOA was 15.16 years. Eteplirsen-treated patients experienced a statistically significant longer median time to LOA by 2.09 years (5.09 vs. 3.00 years, p < 0.01) and significantly attenuated rates of pulmonary decline vs. natural history patients (FVC%p change: –3.3 vs. –6.0 percentage points annually, p < 0.0001). Conclusions: Study 405 highlights the functional benefits of eteplirsen on ambulatory and pulmonary function outcomes up to 7 years of follow-up in comparison to external controls.
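The linear mixed model with repeated measures used above for the FVC%p comparison can be sketched with statsmodels MixedLM; the group sizes, slopes, and column names below are invented for illustration, with the slopes set to echo the reported -3.3 vs. -6.0 points per year.

```python
# A minimal sketch of a linear mixed model with a random intercept and slope
# per patient; synthetic long-format data, invented column names.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
rows = []
for pid in range(60):
    treated = pid < 22
    slope = -3.3 if treated else -6.0   # FVC%p change, points per year
    for year in range(6):
        rows.append({"pid": pid, "years": year, "treated": int(treated),
                     "fvc_pp": 90 + slope * year + rng.normal(0, 3)})
df = pd.DataFrame(rows)

m = smf.mixedlm("fvc_pp ~ years * treated", df,
                groups=df["pid"], re_formula="~years").fit()
print(m.summary())  # the years:treated coefficient estimates the difference
                    # in annual FVC%p decline between groups
```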


2020 ◽  
Author(s):  
Yingting Zuo ◽  
Anxin Wang ◽  
Shuohua Chen ◽  
Xue Tian ◽  
Shouling Wu ◽  
...  

Abstract Background: The relationship between estimated glomerular filtration rate (eGFR) trajectories and myocardial infarction (MI) remains unclear in people with diabetes or prediabetes. We aimed to identify common eGFR trajectories in people with diabetes or prediabetes and to examine their association with MI risk. Methods: The data for this analysis were derived from the Kailuan study, a prospective community-based cohort study. The eGFR trajectories of 24,723 participants from 2006 to 2012 were generated by latent mixture modeling. Incident cases of MI occurring from 2012 to 2017 were confirmed by review of medical records. Cox proportional hazards models were used to calculate hazard ratios (HRs) and their 95% confidence intervals (CIs) for the subsequent risk of MI across the eGFR trajectories. Results: We identified 5 distinct eGFR trajectories and named them low-stable (9.4%), moderate-stable (31.4%), moderate-increasing (29.5%), high-decreasing (13.9%), and high-stable (15.8%) according to their range and pattern. During a mean follow-up of 4.61 years, there were a total of 235 incident MIs. Although the high-decreasing group had eGFR levels similar to those of the moderate-stable group in the last exposure period, its risk was much higher (adjusted HR, 3.43; 95% CI, 1.56–7.54 versus adjusted HR, 2.82; 95% CI, 1.34–5.95). Notably, the moderate-increasing group, despite having reached the normal range, still had a significantly increased risk (adjusted HR, 2.55; 95% CI, 1.21–5.39). Conclusions: eGFR trajectories were associated with MI risk in people with diabetes or prediabetes. Emphasis should be placed on early and long-term detection and control of eGFR decreases to further reduce MI risk.
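The latent mixture modeling of trajectories described here is typically fit with dedicated latent-class software; as a simplified stand-in, the sketch below clusters per-person eGFR series with k-means, which captures the same idea of grouping participants by level and slope. Synthetic data throughout.

```python
# A rough stand-in for group-based trajectory modeling: cluster each
# person's eGFR series; not the study's latent mixture model.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(6)
n, waves = 2000, 4                      # eGFR measured at 4 exams, 2006-2012
base = rng.normal(85, 20, n)            # baseline eGFR
slope = rng.normal(0, 3, n)             # per-exam change
egfr = base[:, None] + slope[:, None] * np.arange(waves) \
       + rng.normal(0, 4, (n, waves))

km = KMeans(n_clusters=5, n_init=10, random_state=0).fit(egfr)
labels = km.labels_                     # e.g. low-stable, moderate-increasing, ...
print(km.cluster_centers_.round(1))     # mean trajectory per group; group labels
                                        # then enter a Cox model for incident MI
```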


2021 ◽  
Vol 5 (Supplement_1) ◽  
pp. 936-936
Author(s):  
Jure Mur ◽  
Simon Cox ◽  
Riccardo Marioni ◽  
Tom Russ ◽  
Graciela Muniz-Terrera

Abstract Previous studies on the association between the long-term use of anticholinergic drugs and dementia report heterogeneous results. This variability could be due to, among other factors, the different anticholinergic scales used and differential effects of distinct classes of anticholinergic drugs. Here, we use 171,775 participants of UK Biobank with linked GP prescription records to calculate the cumulative annual anticholinergic burden (ACB) and ascertain dementia diagnoses through GP and inpatient records. We then use Cox proportional hazards models to compare 13 anticholinergic scales, and the ACB due to different classes of drugs, in their association with dementia. We find dementia to be more strongly predicted by ACB than by polypharmacy across most anticholinergic scales (standardised OR range: 1.027-1.125). Furthermore, not only the baseline ACB but also the slope of the longitudinal trajectory of ACB (HR=1.094; 95% CI: 1.068-1.119) is predictive of dementia. However, the association between ACB and dementia holds only for some classes of drugs, especially antidepressants, antiepileptics, and high-ceiling diuretics. Moreover, we do not find a clear relationship between reported anticholinergic potency and dementia risk. The heterogeneity in findings on the association between ACB and dementia may in part be due to different effects for different classes of drugs. Future studies should establish such differences in more detail and further examine the practicality of using a general measure of anticholinergic potency as it relates to the risk of dementia.
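Comparing 13 scales on a common footing implies standardizing each scale's burden before fitting; a minimal sketch with invented scale names and synthetic data, not the UK Biobank pipeline:

```python
# Fit one Cox model per anticholinergic scale on z-scored burden and
# compare the standardised hazard ratios; synthetic data, invented names.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(7)
n = 5000
df = pd.DataFrame({"years": rng.exponential(8, n),
                   "dementia": rng.integers(0, 2, n)})
scales = ["scale_a", "scale_b", "scale_c"]   # stand-ins for the 13 scales
for scale in scales:
    df[scale] = rng.gamma(2.0, 1.0, n)       # cumulative burden per scale

for scale in scales:
    d = df[["years", "dementia"]].copy()
    d["burden_z"] = (df[scale] - df[scale].mean()) / df[scale].std()
    cph = CoxPHFitter().fit(d, duration_col="years", event_col="dementia")
    # one standardised HR per scale, directly comparable across scales
    print(scale, float(cph.hazard_ratios_["burden_z"]))
```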


Author(s):  
Urvi Rana ◽  
Matt Driedger ◽  
Paul Sereda ◽  
Shenyi Pan ◽  
Erin Ding ◽  
...  

Background: The clinical and demographic characteristics that predict antiretroviral efficacy among patients co-infected with HIV and hepatitis B virus (HBV) remain poorly defined. We evaluated HIV virological suppression and rebound in a cohort of HIV–HBV co-infected patients initiated on antiretroviral therapy. Methods: A retrospective cohort analysis was performed with Canadian Observation Cohort Collaboration data. Cox proportional hazards models were used to determine the factors associated with time to virological suppression and time to virological rebound. Results: HBV status was available for 2,419 participants. A total of 8% were HBV co-infected, of whom 95% achieved virological suppression. After virological suppression, 29% of HIV–HBV co-infected participants experienced HIV virological rebound. HBV co-infection itself did not predict virological suppression or rebound risk. The rate of virological suppression was lower among patients with a history of injection drug use or baseline CD4 cell counts of <199 cells per cubic millimetre. Low baseline HIV RNA and men-who-have-sex-with-men status were significantly associated with a higher rate of virological suppression. Injection drug use and non-White race predicted viral rebound. Conclusions: HBV co-infected HIV patients achieve similar antiretroviral outcomes as those living with HIV mono-infection. Equitable treatment outcomes may be approached by targeting resources to key subpopulations living with HIV–HBV co-infection.


2021 ◽  
Vol 9 (3) ◽  
pp. e002218
Author(s):  
Gaopeng Li ◽  
Guoliang Wang ◽  
Fenqing Chi ◽  
Yuqi Jia ◽  
Xi Wang ◽  
...  

Background: A satisfactory prognostic indicator for gastric cancer (GC) patients after surgery is still lacking. Perioperative plasma extracellular vesicular programmed cell death ligand-1 (ePD-L1) has been demonstrated as a potential prognostic biomarker in many types of cancer. The prognostic value of postoperative plasma ePD-L1 has not been characterized. Methods: We evaluated the prognostic value of preoperative, postoperative, and change in plasma ePD-L1, as well as plasma soluble PD-L1, for short-term survival of GC patients after surgery. The Kaplan-Meier survival model and Cox proportional hazards models for both univariate and multivariate analyses were used, and postoperative ePD-L1 was compared with conventional serum biomarkers (carcinoembryonic antigen (CEA), cancer antigen 19–9 (CA19-9), and CA72-4) for the prognosis of GC patients. Results: The prognostic value of postoperative ePD-L1 is superior to that of preoperative ePD-L1 in GC patients after resection, and also superior to that of the conventional serum biomarkers (CEA, CA19-9, and CA72-4). The levels of postoperative ePD-L1 and of the change in ePD-L1 are independent prognostic factors for overall survival and recurrence-free survival of GC patients. A high plasma level of postoperative ePD-L1 correlates significantly with poor survival, while a large change in ePD-L1 level brings a significant survival benefit. Conclusions: The level of plasma postoperative ePD-L1 could be considered a candidate prognostic biomarker for GC patients after resection.
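One standard way to rank candidate prognostic biomarkers against each other, as done here for ePD-L1 versus CEA, CA19-9, and CA72-4, is the concordance index; a sketch with synthetic data in which one marker is deliberately more informative than the other:

```python
# Compare biomarkers by concordance index (C-index); synthetic data.
# Higher marker = higher risk here, hence the negated scores below.
import numpy as np
from lifelines.utils import concordance_index

rng = np.random.default_rng(8)
n = 300
risk = rng.normal(0, 1, n)                    # latent patient risk
months = rng.exponential(24 * np.exp(-risk))  # survival shortens with risk
event = rng.random(n) < 0.6                   # observed event vs. censored

markers = {"ePD-L1": risk + rng.normal(0, 0.5, n),  # informative marker
           "CEA": risk + rng.normal(0, 2.0, n)}     # noisier marker
for name, m in markers.items():
    # concordance_index expects scores where higher = longer survival
    print(name, round(concordance_index(months, -m, event), 3))
```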

