Decline of increased risk donor offers increases waitlist mortality in paediatric heart transplantation

2021 ◽  
pp. 1-10
Author(s):  
Jordan E. Ezekian ◽  
Michael S. Mulvihill ◽  
Brian Ezekian ◽  
Morgan L. Cox ◽  
Sonya Kirmani ◽  
...  

Abstract Background: Increased risk donors (IRDs) in paediatric heart transplantation (PHT) have characteristics that may increase the risk of infectious disease transmission despite negative serologic testing. However, the risk of disease transmission is low, and refusing an IRD offer may increase waitlist mortality. We sought to determine the risks of declining an initial IRD organ offer. Methods and results: We performed a retrospective analysis of candidates waitlisted for isolated PHT using 2007–2017 United Network for Organ Sharing datasets. Match runs identified candidates receiving IRD offers. Competing risks analysis was used to determine mortality risk for those who declined an initial IRD offer, with stratified Cox regression used to estimate the survival benefit associated with accepting initial IRD offers. Overall, 238/1,067 (22.3%) initial IRD offers were accepted. Candidates accepting an IRD offer were younger (7.2 versus 9.8 years, p < 0.001), more often female (50 versus 41%, p = 0.021), more often listed status 1A (75.6 versus 61.9%, p < 0.001), and less likely to require mechanical bridge to PHT (16% versus 23%, p = 0.036). At 1- and 5-year follow-up, cumulative mortality was significantly lower for candidates who accepted than for those who declined (6% versus 13% 1-year mortality and 15% versus 25% 5-year mortality, p = 0.0033). Decline of an IRD offer was associated with an adjusted hazard ratio for mortality of 1.87 (95% CI 1.24, 2.81, p < 0.003). Conclusions: IRD organ acceptance is associated with a substantial survival benefit. Increasing acceptance of IRD organs may provide a targetable opportunity to decrease waitlist mortality in PHT.

2020 ◽  
Vol 41 (Supplement_2) ◽  
Author(s):  
E Havers-Borgersen ◽  
J.H Butt ◽  
M Groening ◽  
M Smerup ◽  
G.H Gislason ◽  
...  

Abstract Introduction Patients with tetralogy of Fallot (ToF) are considered at high risk of infective endocarditis (IE) as a result of altered hemodynamics and multiple surgical and interventional procedures including pulmonary valve replacement (PVR). The overall survival of patients with ToF has increased in recent years. However, data on the risk of adverse outcomes including IE are sparse. Purpose To investigate the risk of IE in patients with ToF compared with controls from the background population. Methods In this nationwide observational cohort study, all patients with ToF born in 1977–2017 were identified using Danish nationwide registries and followed from date of birth until occurrence of an outcome of interest (i.e. first-time IE), death, or end of study (July 31, 2017). The comparative risk of IE among ToF patients versus age- and sex-matched controls from the background population was assessed. Results A total of 1,156 patients with ToF were identified and matched with 4,624 controls from the background population. Among patients with ToF, 266 (23.0%) underwent PVR during follow-up. During a median follow-up time of 20.4 years, 38 (3.3%) patients and 1 (0.03%) control were admitted with IE. The median time from date of birth to IE was 10.8 years (25th–75th percentile 2.8–20.9 years). The incidence rates of IE per 1,000 person-years were 2.2 (95% confidence interval (CI) 1.6–3.0) and 0.01 (95% CI 0.0001–0.1) among patients and controls, respectively. In multivariable Cox regression models, in which age, sex, pulmonary valve replacement, and relevant comorbidities (i.e. chronic renal failure, diabetes mellitus, presence of cardiac implantable electronic devices, other valve surgeries) were included as time-varying covariates, the risk of IE was significantly higher among patients compared with controls (HR 171.5, 95% CI 23.2–1266.7). Moreover, PVR was associated with an increased risk of IE (HR 3.4, 95% CI 1.4–8.2). 
Conclusions Patients with ToF have a substantial risk of IE, and the risk is significantly higher than in the background population. In particular, PVR was associated with an increased risk of IE. With the increasing life expectancy of these patients, intensified awareness, preventive measures, and surveillance of this patient group are advisable.
Figure 1. Cumulative incidence of IE
Funding Acknowledgement Type of funding source: None
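Incidence rates of this form (events per 1,000 person-years with a 95% CI) follow directly from event and person-time counts. A minimal sketch, using a log-normal approximation for the CI; the 17,300 person-years total is an illustrative assumption (the abstract reports only the median follow-up, not total person-time), chosen to be consistent with the reported 38 events and rate of 2.2 per 1,000 person-years:

```python
import math

def incidence_rate_ci(events, person_years, per=1000, z=1.96):
    """Incidence rate per `per` person-years with a log-normal
    approximation CI (event counts assumed Poisson-distributed)."""
    rate = events / person_years
    se_log = 1 / math.sqrt(events)  # SE of log(rate) for a Poisson count
    lo = rate * math.exp(-z * se_log)
    hi = rate * math.exp(z * se_log)
    return rate * per, lo * per, hi * per

# Illustrative: 38 IE events over ~17,300 person-years of follow-up
rate, lo, hi = incidence_rate_ci(38, 17_300)
print(f"{rate:.1f} (95% CI {lo:.1f}-{hi:.1f}) per 1,000 person-years")
# → 2.2 (95% CI 1.6-3.0) per 1,000 person-years
```

The log-scale CI is used because event counts are small and the rate is bounded below by zero; it closely matches the interval reported for the ToF patients.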


Antioxidants ◽  
2021 ◽  
Vol 10 (7) ◽  
pp. 1102
Author(s):  
Angelica Rodriguez-Niño ◽  
Diego O. Pastene ◽  
Adrian Post ◽  
M. Yusof Said ◽  
Antonio W. Gomes-Neto ◽  
...  

Carnosine affords protection against oxidative and carbonyl stress, yet high concentrations of the carnosinase-1 enzyme may limit this. We recently reported that high urinary carnosinase-1 is associated with kidney function decline and albuminuria in patients with chronic kidney disease. We prospectively investigated whether urinary carnosinase-1 is associated with a high risk for development of late graft failure in kidney transplant recipients (KTRs). Carnosine and carnosinase-1 were measured in 24 h urine in a longitudinal cohort of 703 stable KTRs and 257 healthy controls. Cox regression was used to analyze the prospective data. Urinary carnosine excretions were significantly decreased in KTRs (26.5 [IQR 21.4–33.3] µmol/24 h versus 34.8 [IQR 25.6–46.8] µmol/24 h; p < 0.001). In KTRs, high urinary carnosinase-1 concentrations were associated with increased risk of undetectable urinary carnosine (OR 1.24, 95%CI [1.06–1.45]; p = 0.007). During a median follow-up of 5.3 [4.5–6.0] years, 84 (12%) KTRs developed graft failure. In Cox regression analyses, high urinary carnosinase-1 excretions were associated with increased risk of graft failure (HR 1.73, 95%CI [1.44–2.08]; p < 0.001) independent of potential confounders. Since urinary carnosine is depleted and urinary carnosinase-1 imparts a higher risk for graft failure in KTRs, future studies determining the potential of carnosine supplementation in these patients are warranted.


Diabetologia ◽  
2021 ◽  
Author(s):  
Peter Ueda ◽  
Viktor Wintzell ◽  
Mads Melbye ◽  
Björn Eliasson ◽  
Ann-Marie Svensson ◽  
...  

Abstract Aims/hypothesis Concerns have been raised regarding a potential association of use of the incretin-based drugs dipeptidyl peptidase 4 (DPP4) inhibitors and glucagon-like peptide-1 (GLP-1)-receptor agonists with risk of cholangiocarcinoma. We examined this association in nationwide data from three countries. Methods We used data from nationwide registers in Sweden, Denmark and Norway, 2007–2018, to conduct two cohort studies, one for DPP4 inhibitors and one for GLP-1-receptor agonists, to investigate the risk of incident cholangiocarcinoma compared with an active-comparator drug class (sulfonylureas). The cohorts included patients initiating treatment episodes with DPP4 inhibitors vs sulfonylureas, and GLP-1-receptor agonists vs sulfonylureas. We used Cox regression models, adjusted for potential confounders, to estimate hazard ratios from day 366 after treatment initiation to account for cancer latency. Results The main analyses of DPP4 inhibitors included 1,414,144 person-years of follow-up from 222,577 patients receiving DPP4 inhibitors (median [IQR] follow-up time, 4.5 [2.6–7.0] years) and 123,908 patients receiving sulfonylureas (median [IQR] follow-up time, 5.1 [2.9–7.8] years) during which 350 cholangiocarcinoma events occurred. Use of DPP4 inhibitors, compared with sulfonylureas, was not associated with a statistically significant increase in risk of cholangiocarcinoma (incidence rate 26 vs 23 per 100,000 person-years; adjusted HR, 1.15 [95% CI 0.90, 1.46]; absolute rate difference 3 [95% CI -3, 10] events per 100,000 person-years). The main analyses of GLP-1-receptor agonists included 1,036,587 person-years of follow-up from 96,813 patients receiving GLP-1-receptor agonists (median [IQR] follow-up time, 4.4 [2.4–6.9] years) and 142,578 patients receiving sulfonylureas (median [IQR] follow-up time, 5.5 [3.2–8.1] years) during which 249 cholangiocarcinoma events occurred. 
Use of GLP-1-receptor agonists was not associated with a statistically significant increase in risk of cholangiocarcinoma (incidence rate 26 vs 23 per 100,000 person-years; adjusted HR, 1.25 [95% CI 0.89, 1.76]; absolute rate difference 3 [95% CI -5, 13] events per 100,000 person-years). Conclusions/interpretation In this analysis using nationwide data from three countries, use of DPP4 inhibitors and GLP-1-receptor agonists, compared with sulfonylureas, was not associated with a significantly increased risk of cholangiocarcinoma. Graphical abstract
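The absolute rate differences reported above compare two incidence rates on an additive scale. A minimal sketch of that calculation with a normal-approximation CI; the per-arm event and person-time counts below are purely illustrative (the abstract reports only the combined 350 events), chosen so the difference lands near the reported 3 per 100,000 person-years:

```python
import math

def rate_diff_ci(e1, t1, e0, t0, per=100_000, z=1.96):
    """Absolute incidence-rate difference per `per` person-years,
    with a normal-approximation CI (event counts assumed Poisson)."""
    r1, r0 = e1 / t1, e0 / t0
    diff = r1 - r0
    # Var(rate) = events / person_time**2 for a Poisson count
    se = math.sqrt(e1 / t1**2 + e0 / t0**2)
    return diff * per, (diff - z * se) * per, (diff + z * se) * per

# Illustrative per-arm counts only; not reported in the abstract.
diff, lo, hi = rate_diff_ci(200, 770_000, 130, 560_000)
```

A CI for the difference that spans zero corresponds to the "not statistically significant" conclusion drawn for both drug classes.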


2020 ◽  
Vol 41 (Supplement_2) ◽  
Author(s):  
S.O Troebs ◽  
A Zitz ◽  
S Schwuchow-Thonke ◽  
A Schulz ◽  
M.W Heidorn ◽  
...  

Abstract Background Global longitudinal strain (GLS) has demonstrated a superior prognostic value over left ventricular ejection fraction (LVEF) in acute heart failure (HF). Its prognostic value across American Heart Association (AHA) stages of HF – especially when considering conventional echocardiographic measures of systolic and diastolic function – has not yet been comprehensively evaluated. Purpose To evaluate the prognostic value of GLS for HF-specific outcome across AHA HF stages A to D. Methods Data from the MyoVasc study (n=3,289) were analysed. Comprehensive clinical phenotyping was performed during a five-hour investigation in a dedicated study centre. GLS was measured offline using QLab 9.0.1 (PHILIPS, Germany) in participants presenting with sinus rhythm during echocardiography. Worsening of HF (comprising transition from asymptomatic to symptomatic HF, HF hospitalization, and cardiac death) was assessed during a structured follow-up with subsequent validation and adjudication of endpoints. AHA stages were defined according to current guidelines. Results Complete information on GLS was available in 2,400 participants, of whom 2,186 categorized into AHA stages A to D were available for analysis. Overall, 434 individuals were classified as AHA stage A, 629 as stage B, and 1,123 as stage C/D. Mean GLS increased across AHA stages of HF: it was lowest in stage A (−19.44±3.15%), intermediate in stage B (−18.01±3.46%), and highest in AHA stage C/D (−15.52±4.64%; P for trend <0.0001). During a follow-up period of 3.0 [1.3/4.0] years, GLS indicated an increased risk for worsening of HF after adjustment for age and sex (hazard ratio, HRGLS [per standard deviation (SD)] 1.97 [95% confidence interval 1.73/2.23], P<0.0001) in multivariable Cox regression analysis. 
After additional adjustment for cardiovascular risk factors, clinical profile, LVEF, and E/E' ratio, GLS was the strongest echocardiographic predictor of worsening of HF (HRGLS [per SD] 1.47 [1.20/1.80], P=0.0002) in comparison to LVEF (HRLVEF [per SD] 1.23 [1.02/1.48], P=0.031) and E/E' ratio (HRE/E' [per SD] 1.12 [0.99/1.26], P=0.083). Interestingly, when stratifying for AHA stages, GLS indicated a similarly increased risk for worsening of HF in individuals classified as AHA stage A/B (HRGLS [per SD] 1.63 [1.02/2.61], P=0.039) and in those classified as AHA stage C/D (HRGLS [per SD] 1.95 [1.65/2.29], P<0.0001) after adjustment for age and sex. For further evaluation, Cox regression models with interaction analysis indicated no significant interaction for (i) AHA stage A/B vs C/D (P=0.83) and (ii) NYHA functional class <II vs ≥II in individuals classified as AHA stage C/D (P=0.12). Conclusions GLS demonstrated a higher predictive value for worsening of HF than conventional echocardiographic measures of systolic and diastolic function. Interestingly, GLS indicated an increased risk for worsening of HF across AHA stages, highlighting its potential value to advance risk prediction in chronic HF. Funding Acknowledgement Type of funding source: Public grant(s) – National budget only. Main funding source(s): German Center for Cardiovascular Research (DZHK), Center for Translational Vascular Biology (CTVB) of the University Medical Center of the Johannes Gutenberg-University Mainz
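Hazard ratios reported "per SD", as in this abstract, rescale the Cox log-hazard coefficient by the predictor's standard deviation so that predictors measured on different scales (GLS in %, LVEF in %, E/E' unitless) become directly comparable. A minimal sketch of the conversion; both the coefficient and the SD below are hypothetical values for illustration, not reported in the abstract:

```python
import math

def hr_per_sd(beta_per_unit, sd):
    """Convert a Cox log-hazard coefficient expressed per one unit of
    the predictor into a hazard ratio per one-SD increase:
    HR_per_SD = exp(beta * SD)."""
    return math.exp(beta_per_unit * sd)

# Hypothetical: beta = 0.17 per 1 percentage point of GLS, SD(GLS) = 4.0
hr = hr_per_sd(0.17, 4.0)  # ≈ 1.97, i.e. the scale of HRGLS above
```

The same rescaling applied to each echocardiographic measure is what makes the head-to-head comparison of HRGLS, HRLVEF, and HRE/E' meaningful.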


Circulation ◽  
2015 ◽  
Vol 132 (suppl_3) ◽  
Author(s):  
Kairav Vakil ◽  
Rebecca Cogswell ◽  
Sue Duval ◽  
Wayne Levy ◽  
Peter Eckman ◽  
...  

Background: Current guidelines do not support routine use of implantable cardioverter-defibrillators (ICDs) in patients (pts) with end-stage heart failure (HF) unless these pts are awaiting advanced HF therapies such as left ventricular assist devices (LVADs) or heart transplantation (HT). Whether ICDs improve survival in end-stage HF pts awaiting HT has not been previously examined in a large, multicenter cohort. Hypothesis: Presence of an ICD at the time of listing for HT is associated with lower waitlist mortality. Methods: The United Network for Organ Sharing registry was used to identify adults (≥18 years) listed for HT between January 4, 1999 and September 30, 2014. Pts with congenital heart disease, total artificial heart, restrictive cardiomyopathy, prior HT, or missing covariates were excluded. Cox regression analysis was used to assess the impact of an ICD at the time of listing on waitlist mortality. Results: The analysis included 36,397 pts (mean age 53±12 years; 77% male) listed for HT. The prevalence of ICDs at listing steadily increased over time before reaching a plateau in 2006 (27% in 1999; range 76-82% between 2006-2014). In the unadjusted model, ICD use was associated with a 36% reduction in waitlist mortality (HR 0.64, 95% CI 0.60-0.68, p<0.001). After adjustment for covariates such as age, sex, race, creatinine, ischemic cardiomyopathy, and listing status, this association was nearly unchanged (HR 0.67, 95% CI 0.62-0.72, p<0.001). The test for interaction by listing era (pre- and post-2006) was non-significant (p=0.28). In the final adjusted model, which included listing era and LVAD status in addition to the above covariates, ICD use remained associated with a mortality benefit on the waitlist for HT (HR 0.84, 95% CI 0.78-0.91, p<0.001). Conclusion: ICDs are increasingly prevalent in pts listed for HT; however, many pts are still listed for HT without these devices. 
The presence of an ICD at the time of listing is associated with lower mortality on the waitlist. Although the magnitude of ICD efficacy diminishes slightly, its benefit continues to remain significant even after adjustment for listing era and LVAD use. Further analyses are required to identify specific sub-groups of pts where ICD use is most beneficial and appropriate.


2019 ◽  
Vol 6 (1) ◽  
pp. e000339 ◽  
Author(s):  
Fangfang Sun ◽  
Yi Chen ◽  
Wanlong Wu ◽  
Li Guo ◽  
Wenwen Xu ◽  
...  

Objective To explore whether varicella zoster virus (VZV) infection could increase the risk of disease flares in patients with SLE. Methods Patients who had VZV reactivations between January 2013 and April 2018 were included from the SLE database (n=1901) of Shanghai Ren Ji Hospital, South Campus. Matched patients with SLE were selected as background controls at a 3:1 ratio. Patients with SLE with symptomatic bacterial infections of the lower urinary tract (UTI) were identified as infection controls. The baseline period and index period were defined as the 3 months before and after the infection event, respectively. The control period was the following 3 months after the index period. Flare was defined by the SELENA-SLEDAI Flare Index. Kaplan-Meier analysis, Cox regression modelling, and propensity score weighting were applied. Results Patients with VZV infections (n=47), UTI controls (n=28), and matched SLE background controls (n=141) were included. Sixteen flares (34%) were observed in the VZV group within the index period, as opposed to only 7.1% in UTI controls and 9.9% in background controls. The Kaplan-Meier curve revealed that patients with a VZV infection had a much lower flare-free survival within the index period compared with the controls (p=0.0003). Furthermore, after adjusting for relevant confounders including baseline disease activity and intensity of immunosuppressive therapy, Cox regression analysis and propensity score weighting confirmed that VZV infection within the previous 3 months was an independent risk factor for SLE flares (HR 3.70 and HR 4.16, respectively). Conclusions In patients with SLE, recent VZV infection within 3 months was associated with an increased risk of disease flares.
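Flare-free survival curves of the kind compared here come from the standard Kaplan-Meier product-limit estimator, which handles censored follow-up (patients who leave the window without a flare). A minimal sketch on made-up (time, event) data, not the study's data:

```python
def kaplan_meier(times, events):
    """Product-limit survival estimate.
    times: follow-up time per subject; events: 1 = flare, 0 = censored.
    Returns [(t, S(t))] at each distinct event time."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    s, curve, i = 1.0, [], 0
    while i < len(data):
        t = data[i][0]
        d = n = 0
        # Group all subjects sharing this follow-up time
        while i < len(data) and data[i][0] == t:
            n += 1
            d += data[i][1]
            i += 1
        if d:  # survival drops only at event times
            s *= 1 - d / n_at_risk
            curve.append((t, s))
        n_at_risk -= n
    return curve

# Toy data: months to flare (event=1) or censoring (event=0)
curve = kaplan_meier([1, 2, 2, 3, 3, 3], [1, 1, 0, 0, 1, 0])
```

Comparing such curves between groups (VZV vs controls) is what the reported log-rank-style p=0.0003 summarizes.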


Author(s):  
Elliot Spicer ◽  
BCIT School of Health Sciences, Environmental Health ◽  
Helen Heacock

Background: Seniors participate in sports to improve physical, mental, and social health; however, such activities may increase the risk of illness and injury. Curling is popular in this age group because it is physically manageable, strategic, and provides social connection. Certain factors in curling, such as handshaking, play during the flu season, and shared contact with curling stones, suggest an increased risk of disease transmission. The purpose of this study was to determine the qualitative risk of communicable enteric disease transmission due to shared contact with curling stone handles in a senior men's curling league. Methods: 3M™ Quick Swabs were used to sample 22 curling stone handles for total coliforms before a seniors' league game. To analyze microbial shedding during gameplay, the same 22 handles were sampled after the game. Samples were plated on 3M™ Petrifilm™ Coliform Count Plates and incubated at 30°C ± 1°C for 24 hours ± 2 hours. Colonies were enumerated in units of CFU (colony forming units)/cm². Ambient and handle surface temperatures were measured, and curler hygiene-related behaviours were documented. Results: Total coliform counts for all samples were 0 CFU/cm². The ambient temperature was 6.6°C pre-game and 8.0°C post-game. Mean handle surface temperature was 3.6°C. Hygiene behaviours of concern were hand-face contact, handkerchief/tissue use, and handshaking. Conclusion: There is low risk of enteric disease transmission due to shared contact with curling stone handles by male curlers 55 years and older. The absence of coliforms may have been due to adequate player hygiene, transference of microbial load before sampling, error, or environmental conditions. Health promotion and education can reduce the infection risk elevated by poor hand hygiene, face contact, and handshaking in seniors' curling, thereby protecting the health and welfare of all participants.


Circulation ◽  
2021 ◽  
Vol 143 (Suppl_1) ◽  
Author(s):  
Jeffrey R Misialek ◽  
Elizabeth R Stremke ◽  
Elizabeth Selvin ◽  
Sanaz Sedaghat ◽  
James S Pankow ◽  
...  

Introduction: Diabetes is a major risk factor for cardiovascular disease. Osteocalcin is a vitamin K-dependent, bone-derived hormone that functions as an endocrine regulator of energy metabolism, male fertility, and cognition. Early studies of the endocrine effects of osteocalcin showed that genomic deletion of osteocalcin in mice resulted in a diabetic phenotype (i.e. glucose intolerance and insulin resistance). However, results from clinical studies have shown mixed associations between blood levels of osteocalcin and risk of incident type 2 diabetes mellitus. Hypothesis: Lower values of plasma osteocalcin would be associated with an increased risk of diabetes. Methods: A total of 11,557 ARIC participants without diabetes at baseline were followed from ARIC visit 3 (1993-1995) through 2018. Diabetes cases were identified through self-report on annual and semi-annual follow-up phone calls. Plasma osteocalcin was measured using an aptamer-based proteomic profiling platform (SomaLogic). We used Cox regression to evaluate the association of quintiles of plasma osteocalcin with incident diabetes. The primary model adjusted for age, sex, and race-center. Results: Participants were aged 60 ± 5.6 years at visit 3; 56% identified as female and 21% as Black. There were 3,031 incident diabetes cases over a median follow-up of 17.9 years. Mean ± SD plasma osteocalcin was 10.053 ± 0.775. When comparing the highest quintile of plasma osteocalcin (values 10.42 to 14.66) to the lowest quintile (values 9.03 to 9.52), there was no association with incident diabetes (HR [95% CI]: 0.92 [0.81, 1.02]). There was also no significant trend across the quintiles (p = 0.19). Results were similar when adjusting for additional potential confounders and when limiting the follow-up time to 10 years. Conclusions: These data do not support the hypothesis that total plasma osteocalcin, as measured by the SomaLogic proteomic panel, is a biomarker associated with diabetes risk. 
It is possible that total plasma or serum osteocalcin and/or other isoforms of osteocalcin protein (i.e. gamma carboxylated or uncarboxylated osteocalcin) measured via other validated methodologies may be linked to diabetes.
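Quintile-based analyses like the one above assign each participant to one of five equal-sized groups using the sample's quintile cut points. A minimal sketch on simulated values; the distribution parameters are illustrative (loosely matching the reported mean ± SD), not the ARIC data:

```python
import random
import statistics

random.seed(7)
# Hypothetical plasma osteocalcin values (arbitrary assay units)
values = [random.gauss(10.05, 0.78) for _ in range(1000)]

# Four interior cut points dividing the sample into quintiles
cuts = statistics.quantiles(values, n=5)

def quintile(x, cuts):
    """Return 1-5: which quintile x falls into, given the cut points."""
    return 1 + sum(x > c for c in cuts)
```

The Cox model then compares, e.g., quintile 5 against quintile 1 as the reference group, as in the HR of 0.92 reported above.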


Author(s):  
Rongrong Wei ◽  
Xinyu Du ◽  
Jing Wang ◽  
Qi Wang ◽  
Xiaojie Zhu ◽  
...  

Introduction: The incidence and prognostic impact of subsequent primary gastric cancer (GC) in a population of other cancer survivors is unclear. We aimed to evaluate susceptibility to subsequent primary GC in cancer survivors and the prognosis of GC with a prior cancer history. Methods: 2,211 and 23,416 GC cases with and without a prior cancer history, respectively, were retrospectively selected from the Surveillance, Epidemiology and End Results (SEER) database. The potential risk of developing subsequent primary GC was assessed through standardized incidence ratios (SIRs). Cox regression was used to analyze the influence of prior cancer history and clinical characteristics on the prognosis of subsequent primary GC. A nomogram was established to predict overall survival (OS). Propensity score matching (PSM) was conducted to eliminate possible bias. Results: Compared with the general population, cancer survivors had an increased risk of subsequent primary GC (SIR 1.17, 95% CI 1.15-1.20, P<0.05). Prior cancer history was associated with poorer OS of GC [adjusted hazard ratio (aHR) 1.12, 95% CI 1.06-1.19, P<0.001], but not with cancer-specific survival (aHR 0.97, 95% CI 0.89-1.05, P=0.441). In addition, age, grade, stage, year of diagnosis, surgery, TNM stage, and tumor size were independent prognostic factors for OS in GC cases with prior cancers. The concordance index of the nomogram was 0.72 (95% CI 0.71-0.74), and calibration curves showed good agreement between nomogram predictions and actual observations. Conclusions: Cancer survivors are at increased risk of developing subsequent primary GC and warrant strengthened monitoring and follow-up.
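An SIR is the ratio of observed to expected cases, with the CI typically derived from the Poisson distribution of the observed count. A minimal sketch using Byar's approximation to the exact Poisson limits; treating the 2,211 prior-cancer GC cases as the observed count and the expected count of 1,890 (chosen only to reproduce the reported SIR of 1.17) are both illustrative assumptions, not values from the study:

```python
import math

def sir_byar(observed, expected, z=1.96):
    """Standardized incidence ratio O/E with Byar's approximation
    to the exact Poisson confidence interval for the observed count."""
    o = observed
    lo = o * (1 - 1 / (9 * o) - z / (3 * math.sqrt(o)))**3 / expected
    ou = o + 1  # upper limit uses O+1 in Byar's formula
    hi = ou * (1 - 1 / (9 * ou) + z / (3 * math.sqrt(ou)))**3 / expected
    return o / expected, lo, hi

# Illustrative: 2,211 observed subsequent GCs vs ~1,890 expected
sir, lo, hi = sir_byar(2211, 1890)
```

Expected counts in practice come from applying age-, sex-, and period-specific reference rates (e.g. SEER population rates) to the cohort's person-time.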


Author(s):  
Catherine A. Marco

Patients may present to the emergency department (ED) for various complaints and requests related to risky sexual behavior. Such concerns may include pregnancy or infectious disease transmission, including urethritis, cervicitis, HIV, hepatitis, or others. Emergency physicians should test for pregnancy and infectious diseases, treat empirically for appropriate patients, and refer patients for counseling to reduce risky sexual behavior. Following a significant potential HIV exposure, postexposure prophylaxis (PEP) should be considered. The decision to administer PEP should be based on shared decision-making with the patient and should include assessment of the risk of the exposure and HIV status of the source patient. If the HIV status of the source patient is unknown, the source should be tested following informed consent and counseling. Patients should be referred to outpatient follow-up, including primary care, infectious disease, and if indicated, social services.

