Frailty and Access to Kidney Transplantation

2019
Vol 14 (4)
pp. 576-582
Author(s):  
Christine E. Haugen ◽  
Nadia M. Chu ◽  
Hao Ying ◽  
Fatima Warsame ◽  
Courtenay M. Holscher ◽  
...  

Background and objectives: Frailty, a syndrome distinct from comorbidity and disability, is clinically manifested as a decreased resistance to stressors and is present in up to 35% of patients with ESKD. It is associated with falls, hospitalizations, poor cognitive function, and mortality. Frailty is also associated with poor outcomes after kidney transplant, including delirium and mortality. Given its association with poor outcomes on dialysis and post-transplant, frailty is likely also associated with decreased access to kidney transplantation. Yet clinicians have difficulty identifying which patients are frail; we therefore sought to quantify whether frail kidney transplant candidates have the same access to kidney transplantation as nonfrail candidates. Design, setting, participants, & measurements: We studied 7078 kidney transplant candidates (2009–2018) in a three-center prospective cohort study of frailty. Fried frailty (unintentional weight loss, grip strength, walking speed, exhaustion, and activity level) was measured at outpatient kidney transplant evaluation. We estimated time to listing and transplant rate by frailty status using Cox proportional hazards and Poisson regression, adjusting for demographic and health factors. Results: The mean age was 54 years (SD 13; range, 18–89), 40% were women, 34% were black, and 21% were frail. Frail participants were 38% less likely to be listed for kidney transplantation (hazard ratio, 0.62; 95% confidence interval, 0.56 to 0.69; P<0.001) compared with nonfrail participants, independent of age and other demographic factors. Furthermore, frail candidates were transplanted 32% less frequently than nonfrail candidates (incidence rate ratio, 0.68; 95% confidence interval, 0.58 to 0.81; P<0.001). Conclusions: Frailty is associated with a lower chance of listing and a lower rate of transplant, and it is a potentially modifiable risk factor.
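
As a rough illustration of the two models named in this abstract (not the authors' code), the sketch below fits a Cox proportional hazards model for time to listing and a Poisson model with a person-time offset for transplant rate. All file and column names (time_to_listing, listed, frail, person_years, and the adjusters) are hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf
from lifelines import CoxPHFitter

df = pd.read_csv("candidates.csv")  # hypothetical file: one row per candidate

# Time to active listing by frailty status, adjusted for demographic factors
cph = CoxPHFitter()
cph.fit(
    df[["time_to_listing", "listed", "frail", "age", "female", "black"]],
    duration_col="time_to_listing",
    event_col="listed",
)
cph.print_summary()  # exp(coef) for 'frail' is the adjusted hazard ratio

# Transplant rate by frailty status: Poisson regression with a
# log person-years offset, so exp(coef) is an incidence rate ratio
pois = smf.glm(
    "transplanted ~ frail + age + female + black",
    data=df,
    family=sm.families.Poisson(),
    offset=np.log(df["person_years"]),
).fit()
print(np.exp(pois.params["frail"]))  # IRR for frail vs. nonfrail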

2021
pp. 152692482110246
Author(s):  
Amanda Vinson ◽  
Alyne Teixeira ◽  
Bryce Kiberd ◽  
Karthik Tennankore

Background: Leukopenia occurs frequently following kidney transplantation and is associated with adverse clinical outcomes, including increased infectious risk. In this study we sought to characterize the causes and complications of leukopenia following kidney transplantation. Methods: In a cohort of adult patients (≥18 years) who underwent kidney transplant from Jan 2006-Dec 2017, we used univariable Cox proportional hazards models to identify predictors of post-transplant leukopenia (WBC <3500/mm3). Factors associated with post-transplant leukopenia were then included in a multivariable backwards stepwise selection process to create a prediction model for the outcome of interest. Cox regression analyses were subsequently used to determine whether post-transplant leukopenia was associated with complications. Results: Of 388 recipients, 152 (39%) developed post-transplant leukopenia. Factors associated with leukopenia included antithymocyte globulin as induction therapy (HR 3.32, 95% CI 2.25-4.91), valganciclovir (HR 1.84, 95% CI 1.25-2.70), tacrolimus (HR 3.05, 95% CI 1.08-8.55), prior blood transfusion (HR 1.17 per unit, 95% CI 1.09-1.25), and donor age (HR 1.02 per year, 95% CI 1.00-1.03). Cytomegalovirus infection occurred in 26 patients with leukopenia (17.1%). Other than cytomegalovirus, leukopenia was not associated with post-transplant complications. Conclusion: Leukopenia occurred commonly post-transplant and was associated with both modifiable and non-modifiable pretransplant factors.
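
A minimal sketch of the selection strategy described above, assuming hypothetical column names (time, leukopenia, and the candidate predictors) and that at least one predictor survives the univariable screen; thresholds of 0.10 and 0.05 are illustrative, not taken from the paper.

```python
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("recipients.csv")  # hypothetical: one row per recipient
candidates = ["atg_induction", "valganciclovir", "tacrolimus",
              "transfusion_units", "donor_age"]

# Univariable screen: keep predictors with p < 0.10
keep = []
for var in candidates:
    m = CoxPHFitter().fit(df[["time", "leukopenia", var]],
                          duration_col="time", event_col="leukopenia")
    if m.summary.loc[var, "p"] < 0.10:
        keep.append(var)

# Backwards elimination: refit, dropping the least significant predictor,
# until all remaining predictors have p <= 0.05
while keep:
    m = CoxPHFitter().fit(df[["time", "leukopenia"] + keep],
                          duration_col="time", event_col="leukopenia")
    worst = m.summary["p"].idxmax()
    if m.summary.loc[worst, "p"] <= 0.05:
        break
    keep.remove(worst)

# Final model: hazard ratios with 95% confidence intervals
print(m.summary[["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%"]])
```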


2018
Vol 13 (4)
pp. 628-637
Author(s):  
Laura C. Plantinga ◽  
Raymond J. Lynch ◽  
Rachel E. Patzer ◽  
Stephen O. Pastan ◽  
C. Barrett Bowling

Background and objectives: Serious fall injuries in the setting of ESKD may be associated with poor access to kidney transplant. We explored the burden of serious fall injuries among patients on dialysis and patients on the deceased donor waitlist and the associations of these fall injuries with waitlisting and transplantation. Design, setting, participants, & measurements: Our analytic cohorts for the outcomes of (1) waitlisting and (2) transplantation included United States adults ages 18–80 years old who (1) initiated dialysis (n=183,047) and (2) were waitlisted for the first time (n=37,752) in 2010–2013. Serious fall injuries were determined by diagnostic codes for falls plus injury (fracture, joint dislocation, or head trauma) in inpatient and emergency department claims; the first serious fall injury after cohort entry was included as a time-varying exposure. Follow-up ended at the specified outcome, death, or the last date of follow-up (September 30, 2014). We used multivariable Cox proportional hazards models to determine the independent associations between serious fall injury and waitlisting or transplantation. Results: Overall, the 2-year cumulative incidence of serious fall injury was 6% among patients on incident dialysis; with adjustment, patients who had serious fall injuries were 61% less likely to be waitlisted than patients who did not (hazard ratio, 0.39; 95% confidence interval, 0.35 to 0.44). Among incident waitlisted patients (4% 2-year cumulative incidence), those with serious fall injuries were 29% less likely than their counterparts to be subsequently transplanted (hazard ratio, 0.71; 95% confidence interval, 0.63 to 0.80). Conclusions: Serious fall injuries among United States patients on dialysis are associated with a substantially lower likelihood of waitlisting for and receipt of a kidney transplant. Podcast: This article contains a podcast at https://www.asn-online.org/media/podcast/CJASN/2018_03_06_CJASNPodcast_18_4_P.mp3
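
A time-varying exposure like the first serious fall injury is typically handled with a counting-process (start-stop) data layout. Below is a hedged sketch for the waitlisting outcome using lifelines' CoxTimeVaryingFitter; the long-format file and all column names are assumptions, with fall_injury switching from 0 to 1 at the first injury.

```python
import pandas as pd
from lifelines import CoxTimeVaryingFitter

# Hypothetical long format: one row per interval per patient, with columns
# id, start, stop, waitlisted (event at interval end), fall_injury (0/1,
# time-varying), and baseline adjusters such as age and female.
long_df = pd.read_csv("dialysis_long.csv")

ctv = CoxTimeVaryingFitter()
ctv.fit(
    long_df[["id", "start", "stop", "waitlisted",
             "fall_injury", "age", "female"]],
    id_col="id",
    start_col="start",
    stop_col="stop",
    event_col="waitlisted",
)
ctv.print_summary()  # exp(coef) for fall_injury ~ the reported HR of 0.39
```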


2021
pp. 000486742110096
Author(s):  
Oleguer Plana-Ripoll ◽  
Patsy Di Prinzio ◽  
John J McGrath ◽  
Preben B Mortensen ◽  
Vera A Morgan

Introduction: An association between schizophrenia and urbanicity has long been observed, with studies in many countries, including several from Denmark, reporting that individuals born/raised in densely populated urban settings have an increased risk of developing schizophrenia compared to those born/raised in rural settings. However, these findings have not been replicated in all studies. In particular, a Western Australian study showed a gradient in the opposite direction, which disappeared after adjustment for covariates. Given the different findings for Denmark and Western Australia, our aim was to investigate the relationship between schizophrenia and urbanicity in these two regions to determine which factors may be influencing the relationship. Methods: We used population-based cohorts of children born alive between 1980 and 2001 in Western Australia (N = 428,784) and Denmark (N = 1,357,874). Children were categorised according to the level of urbanicity of their mother's residence at time of birth and followed up through to 30 June 2015. Linkage to State-based registers provided information on schizophrenia diagnosis and a range of covariates. Rates of being diagnosed with schizophrenia for each category of urbanicity were estimated using Cox proportional hazards models adjusted for covariates. Results: During follow-up, 1618 (0.4%) children in Western Australia and 11,875 (0.9%) children in Denmark were diagnosed with schizophrenia. In Western Australia, those born in the most remote areas did not experience lower rates of schizophrenia than those born in the most urban areas (hazard ratio = 1.02 [95% confidence interval: 0.81, 1.29]), unlike their Danish counterparts (hazard ratio = 0.62 [95% confidence interval: 0.58, 0.66]). However, when the Western Australian cohort was restricted to children whose Indigenous status was non-Aboriginal, results were consistent with the Danish findings (hazard ratio = 0.46 [95% confidence interval: 0.29, 0.72]). Discussion: Our study highlights the potential for disadvantaged subgroups to mask the contribution of urban-related risk factors to the risk of schizophrenia, and the importance of stratified analysis in such cases.
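
The methodological point here (a subgroup masking an exposure effect) can be shown by fitting the same adjusted model on the full cohort and on the restricted subset. The sketch below is a simplification under stated assumptions: urbanicity is reduced to a single most-remote-vs-most-urban indicator, and all column names are hypothetical.

```python
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("wa_cohort.csv")  # hypothetical extract of the WA cohort
cols = ["followup_years", "schizophrenia", "most_remote", "maternal_age"]

# Same adjusted model, fit on the full cohort and on the restricted subset
full = CoxPHFitter().fit(df[cols], duration_col="followup_years",
                         event_col="schizophrenia")
restricted = CoxPHFitter().fit(df.loc[df["aboriginal"] == 0, cols],
                               duration_col="followup_years",
                               event_col="schizophrenia")

# If a disadvantaged subgroup masks the urbanicity effect, the two hazard
# ratios for 'most_remote' differ materially (1.02 vs. 0.46 in the abstract)
print(full.hazard_ratios_["most_remote"],
      restricted.hazard_ratios_["most_remote"])
```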


2021
pp. 1-8
Author(s):  
Dominik Promny ◽  
Theresa Hauck ◽  
Aijia Cai ◽  
Andreas Arkudas ◽  
Katharina Heller ◽  
...  

Background: Obesity is frequently present in patients suffering from end-stage renal disease (ESRD). However, overweight kidney transplant candidates are a challenge for the transplant surgeon. Obese patients tend to develop a large abdominal panniculus after weight loss, creating an area predisposed to wound-healing disorders. Due to concerns about graft survival and postoperative complications after kidney transplantation, obese patients are often refused as candidates. The study aimed to analyze the effect of panniculectomies on postoperative complications and transplant candidacy in an interdisciplinary setting. Methods: A retrospective database review of 10 cases of abdominal panniculectomy performed in patients with ESRD prior to kidney transplantation was conducted. Results: The median body mass index was 35.2 kg/m² (range 28.5–53.0 kg/m²) at first transplant assessment versus 31.0 kg/m² (range 28.0–34.4 kg/m²) at panniculectomy and 31.6 kg/m² (range 30.3–32.4 kg/m²) at kidney transplantation. We observed no major postoperative complications following panniculectomy and minor wound-healing complications in 2 patients. All but 1 patient became active transplant candidates 6 weeks after panniculectomy. No post-transplant wound complications occurred in the transplanted patients. Conclusion: Abdominal panniculectomy is feasible in patients suffering from ESRD, with no major postoperative complications, thus converting previously ineligible patients into kidney transplant candidates. An interdisciplinary approach is advisable in this selective patient cohort.


2021
Vol 21 (1)
Author(s):  
Ashwin Radhakrishnan ◽  
Luke C. Pickup ◽  
Anna M. Price ◽  
Jonathan P. Law ◽  
Kirsty C. McGee ◽  
...  

Background: Coronary microvascular dysfunction (CMD) is common in end-stage renal disease (ESRD) and is an adverse prognostic marker. Coronary flow velocity reserve (CFVR) is a measure of coronary microvascular function and can be assessed using Doppler echocardiography. Reduced CFVR in ESRD has been attributed to factors such as diabetes, hypertension and left ventricular hypertrophy. The contributory role of other mediators important in the development of cardiovascular disease in ESRD has not been studied. The aim of this study was to examine the prevalence of CMD in a cohort of kidney transplant candidates and to look for associations of CMD with markers of anaemia, bone mineral metabolism and chronic inflammation. Methods: Twenty-two kidney transplant candidates with ESRD were studied with myocardial contrast echocardiography, Doppler CFVR assessment and serum multiplex immunoassay analysis. Individuals with diabetes, uncontrolled hypertension or ischaemic heart disease were excluded. Results: 7/22 subjects had CMD (defined as CFVR < 2). Demographic, laboratory and echocardiographic parameters and serum biomarkers were similar between subjects with and without CMD. Subjects with CMD had significantly lower haemoglobin than subjects without CMD (102 ± 12 g/L vs. 117 ± 11 g/L, p = 0.008). There was a positive correlation between haemoglobin and CFVR (r = 0.7, p = 0.001). Similar results were seen for haematocrit. In regression analyses, haemoglobin was an independent predictor of CFVR (β = 0.041, 95% confidence interval 0.012–0.071, p = 0.009) and of CFVR < 2 (odds ratio 0.85, 95% confidence interval 0.74–0.98, p = 0.022). Conclusions: Among kidney transplant candidates with ESRD, there is a high prevalence of CMD despite the absence of traditional risk factors. Anaemia may be a potential driver of microvascular dysfunction in this population and requires further investigation.
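
A hedged sketch of the three analyses reported here: Pearson correlation of haemoglobin with CFVR, a linear model for CFVR (the β per g/L), and a logistic model for CMD defined as CFVR < 2 (the odds ratio per g/L). The data file and column names are assumptions.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from scipy.stats import pearsonr

df = pd.read_csv("cfvr_cohort.csv")  # hypothetical: 22 transplant candidates

# Correlation between haemoglobin and CFVR (reported r = 0.7)
r, p = pearsonr(df["haemoglobin"], df["cfvr"])

# Linear model: beta is the change in CFVR per g/L haemoglobin
linear = smf.ols("cfvr ~ haemoglobin", data=df).fit()

# Logistic model for CMD (CFVR < 2); exp(coef) is the odds ratio per g/L
df["cmd"] = (df["cfvr"] < 2).astype(int)
logit = smf.logit("cmd ~ haemoglobin", data=df).fit()
print(r, linear.params["haemoglobin"], np.exp(logit.params["haemoglobin"]))
```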


2021
Vol 7 (1)
Author(s):  
Raquel Araujo-Gutierrez ◽  
Kalyan R. Chitturi ◽  
Jiaqiong Xu ◽  
Yuanchen Wang ◽  
Elizabeth Kinder ◽  
...  

Background: Cancer therapy-related cardiac dysfunction (CTRD) is a major source of morbidity and mortality in long-term cancer survivors. Decreased global longitudinal strain (GLS) predicts decreased left ventricular ejection fraction (LVEF) in patients receiving anthracyclines, but knowledge regarding the clinical utility of baseline GLS in patients at low risk of CTRD is limited. Objectives: The purpose of this study was to investigate whether baseline echocardiographic assessment of GLS before treatment with anthracyclines is predictive of CTRD in a broad cohort of patients with normal baseline LVEF. Methods: Study participants comprised 188 patients at a single institution who underwent baseline 2-dimensional (2D) speckle-tracking echocardiography before treatment with anthracyclines and at least one follow-up echocardiogram 3 months after chemotherapy initiation. Patients with a baseline LVEF <55% were excluded from the analysis. The primary endpoint, CTRD, was defined as an absolute decline in LVEF >10% from baseline and an overall reduced LVEF <50%. Potential and known risk factors were evaluated using univariable and multivariable Cox proportional hazards regression analysis. Results: Twenty-three patients (12.23%) developed CTRD. Among patients with CTRD, the mean GLS was −17.51% ± 2.77%. The optimal cutoff point for CTRD was −18.05%, with a sensitivity of 0.70, a specificity of 0.70, and an area under the ROC curve of 0.70. After adjustment for cardiovascular and cancer therapy-related risk factors, both continuous GLS and reduced baseline GLS (≥−18%) were predictive of CTRD (adjusted hazard ratio 1.17, 95% confidence interval 1.00-1.36, p = 0.044 for GLS; hazard ratio 3.54, 95% confidence interval 1.34-9.35, p = 0.011 for reduced GLS), along with history of tobacco use, pre-chemotherapy systolic blood pressure, and cumulative anthracycline dose. Conclusions: Baseline GLS, measured before anthracycline treatment, was predictive of CTRD in a cohort of cancer patients with a normal baseline LVEF, both as a continuous measure and dichotomized at −18%. These data support the implementation of strain-protocol echocardiography in cardio-oncology practice for identifying and monitoring patients at elevated risk of CTRD.
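
An optimal cutoff like the −18.05% reported here is commonly derived from the ROC curve via Youden's J statistic (maximizing sensitivity + specificity − 1). A minimal sketch, assuming hypothetical column names and that less-negative GLS indicates higher risk, so GLS itself serves as the risk score:

```python
import numpy as np
import pandas as pd
from sklearn.metrics import roc_auc_score, roc_curve

df = pd.read_csv("strain_cohort.csv")  # hypothetical: baseline GLS, CTRD (0/1)

# Higher (less negative) GLS -> higher risk, so score with GLS directly
fpr, tpr, thresholds = roc_curve(df["ctrd"], df["gls"])
j = tpr - fpr                      # Youden's J at each candidate threshold
best = np.argmax(j)
cutoff = thresholds[best]          # analogous to the reported -18.05%

print("AUC:", roc_auc_score(df["ctrd"], df["gls"]))
print("cutoff:", cutoff, "sensitivity:", tpr[best], "specificity:", 1 - fpr[best])
```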


Author(s):  
Winn Cashion ◽  
Walid F. Gellad ◽  
Florentina E. Sileanu ◽  
Maria K. Mor ◽  
Michael J. Fine ◽  
...  

Background and objectives: Many kidney transplant recipients enrolled in the Veterans Health Administration are also enrolled in Medicare and eligible to receive both Veterans Health Administration and private sector care. Where these patients receive transplant care and its association with mortality are unknown. Design, setting, participants, & measurements: We conducted a retrospective cohort study of veterans who underwent kidney transplantation between 2008 and 2016 and were dually enrolled in Veterans Health Administration and Medicare at the time of surgery. We categorized patients on the basis of the source of transplant-related care (i.e., outpatient transplant visits, immunosuppressive medication prescriptions, calcineurin inhibitor measurements) delivered during the first year after transplantation, defined as Veterans Health Administration only, Medicare only (i.e., outside Veterans Health Administration using Medicare), or dual care (mixed use of Veterans Health Administration and Medicare). Using multivariable Cox regression, we examined the independent association of post-transplant care source with mortality at 5 years after kidney transplantation. Results: Among 6206 dually enrolled veterans, 975 (16%) underwent transplantation at a Veterans Health Administration hospital and 5231 (84%) at a non–Veterans Health Administration hospital using Medicare. Post-transplant care was received by 752 patients (12%) through Veterans Health Administration only, 2092 (34%) through Medicare only, and 3362 (54%) through dual care. Compared with patients who were Veterans Health Administration only, 5-year mortality was significantly higher among patients who were Medicare only (adjusted hazard ratio, 2.2; 95% confidence interval, 1.5 to 3.1) and patients who were dual care (adjusted hazard ratio, 1.5; 95% confidence interval, 1.1 to 2.1). Conclusions: Most dually enrolled veterans underwent transplantation at a non–Veterans Health Administration transplant center using Medicare, yet many relied on Veterans Health Administration for some or all of their post-transplant care. Veterans who received Veterans Health Administration–only post-transplant care had the lowest 5-year mortality.
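
The exposure here is a three-level category with Veterans Health Administration-only care as the reference, which in a Cox model is handled by dummy coding and dropping the reference column. A minimal sketch under assumed file and column names:

```python
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("veterans.csv")  # hypothetical: care_source in {VHA, Medicare, Dual}

# Dummy-code the three-level exposure, dropping 'VHA' as the reference level
dummies = pd.get_dummies(df["care_source"], prefix="care",
                         dtype=int).drop(columns="care_VHA")
data = pd.concat([df[["years", "died", "age"]], dummies], axis=1)

cph = CoxPHFitter().fit(data, duration_col="years", event_col="died")
cph.print_summary()  # exp(coef) for care_Medicare / care_Dual vs. VHA only
```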


2020
Vol 76 (1)
pp. 72-81
Author(s):  
Nadia M. Chu ◽  
Zhan Shi ◽  
Christine E. Haugen ◽  
Silas P. Norman ◽  
Alden L. Gross ◽  
...  

2021
Vol 36 (Supplement_1)
Author(s):  
Alexandre Candellier ◽  
Eric Jean Goffin ◽  
Priya Vart ◽  
Marlies Noordzij ◽  
Miha Arnol ◽  
...  

Background and Aims: Studies examining kidney failure patients with COVID-19 reported higher mortality in hemodialysis patients than in kidney transplant recipients. However, hemodialysis patients are often older and have more comorbidities. This study investigated the association of type of kidney replacement therapy with COVID-19 severity, adjusting for differences in characteristics. Method: Data were retrieved from the European Renal Association COVID-19 Database (ERACODA), which includes kidney replacement therapy patients diagnosed with COVID-19 from all over Europe. We included all kidney transplant recipients and hemodialysis patients who presented between February 1st and December 1st 2020 and had complete information on the reason for COVID-19 screening and on vital status at day 28. The diagnosis of COVID-19 was made based on PCR of a nasal or pharyngeal swab specimen and/or COVID-19 compatible findings on a lung CT scan. The association of kidney transplantation or hemodialysis with 28-day mortality was examined using Cox proportional-hazards regression models adjusted for age, sex, frailty and comorbidities. Additionally, this association was investigated in the subsets of patients that were screened because of symptoms or had routine screening. Results: A total of 1,670 patients (496 functional kidney transplant recipients and 1,174 hemodialysis patients) were examined. 16.9% of kidney transplant recipients and 23.9% of hemodialysis patients died within 28 days of presentation. In an unadjusted model, the risk of 28-day mortality was 33% lower in kidney transplant recipients compared with hemodialysis patients (hazard ratio (HR): 0.67, 95% CI: 0.52, 0.85). However, in an age, sex and frailty adjusted model, the risk of 28-day mortality was 29% higher in kidney transplant recipients (HR=1.29, 95% CI: 1.00, 1.68), whereas in a fully adjusted model the risk was 43% higher (HR=1.43, 95% CI: 1.06, 1.93). This association was similar in patients who were screened because of symptoms (n=1,145; fully adjusted model HR=1.46, 95% CI: 1.05, 2.04). Results were similar when other endpoints were studied (e.g. risk for hospitalization, ICU admission or mortality beyond 28 days) as well as across subgroups. Only age was found to interact significantly, suggesting that the increased mortality risk associated with kidney transplantation was especially present in elderly subjects. Conclusion: In this study, kidney transplant recipients had a greater risk of a more severe course of COVID-19 compared with hemodialysis patients when adjusted for age, sex and comorbidities.
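
The reversal between the unadjusted and adjusted estimates, and the significant age interaction, can be illustrated with nested Cox models. The sketch below is not the ERACODA analysis code; the file and all column names are assumptions, and the interaction is modeled as a simple transplant × age product term.

```python
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("eracoda_extract.csv")  # hypothetical column names throughout
df["tx_by_age"] = df["transplant"] * df["age"]  # transplant x age interaction

# Unadjusted model: transplant vs. hemodialysis (reference)
unadj = CoxPHFitter().fit(df[["days", "died_28d", "transplant"]],
                          duration_col="days", event_col="died_28d")

# Adjusted model with the interaction term; a positive coefficient on
# tx_by_age means the transplant-associated risk increases with age
adj = CoxPHFitter().fit(
    df[["days", "died_28d", "transplant", "age", "male",
        "frailty_score", "tx_by_age"]],
    duration_col="days", event_col="died_28d")

unadj.print_summary()
adj.print_summary()
```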

