Modern UNOS Data Reveals Septuagenarians Have Inferior Heart Transplant Survival

Author(s):  
Manish Suryapalam ◽  
Jay Kanaparthi ◽  
Mohammed Abul Kashem ◽  
Huaqing Zhao ◽  
Yoshiya Toyoda

Background: While heart transplantation (HTx) is increasingly performed in the United States for elderly patients, survival outcomes have primarily been analyzed in single-center studies. The few existing long-term studies have indicated no difference in HTx outcomes between patients ≥70 years old and those 60-69 years old, but these studies assessed outcomes only to 5 years post-transplant and included data from the 1980s-90s, introducing significant variance due to the poorer outcomes of that era. We analyzed the UNOS database from 1987-2020, stratified by timeframe at 2000, to derive a more representative comparison of modern HTx survival outcomes. Methods: All UNOS HTx recipients over 18 years of age (n=66,186) were divided into 3 cohorts: 18-59, 60-69, and ≥70 years old. Demographic data as well as perioperative factors were evaluated for significance using Chi-squared and Kruskal-Wallis H-tests as appropriate. Kaplan-Meier curves and Cox regressions with log-rank tests were used to assess 5- through 10-year survival outcomes. Results: 45,748 recipients were 18-59 years old, 19,129 were 60-69 years old, and 1,309 were ≥70 years old. The distribution of most demographic and perioperative factors differed significantly between cohorts. Pairwise survival analyses involving the 18-59 cohort were always significant. While there was no significant difference between the two older cohorts in the earlier timeframe, there was a significant difference in the later timeframe from 6-10 years post-HTx (p<0.05). Cox regressions confirmed these results. Conclusions: The results indicate that since 2000, recipients 60-69 years of age have better 6- through 10-year post-transplant survival than older recipients, a relationship previously obscured by worse outcomes in early data.
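
For readers who want to reproduce this style of analysis, a minimal sketch of the between-cohort comparisons follows, assuming a pandas DataFrame built from a UNOS extract with illustrative column names (`age_group`, `gender`, `ischemic_time`); it is not the authors' code.

```python
# Minimal sketch (not the authors' code): between-cohort comparisons on an
# assumed UNOS extract with columns 'age_group', 'gender', 'ischemic_time'.
import pandas as pd
from scipy.stats import chi2_contingency, kruskal

def compare_cohorts(df: pd.DataFrame) -> None:
    # Chi-squared test for a categorical factor (e.g., gender) across the
    # three age cohorts ("18-59", "60-69", ">=70").
    table = pd.crosstab(df["age_group"], df["gender"])
    chi2, p_cat, dof, _ = chi2_contingency(table)
    print(f"gender vs. age group: chi2={chi2:.2f}, p={p_cat:.4f}")

    # Kruskal-Wallis H-test for a continuous factor (e.g., ischemic time).
    groups = [g["ischemic_time"].dropna() for _, g in df.groupby("age_group")]
    h_stat, p_cont = kruskal(*groups)
    print(f"ischemic time across cohorts: H={h_stat:.2f}, p={p_cont:.4f}")
```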

Circulation ◽  
2020 ◽  
Vol 142 (Suppl_3) ◽  
Author(s):  
Jay Kanaparthi ◽  
Mohammed Kashem ◽  
Manish Suryapalam ◽  
Yoshiya Toyoda

Introduction: As the prevalence of heart failure increases with age, it is critical that we understand the role of heart transplantation (HTx) in older patients. Recent long-term studies have indicated no difference in HTx outcomes between recipients 70 years or older and those aged 60-69. However, these studies included data from the 1980s-90s, introducing significant variance due to poorer outcomes across age groups in that era. We analyzed the most recent United Network for Organ Sharing (UNOS) database, stratified by time frames before and after 2000, to demonstrate this statistical discrepancy and derive a more representative comparison of modern survival by age group. Hypothesis: HTx recipients 70+ years old may not actually have survival comparable to those 60-69 years of age when more recent HTx data are assessed. Methods: All UNOS HTx recipients over 60 years of age (n=20,446) were divided into 2 cohorts, those 60-69 and those ≥70 years old, which were analyzed over two time frames: transplant date 1987-1999 and 2000-2019. Demographic data (gender, ethnicity, BMI) as well as perioperative factors (ICU stay, ischemic time, and length of stay) were evaluated for significance using Chi-squared and Kruskal-Wallis H-tests as appropriate. Kaplan-Meier curves with log-rank tests were used to assess 10-year survival outcomes. Results: 19,129 patients were 60-69 years old and 1,317 were ≥70 years old, with mean ages of 64.0±2.7 and 71.3±1.6 years, respectively. The distribution of demographic and perioperative factors differed significantly between the cohorts (p<0.05). Survival analysis indicated no significant difference in the earlier timeframe (1987-1999; p=0.341) but a significant difference in the later timeframe (2000-2019; p=0.004). Conclusion: The results indicate that since 2000, recipients 60-69 years of age have better 10-year post-transplant survival than older recipients, a relationship previously obscured by worse outcomes in early data.
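
A hedged sketch of how the era-stratified survival comparison could be run with the lifelines library; `era`, `age_group`, `years_post_tx`, and `died` are assumed column names, not the UNOS variable names.

```python
# Minimal sketch (assumed column names, not the study's coding): era-stratified
# comparison of 60-69 vs. >=70 recipients with Kaplan-Meier and log-rank tests.
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

def compare_by_era(df: pd.DataFrame) -> None:
    for era, era_df in df.groupby("era"):  # e.g., "1987-1999" and "2000-2019"
        younger = era_df[era_df["age_group"] == "60-69"]
        older = era_df[era_df["age_group"] == ">=70"]

        # Log-rank test between the two age cohorts within this era.
        result = logrank_test(
            younger["years_post_tx"], older["years_post_tx"],
            event_observed_A=younger["died"], event_observed_B=older["died"],
        )
        print(f"{era}: log-rank p = {result.p_value:.3f}")

        # Kaplan-Meier fits for each cohort (10-year follow-up comes from the data).
        for label, grp in (("60-69", younger), (">=70", older)):
            kmf = KaplanMeierFitter()
            kmf.fit(grp["years_post_tx"], event_observed=grp["died"], label=label)
            print(f"  {label}: median survival = {kmf.median_survival_time_}")
```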


2019 ◽  
Vol 29 (4) ◽  
pp. 354-360 ◽  
Author(s):  
S. Ali Husain ◽  
Kristen L. King ◽  
Geoffrey K. Dube ◽  
Demetra Tsapepas ◽  
David J. Cohen ◽  
...  

Introduction: The Kidney Allocation System in the United States prioritizes candidates with Estimated Post-Transplant Survival (EPTS) ≤20% to receive deceased donor kidneys with Kidney Donor Profile Index (KDPI) ≤20%. Research Question: We compared access to KDPI ≤20% kidneys for EPTS ≤20% candidates across the United States to determine whether geographic disparities in access to these low-KDPI kidneys exist. Design: We identified all incident adult deceased donor kidney candidates wait-listed January 1, 2015, to March 31, 2018, using United Network for Organ Sharing data. We calculated the proportion of candidates transplanted, final EPTS, and KDPI of transplanted kidneys for candidates listed with EPTS ≤20% versus >20%. We compared the odds of receiving a KDPI ≤20% deceased donor kidney for EPTS ≤20% candidates across regions using logistic regression. Results: Among 121,069 deceased donor kidney candidates, 28.5% had listing EPTS ≤20%. Of these, 16.1% received deceased donor kidney transplants (candidates listed EPTS >20%: 17.1% transplanted) and 12.3% lost EPTS ≤20% status. Only 49.4% of transplanted EPTS ≤20% candidates received a KDPI ≤20% kidney, and 48.3% of KDPI ≤20% kidneys went to recipients with EPTS >20% at the time of transplantation. Odds of receiving a KDPI ≤20% kidney were highest in region 6 and lowest in region 9 (odds ratio 0.19 [0.13 to 0.28]). The ratio of KDPI ≤20% donors per EPTS ≤20% candidate and the likelihood of KDPI ≤20% transplantation were strongly correlated (r² = 0.84). Discussion: Marked geographic variation in the likelihood of receiving a KDPI ≤20% deceased donor kidney among transplanted EPTS ≤20% candidates exists and is related to differences in organ availability within allocation borders. Policy changes to improve organ sharing are needed to improve equity in access to low-KDPI kidneys.
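
The regional comparison described here amounts to a logistic regression with region indicators; a minimal sketch follows, with `listing_epts`, `transplanted`, `got_kdpi_le20`, and `region` as assumed field names rather than the registry's own.

```python
# Minimal sketch (illustrative column names): odds of an EPTS<=20% candidate
# receiving a KDPI<=20% kidney, modeled with UNOS region as a fixed effect.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def regional_odds_ratios(df: pd.DataFrame) -> pd.Series:
    # Restrict to transplanted candidates listed with EPTS <= 20% (assumed flags).
    cohort = df[(df["listing_epts"] <= 20) & (df["transplanted"] == 1)]

    # Logistic regression of the binary outcome on region; the first region
    # level serves as the reference category.
    model = smf.logit("got_kdpi_le20 ~ C(region)", data=cohort).fit(disp=False)

    # Exponentiated coefficients are the per-region odds ratios.
    return np.exp(model.params)
```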


2020 ◽  
Vol 2020 ◽  
pp. 1-6
Author(s):  
Fatmah N. AlMotawah ◽  
Sharat Chandra Pani ◽  
Tala AlKharashi ◽  
Saleh AlKhalaf ◽  
Mohammed AlKhathlan ◽  
...  

Aim. This study aimed to retrospectively compare two-year survival outcomes between teeth with proximal dental caries that were restored with stainless-steel crowns and those that were pulpotomized and then restored with a stainless-steel crown in patients rehabilitated under general anesthesia. Participants and Methods. The records of 131 patients aged two to six years who had stainless-steel crowns placed under general anesthesia and had two-year follow-up were screened. 340 teeth with moderate proximal caries on the radiograph (D2) were included in the study. Of these, 164 teeth were treated with a pulpotomy and stainless-steel crown, while 176 teeth were crowned without a pulpotomy. Tooth type was compared between groups using the Chi-squared test, and Kaplan–Meier survival analysis was performed, with curves plotted based on the two-year outcomes. Results. The sample comprised 59 males (mean age 4.73 years, SD ± 1.4 years) and 72 females (mean age 5.2 years, SD ± 2.0 years). The Kaplan–Meier analysis showed no significant difference in survival outcomes between teeth that had been pulpotomized and those that had not (p = 0.283). Conclusion. Within the limitations of the current study, we can conclude that performing a pulpotomy does not influence the survival outcome of mild/moderate proximal caries restored with stainless-steel crowns under general anesthesia.
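
As a rough illustration of the comparison reported above, a two-group log-rank test over tooth-level follow-up might look like the sketch below; `months_followed`, `failed`, and `pulpotomy` are assumed field names, not the study's dataset.

```python
# Minimal sketch (hypothetical tooth-level table): two-year survival of crowned
# teeth with vs. without a pulpotomy, compared by log-rank test.
import pandas as pd
from lifelines.statistics import logrank_test

def pulpotomy_vs_crown_only(teeth: pd.DataFrame) -> float:
    pulp = teeth[teeth["pulpotomy"] == 1]
    crown = teeth[teeth["pulpotomy"] == 0]
    result = logrank_test(
        pulp["months_followed"], crown["months_followed"],
        event_observed_A=pulp["failed"],   # 1 = restoration failed, 0 = censored
        event_observed_B=crown["failed"],
    )
    return result.p_value  # the study reports p = 0.283 for this comparison
```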


Forests ◽  
2019 ◽  
Vol 10 (3) ◽  
pp. 271 ◽  
Author(s):  
Susan Jones-Held ◽  
Michael Held ◽  
Joe Winstead ◽  
William Bryant

Wind disturbance is an important factor that can affect the development of the forests of the Central Hardwood Region of the United States. However, there have been few long-term studies of the recovery of these systems following wind damage. Long-term studies of protected forest systems, such as Dinsmore Woods in Northern Kentucky, within the fragmented forest of this region are valuable as they provide a resource to document and understand the effects of both abiotic and biotic challenges to forest systems. This study is a 40-year analysis of both overstory and understory changes in the forest system at Dinsmore Woods following damage caused by severe winds in the spring of 1974. The forest was surveyed before and immediately following the windstorm and then at 10-year intervals. Although the windstorm had an immediate effect on the forest, the pattern of damage was complex. The forest canopy (diameter at breast height (DBH) ≥ 30 cm) experienced an irregular pattern of damage, while the subcanopy (DBH < 30 cm) sustained a 25% reduction in total basal area. However, the major effects of the windstorm were delayed and have subsequently altered forest recovery. Ten years following the disturbance, declines were seen in total density and basal area in both the canopy and subcanopy of the forest as a consequence of windstorm damage. In the past 20 years the total basal area of the canopy has increased and now exceeds the pre-disturbance total basal area. In contrast, the subcanopy total basal area continued to decline 20 years post-disturbance and has not recovered. Further openings in the canopy and subcanopy due to the delayed windstorm effects helped establish a dense understory of native shrubs and sugar maple, which has affected tree regeneration and is reflected in the continual decline in species diversity in the subcanopy and sapling strata over the 40-year period.
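
Basal area, the quantity tracked through the surveys above, is computed from stem diameters; a short sketch of that arithmetic follows, with `dbh_cm` and the plot area as assumed inputs rather than the study's actual data structures.

```python
# Minimal sketch (assumed inputs): total basal area per stratum (m^2/ha) from
# DBH measurements in centimeters, split at the 30 cm canopy threshold used above.
import math
import pandas as pd

def basal_area_per_ha(trees: pd.DataFrame, plot_area_ha: float) -> pd.Series:
    # Basal area of one stem = pi * (DBH / 2)^2; dividing DBH (cm) by 200
    # converts the radius to meters, so the result is in m^2.
    stem_ba_m2 = math.pi * (trees["dbh_cm"] / 200.0) ** 2

    # Canopy = DBH >= 30 cm, subcanopy = smaller stems; scale totals to hectares.
    stratum = (trees["dbh_cm"] >= 30).map({True: "canopy", False: "subcanopy"})
    return stem_ba_m2.groupby(stratum).sum() / plot_area_ha
```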


Blood ◽  
2010 ◽  
Vol 116 (21) ◽  
pp. 3453-3453 ◽  
Author(s):  
Stuart L Goldberg ◽  
Marc Elmann ◽  
Mark Kaminetzky ◽  
Eriene-Heidi Sidhom ◽  
Anthony R Mato ◽  
...  

Abstract 3453. Individuals undergoing allogeneic transplantation receive multiple red blood cell transfusions, both as part of the transplant procedure and as part of the pre-transplant care of the underlying disease. Therefore, these patients may be at risk for complications of transfusional iron overload. Several studies have noted that individuals entering transplant with elevated baseline serum ferritin values have decreased overall survival and higher rates of disease relapse. Whether iron is a direct contributor to inferior outcomes or a marker of more advanced disease (thereby requiring greater transfusion) is unclear. Little is known about the incidence and consequences of iron overload among long-term survivors of allogeneic transplantation. Methods: Using Kaplan-Meier and Cox regression analyses, we performed a single-center, retrospective cohort study of consecutive allogeneic transplants performed at Hackensack University Medical Center from January 2002 through June 30, 2009 to determine the association between serum ferritin (measured approximately 1 year post allogeneic transplant) and overall survival. Results: During the study time frame, 637 allogeneic transplants (donor lymphocyte infusion procedures excluded) were performed at our center, and 342 (54%) patients survived ≥1 year. Among 1-year survivors, 240 (70%) had post-transplant serum ferritin values available for review, including 132 (55%) allogeneic sibling, 68 (28%) matched unrelated, and 40 (17%) mismatched unrelated donor transplants. The median post-transplant ferritin value among 1-year survivors of allogeneic transplant was 628 ng/ml (95% CI 17-5010), with 93 (39%) above 1000 ng/ml and 40 (17%) above 2500 ng/ml. The median post-transplant ferritin levels varied by underlying hematologic disease (aplastic anemia = 1147, acute leukemia = 1067, MDS = 944, CLL = 297, CML = 219, lymphoma = 123, multiple myeloma = 90). The Kaplan-Meier projected 5-year survival rate was 76% for the cohort that had survived one year and had available ferritin values. Fifty late deaths have occurred; causes of late death were disease relapse (n=37, 74%), GVHD (n=7, 14%), infection (n=4, 8%), cardiac (n=1, 2%), and second malignancy (n=1, 2%). The 1-year post-transplant serum ferritin value was a significant predictor of long-term survival. Using a cut-off ferritin value of 1000 ng/ml, the 5-year projected survivals were 85% (95% CI 75%-91%) and 64% (95% CI 52%-73%) for the low and high ferritin cohorts, respectively (Figure, log-rank p<0.001), with a hazard ratio of 3.5 (95% CI 2-6.4, p<0.001). Similarly, a serum ferritin value >2500 ng/ml was associated with inferior survival (HR 2.97, p<0.001). Underlying hematologic disease also correlated with 5-year projected survival: 70%, 83%, and 89% for the acute leukemia/MDS, lymphoma/myeloma/CLL, and aplastic anemia/CML groupings, respectively (log-rank p<0.01 for leukemia/MDS vs. other groupings). Patients receiving bone marrow grafts did better than those receiving peripheral blood stem cells (HR = 2.2; p = 0.03). Age, gender, donor type (sibling, matched unrelated, mismatched unrelated), and intensity of regimen (ablative vs. non-myeloablative) were not predictive of inferior survival in univariate analysis.
In the multivariate Cox regression analysis, elevated post-transplant ferritin >1000 ng/ml (HR 3.3, 95% CI 1.6-6.1; p<0.001) and a diagnosis of acute leukemia/MDS (HR 4.5, 95% CI 1.1-18.7; p=0.04) remained independent predictors of inferior survival, even when adjusted for age, gender, type of graft, donor type, and intensity of conditioning regimen. Relapse deaths (25% vs. 9%; p<0.001) and GVHD deaths (6% vs. 0.6%; p=0.03) were more common in the high-ferritin cohort. Conclusions: Among patients who have survived one year following allogeneic transplantation, a post-transplant serum ferritin value greater than 1000 ng/ml is a predictor of inferior long-term outcomes. To our knowledge, this is the first report on the importance of late monitoring of serum ferritin, but it is in agreement with prior studies suggesting that a pre-transplant ferritin value is a predictor of outcomes. Prospective studies attempting to modify outcomes by reducing post-transplant iron overload are needed. Disclosures: No relevant conflicts of interest to declare.
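
A hedged sketch of the multivariable model described above, using lifelines' Cox proportional-hazards fitter; every column name here is an assumption made for illustration, not the study's actual variable coding.

```python
# Minimal sketch (assumed variable names, not the study's coding): landmark Cox
# model for survival beyond 1 year with ferritin dichotomized at 1000 ng/ml.
import pandas as pd
from lifelines import CoxPHFitter

def late_survival_model(df: pd.DataFrame) -> CoxPHFitter:
    # Landmark analysis: restrict to 1-year survivors with a ferritin value.
    cohort = df[(df["survived_1yr"] == 1) & df["ferritin_1yr"].notna()].copy()
    cohort["ferritin_gt_1000"] = (cohort["ferritin_1yr"] > 1000).astype(int)

    covariates = ["years_from_1yr", "died", "ferritin_gt_1000", "age", "male",
                  "pbsc_graft", "unrelated_donor", "myeloablative", "leukemia_mds"]
    cph = CoxPHFitter()
    cph.fit(cohort[covariates], duration_col="years_from_1yr", event_col="died")
    return cph  # cph.print_summary() lists hazard ratios with 95% CIs
```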


2021 ◽  
Vol 8 ◽  
Author(s):  
Sydne Record ◽  
Nicole M. Voelker ◽  
Phoebe L. Zarnetske ◽  
Nathan I. Wisnoski ◽  
Jonathan D. Tonkin ◽  
...  

Global loss of biodiversity and its associated ecosystem services is occurring at an alarming rate and is predicted to accelerate in the future. Metacommunity theory provides a framework to investigate the multi-scale processes that drive change in biodiversity across space and time. Short-term ecological studies across space have advanced our understanding of biodiversity through a metacommunity lens; however, such snapshots in time have been limited in their ability to explain which processes, at which scales, generate observed spatial patterns. Temporal dynamics of metacommunities have been understudied, and large gaps in theory and empirical data have hindered progress in our understanding of the underlying metacommunity processes that give rise to biodiversity patterns. Fortunately, we are at an important point in the history of ecology, where long-term studies with cross-scale spatial replication provide a means to gain a deeper understanding of the multi-scale processes driving biodiversity patterns in time and space and to inform metacommunity theory. The maturation of coordinated research and observation networks, such as the United States Long Term Ecological Research (LTER) program, provides an opportunity to advance explanation and prediction of biodiversity change with observational and experimental data at spatial and temporal scales greater than any single research group could accomplish. Synthesis of LTER network community datasets illustrates that long-term studies with spatial replication are an under-utilized resource for advancing spatio-temporal metacommunity research. We identify challenges to synthesizing these data and present recommendations for addressing them. We conclude with insights about how future monitoring efforts by coordinated research and observation networks could further the development of metacommunity theory and its applications aimed at improving conservation efforts.


2019 ◽  
Vol 29 (3) ◽  
pp. 213-219
Author(s):  
Danielle Brandman ◽  
Hollis Lin ◽  
Anastasia McManus ◽  
Sonalee Agarwal ◽  
Larry M. Gache ◽  
...  

Introduction: Orthotopic liver transplantation has been used as a treatment for hereditary transthyretin-mediated (hATTR) amyloidosis, a rare, progressive, multisystem disease. Research Question: The objective was to evaluate post-liver transplantation survival outcomes in patients with hATTR amyloidosis in the United States and to assess whether previously published prognostic factors of patient survival in hATTR amyloidosis are generalizable to the US population. Design: This cohort study examined patients with hATTR amyloidosis undergoing liver transplant in the United States (N = 168) between March 2002 and March 2016 using data reported to the Organ Procurement and Transplantation Network (OPTN)/United Network for Organ Sharing (UNOS). Results: A multivariable Cox proportional hazards regression model showed that, among all factors tested, only modified body mass index (mBMI; kg/m² × g/L) at the time of transplant was significantly associated with survival. Higher modified BMI was associated with a lower risk of death relative to a reference population (mBMI <600) with historically poor post-transplant outcomes. Patients with modified BMI 1000 to <1200 (hazard ratio [HR] = 0.27; 95% confidence interval [CI] = 0.10-0.73), 1200 to <1400 (HR = 0.20; 95% CI = 0.06-0.75), and ≥1400 (HR = 0.15; 95% CI = 0.04-0.61) exhibited improved adjusted 5-year post-transplant survival of 74%, 80%, and 85%, respectively, versus 33% in the reference population. Discussion: The association between a higher modified BMI at the time of transplant and improved post-transplant survival suggests that the previously published patient selection criterion for modified BMI may not be applicable to the US population.
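
Modified BMI is the product of conventional BMI and serum albumin concentration; a small sketch of how it might be computed and bucketed follows, with `weight_kg`, `height_m`, and `albumin_g_per_l` as assumed column names and bins that only approximate the study's strata.

```python
# Minimal sketch (illustrative column names): modified BMI (kg/m^2 x g/L) and
# survival strata, with <600 as the reference group referenced above.
import pandas as pd

def add_modified_bmi(df: pd.DataFrame) -> pd.DataFrame:
    out = df.copy()
    # Conventional BMI multiplied by serum albumin in g/L.
    out["mbmi"] = out["weight_kg"] / out["height_m"] ** 2 * out["albumin_g_per_l"]
    # Bins approximating the groupings reported above; intermediate cut points
    # between 600 and 1000 are omitted here for brevity.
    out["mbmi_group"] = pd.cut(
        out["mbmi"],
        bins=[0, 600, 1000, 1200, 1400, float("inf")],
        labels=["<600", "600-<1000", "1000-<1200", "1200-<1400", ">=1400"],
        right=False,
    )
    return out
```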


Circulation ◽  
2014 ◽  
Vol 130 (suppl_2) ◽  
Author(s):  
Ziad Taimeh ◽  
Kairav Vakil ◽  
Cindy Martin ◽  
Renuka Jain ◽  
Monica Colvin

Introduction and Hypothesis: Genetic cardiomyopathies (GNCM) are a spectrum of myocardial disorders that can lead to heart failure and frequently portend the need for heart transplantation. Post-transplant outcomes in this subgroup of patients have not been examined in a large, multicenter transplant cohort. Methods: Patients who underwent first-time heart transplantation in the United States between 1987 and 2012 were retrospectively identified from the United Network for Organ Sharing database. Patients with hypertrophic cardiomyopathy (HOCM), arrhythmogenic right ventricular cardiomyopathy (ARVC), and left ventricular non-compaction (LVNC) constituted the GNCM group. The primary outcome was survival. Secondary outcomes included rejection, cardiac allograft vasculopathy (CAV), and graft failure. Results: Of the 49,417 transplant recipients identified, 997 recipients (mean age 36±20 years; 55% male; 79% Caucasian) had GNCM (HOCM n=836; ARVC n=83; LVNC n=78). Patients transplanted for GNCM had significantly higher 1-, 5-, and 10-year survival rates compared with those without GNCM (86%, 76%, and 66% vs. 82%, 69%, and 50%, respectively; log-rank p<0.001) (Figure 1A). After adjusting for age, sex, and race in multivariate Cox regression analysis, GNCM was associated with favorable post-transplant survival, with a hazard ratio of 0.70 (95% confidence interval 0.58-0.86; p=0.001). While the incidence of rejection was similar in GNCM and non-GNCM patients (43% vs. 40%, p=0.11), the incidences of CAV and graft failure were significantly lower in GNCM patients (24% vs. 32%, p<0.001, and 9% vs. 15%, p<0.001, respectively). The survival rates for HOCM, ARVC, and LVNC were all similar to each other but significantly higher than for non-GNCM (log-rank p<0.001) (Figure 1B). Conclusions: Patients with GNCM appear to have better post-transplant survival and graft outcomes than patients transplanted for other cardiomyopathies.
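
Landmark survival rates like the 1-, 5-, and 10-year figures above can be read off a fitted Kaplan-Meier curve; the sketch below assumes hypothetical `gncm`, `years_post_tx`, and `died` columns rather than the registry's field names.

```python
# Minimal sketch (assumed column names): Kaplan-Meier survival at the 1-, 5-,
# and 10-year landmarks for GNCM vs. non-GNCM recipients.
import pandas as pd
from lifelines import KaplanMeierFitter

def survival_at_landmarks(df: pd.DataFrame) -> pd.DataFrame:
    rows = {}
    for label, grp in df.groupby("gncm"):  # gncm: 1 = HOCM/ARVC/LVNC, 0 = other
        kmf = KaplanMeierFitter()
        kmf.fit(grp["years_post_tx"], event_observed=grp["died"])
        # Read survival probabilities off the fitted curve at fixed times.
        rows[label] = kmf.survival_function_at_times([1, 5, 10]).values
    return pd.DataFrame(rows, index=["1-year", "5-year", "10-year"])
```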


2013 ◽  
Vol 2013 ◽  
pp. 1-9 ◽  
Author(s):  
Karima Boubaker ◽  
Tahar Gargah ◽  
Ezzedine Abderrahim ◽  
Taieb Ben Abdallah ◽  
Adel Kheder

Introduction and Aims. Post-transplant tuberculosis (TB) is a threat to successful long-term outcomes in renal transplant recipients. Our objective was to describe the pattern of TB infection, its risk factors, and the prognosis in our transplant recipients. Patients and Methods. This study was a retrospective review of the records of 491 renal transplant recipients at our hospital during the period from January 1986 to December 2009. The demographic data, transplant characteristics, clinical manifestations, diagnostic criteria, treatment protocol, and long-term outcome of this cohort of patients were analyzed. Results. 16 patients (3.2%) developed post-transplant TB, with a mean age of 32.5 ± 12.7 (range: 13-60) years and a mean post-transplant period of 36.6 months (range: 12.3 months-15.9 years). The forms of the disease were pulmonary in 10/16 (62.6%), disseminated in 3/16 (18.7%), and extrapulmonary in 3/16 (18.7%). Graft dysfunction was observed in 7 cases (43.7%), with tissue-proven acute rejection in 3 cases and loss of the graft in 4 cases. Hepatotoxicity developed in 3 patients (18.7%) during treatment. Recurrences were observed in 4 cases after early discontinuation of treatment. Two patients (12.5%) died. Conclusion. Extrapulmonary and disseminated tuberculosis were observed in a third of our patients. More than 9 months of treatment may be necessary to prevent recurrence.

