Demographics and Clinical Features of Postresuscitation Comorbidities in Long-Term Survivors of Out-of-Hospital Cardiac Arrest: A National Follow-Up Study

2017 ◽  
Vol 2017 ◽  
pp. 1-9 ◽  
Author(s):  
Chih-Pei Su ◽  
Jr-Hau Wu ◽  
Mei-Chueh Yang ◽  
Ching-Hui Liao ◽  
Hsiu-Ying Hsu ◽  
...  

The outcome of patients suffering out-of-hospital cardiac arrest (OHCA) is very poor, and postresuscitation comorbidities increase long-term mortality. This study aims to analyze new-onset postresuscitation comorbidities in patients who survived OHCA for over one year. The Taiwan National Health Insurance (NHI) Database was used in this study. Study and comparison groups were created to analyze the risk of new-onset postresuscitation comorbidities from 2011 to 2012 (with follow-up until December 31, 2013). The study group included 1,346 long-term OHCA survivors; the comparison group consisted of 4,038 matched non-OHCA patients. Demographics, patient characteristics, and the risk of comorbidities (using Cox proportional hazards models) were analyzed. We found that urinary tract infections (n=225, 16.72%), pneumonia (n=206, 15.30%), septicemia (n=184, 13.67%), heart failure (n=111, 8.25%), gastrointestinal hemorrhage (n=108, 8.02%), epilepsy or recurrent seizures (n=98, 7.28%), and chronic kidney disease (n=62, 4.61%) were the most common comorbidities. Furthermore, OHCA survivors were at much higher risk than comparison patients of experiencing epilepsy or recurrent seizures (HR = 20.83; 95% CI: 12.24–35.43), septicemia (HR = 8.98; 95% CI: 6.84–11.79), pneumonia (HR = 5.82; 95% CI: 4.66–7.26), and heart failure (HR = 4.88; 95% CI: 3.65–6.53). Most importantly, most comorbidities occurred within the first half-year after OHCA.
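As an illustration of the matched-cohort Cox analysis this abstract describes (not the authors' code), here is a minimal sketch assuming Python with the lifelines library; the data frame and column names are hypothetical stand-ins for the NHI records.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical follow-up data: one row per patient, time to a new-onset
# comorbidity (or censoring) in days, plus a study-group indicator.
df = pd.DataFrame({
    "followup_days": [120, 380, 700, 90, 365, 540, 210, 600],
    "comorbidity":   [1, 0, 1, 1, 0, 0, 1, 0],   # 1 = new-onset event
    "ohca_survivor": [1, 1, 1, 0, 0, 0, 1, 0],   # 1 = study group
    "age":           [62, 55, 70, 64, 58, 69, 73, 61],
})

# A small penalizer stabilises the fit on tiny illustrative data.
cph = CoxPHFitter(penalizer=0.1)
cph.fit(df, duration_col="followup_days", event_col="comorbidity")
cph.print_summary()  # exp(coef) of ohca_survivor is the hazard ratio
```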

Author(s):  
Daniel B Ibsen ◽  
Emily B Levitan ◽  
Agneta Åkesson ◽  
Bruna Gigante ◽  
Alicja Wolk

Abstract Aims Trials demonstrate that following the DASH diet lowers blood pressure, which may prevent the development of heart failure (HF). We investigated the associations of long-term adherence to the DASH diet, and of food substitutions within the DASH diet, with the risk of HF. Methods Men and women aged 45–83 years without previous HF, ischemic heart disease, or cancer at baseline in 1998 from the Cohort of Swedish Men (n = 41,118) and the Swedish Mammography Cohort (n = 35,004) were studied. The DASH diet emphasizes intake of fruit, vegetables, whole grains, nuts and legumes, and low-fat dairy and deemphasizes red and processed meat, sugar-sweetened beverages, and sodium. DASH diet scores were calculated from diet assessed by food frequency questionnaires in late 1997 and 2009. Incidence of HF was ascertained using the Swedish Patient Register. Multivariable Cox proportional hazards models were used to estimate hazard ratios (HRs) with 95% confidence intervals (CIs). Results During a median 22 years of follow-up (1998–2019), 12,164 participants developed HF. Those with the greatest adherence to the DASH diet had a lower risk of HF than those with the lowest adherence (HR 0.85, 95% CI 0.80, 0.91 for baseline diet and HR 0.83, 95% CI 0.78, 0.89 for long-term diet, comparing extreme quintiles). Replacing 1 serving/day of red and processed meat with emphasized DASH diet foods was associated with an 8–12% lower risk of HF. Conclusion Long-term adherence to the DASH diet and relevant food substitutions within the DASH diet were associated with a lower risk of HF.
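The food-substitution estimate can be read as a contrast between two covariates in one Cox model: with both foods' servings/day included, replacing one serving of food A with food B corresponds to exp(beta_B - beta_A). A minimal sketch under that assumption (hypothetical data and column names, lifelines assumed):

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical cohort: servings/day of two food groups, time to HF in years.
df = pd.DataFrame({
    "years":      [10.0, 22.0, 15.5, 22.0, 8.0, 22.0, 12.0, 22.0],
    "hf":         [1, 0, 1, 0, 1, 0, 1, 0],
    "red_meat":   [2.0, 0.5, 1.5, 0.2, 2.5, 1.0, 2.2, 0.4],
    "vegetables": [1.0, 4.0, 2.0, 5.0, 0.5, 3.0, 1.2, 4.5],
})

cph = CoxPHFitter(penalizer=0.1).fit(df, duration_col="years", event_col="hf")

# The usual substitution contrast: the HR for replacing 1 serving/day of
# red meat with 1 serving/day of vegetables is exp(beta_veg - beta_meat).
hr_sub = np.exp(cph.params_["vegetables"] - cph.params_["red_meat"])
print(f"substitution HR: {hr_sub:.2f}")
```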


Circulation ◽  
2007 ◽  
Vol 116 (suppl_16) ◽  
Author(s):  
Toru Aoyama ◽  
Hideki Ishii ◽  
Hiroshi Takahashi ◽  
Takanobu Toriyama ◽  
...  

Background: Cardiovascular (CV) events and mortality are significantly higher in hemodialysis (HD) patients than in the general population. Although predicting CV events in long-term HD patients is of clinical concern, a more powerful predictor remains under exploration. We investigated whether silent brain infarction (SBI) would be a predictive factor for future CV events and mortality in a large cohort of long-term HD patients. Methods: After cranial magnetic resonance imaging (MRI) to detect SBI, 202 long-term HD patients (7.1 ± 5.9 years on HD) without symptomatic stroke were prospectively followed up until the occurrence of CV events (stroke, cardiac events, and death). We analyzed the prognostic role of SBI in CV events with the Kaplan-Meier method and Cox proportional hazards analysis. Results: The prevalence of SBI was considerably higher than in previous reports (71.8% of all patients). Overall, 60 patients suffered a CV event (31 coronary artery disease, 7 congestive heart failure, 14 symptomatic stroke) and 29 patients died (16 of CV causes) during follow-up (mean 23 ± 13 months). In subgroup analysis by the presence of SBI, the 4-year CV event-free survival rate was significantly lower in patients with SBI than in those without (54.6% vs. 86.7%, p=0.0003). CV and overall mortality were also significantly higher in SBI patients than in no-SBI patients (CV mortality: 20.5% vs. 4.3%; overall mortality: 29.0% vs. 9.1%; p<0.01 for both). Cox proportional hazards models showed that the presence of SBI was a significant predictor of cerebrovascular and CV events and of CV and overall mortality, even after adjustment for the other CV risk factors listed in the Table. Conclusion: SBI detected with MRI would be a powerful predictor of CV events and mortality in long-term HD patients. [Table: Hazard ratio (HR) of SBI for future events and mortality]
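A minimal sketch of the Kaplan-Meier comparison and log-rank test used here, assuming lifelines; the follow-up data below are hypothetical:

```python
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

# Hypothetical follow-up data in months with an SBI indicator.
df = pd.DataFrame({
    "months": [12, 30, 48, 6, 24, 48, 18, 48],
    "event":  [1, 1, 0, 1, 0, 0, 1, 0],
    "sbi":    [1, 1, 1, 1, 0, 0, 1, 0],
})
g1, g0 = df[df.sbi == 1], df[df.sbi == 0]

# Event-free survival curves for each subgroup.
km_sbi = KaplanMeierFitter().fit(g1["months"], g1["event"], label="SBI")
km_no = KaplanMeierFitter().fit(g0["months"], g0["event"], label="no SBI")

# Log-rank comparison of the two curves, analogous to the p=0.0003 result.
res = logrank_test(g1["months"], g0["months"], g1["event"], g0["event"])
print(res.p_value)
```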


2021 ◽  
Vol 8 ◽  
Author(s):  
Pei-Pei Zheng ◽  
Si-Min Yao ◽  
Di Guo ◽  
Ling-ling Cui ◽  
Guo-Bin Miao ◽  
...  

Background: The prevalence and prognostic value of heart failure (HF) stages among elderly hospitalized patients are unclear. Methods: We conducted a prospective, observational, multi-center cohort study of 1,068 hospitalized patients aged 65 years or older who were able to cooperate with the assessment and complete an echocardiogram. Two cardiologists classified all participants into HF stages according to the 2013 ACC/AHA HF staging guidelines. The outcome was the rate of major adverse cardiovascular events (MACE) at 1 year. The Kaplan-Meier method and Cox proportional hazards models were used for survival analyses. Survival classification and regression tree analysis was used to determine the optimal cutoff of N-terminal pro-brain natriuretic peptide (NT-proBNP) for predicting MACE. Results: Participants' mean age was 75.3 ± 6.88 years. Of them, 4.7% were healthy without HF risk factors, 21.0% were stage A, 58.7% were stage B, and 15.6% were stage C/D. More advanced HF stage was associated with worse 1-year MACE-free survival (log-rank χ2 = 69.62, P < 0.001). Deterioration from stage B to C/D was associated with a significant increase in hazard (HR 3.636, 95% CI 2.174–6.098, P < 0.001). Patients with NT-proBNP levels over 280.45 pg/mL in stage B (HR 2; 95% CI 1.112–3.597; P = 0.021) and over 11,111.5 pg/mL in stage C/D (HR 2.603; 95% CI 1.014–6.682; P = 0.047) experienced a higher incidence of MACE, adjusted for age, sex, and glomerular filtration rate. Conclusions: HF stage B, rather than stage A, was most common among elderly inpatients. NT-proBNP may help predict MACE in stage B. Trial Registration: ChiCTR1800017204; 07/18/2018.
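The cutoff-finding step can be approximated by an exhaustive log-rank scan, which is effectively what a depth-one survival classification and regression tree does; a sketch under that assumption, with hypothetical data (lifelines assumed):

```python
import numpy as np
import pandas as pd
from lifelines.statistics import logrank_test

def best_logrank_cutoff(df, marker, time, event):
    """Scan candidate cutoffs and keep the one maximising the log-rank
    statistic: a simple stand-in for a depth-one survival tree split."""
    best_cut, best_stat = None, 0.0
    for c in np.quantile(df[marker], np.linspace(0.1, 0.9, 33)):
        hi, lo = df[df[marker] > c], df[df[marker] <= c]
        if len(hi) == 0 or len(lo) == 0:
            continue
        r = logrank_test(hi[time], lo[time], hi[event], lo[event])
        if r.test_statistic > best_stat:
            best_cut, best_stat = c, r.test_statistic
    return best_cut

# Hypothetical stage-B patients: NT-proBNP (pg/mL) and 1-year follow-up.
df = pd.DataFrame({
    "ntprobnp": [90, 150, 300, 500, 120, 800, 260, 70, 420, 1000],
    "months":   [12, 12, 8, 5, 12, 3, 10, 12, 6, 4],
    "mace":     [0, 0, 1, 1, 0, 1, 0, 0, 1, 1],
})
print(best_logrank_cutoff(df, "ntprobnp", "months", "mace"))
```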


Author(s):  
David A. Baran ◽  
Justin Lansinger ◽  
Ashleigh Long ◽  
John M. Herre ◽  
Amin Yehya ◽  
...  

Background: The opioid crisis has led to an increase in available donor hearts, although questions remain about the long-term outcomes associated with the use of these organs. Prior studies have relied on historical information without examining the toxicology results at the time of organ offer. The objectives of this study were to examine the long-term survival of heart transplants in the recent era, stratified by the results of toxicological testing at the time of organ offer, and to compare the toxicology at the time of donation with variables based on reported history. Methods: The United Network for Organ Sharing (UNOS) database, including the donor toxicology field, was requested. Between 2007 and 2017, 23,748 adult heart transplants were performed. UNOS historical variables formed a UNOS Toxicology Score, and the measured toxicology results formed a Measured Toxicology Score. Survival was examined by UNOS Toxicology Score and Measured Toxicology Score, as well as with Cox proportional hazards models incorporating a variety of risk factors. Results: The number and percentage of donors with drug use increased significantly over the study period (P<0.0001). Cox proportional hazards modeling of survival including toxicological and historical data did not demonstrate differences in post-transplant mortality. Combinations of drugs identified by toxicology were not associated with differences in survival. Lower donor age and shorter ischemic time were significantly associated with better survival (P<0.0001). Conclusions: Among donors accepted for transplantation, neither historical nor toxicological evidence of drug use was associated with significant differences in survival. Increasing use of such donors may help alleviate the chronic donor shortage.
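The two scores described in the Methods are, in effect, counts of positive flags carried into a Cox model; a minimal sketch of how such a score might be built, with hypothetical field names and data (lifelines assumed):

```python
import pandas as pd
from lifelines import CoxPHFitter

drug_flags = ["opioids", "cocaine", "amphetamines", "cannabinoids"]

# Hypothetical donor records: measured toxicology flags plus outcomes.
don = pd.DataFrame({
    "opioids":      [1, 0, 0, 1, 0, 1],
    "cocaine":      [0, 0, 1, 0, 0, 1],
    "amphetamines": [0, 0, 0, 1, 0, 0],
    "cannabinoids": [1, 0, 0, 0, 1, 0],
    "years":        [5.0, 9.0, 7.5, 4.0, 10.0, 6.0],
    "death":        [1, 0, 1, 0, 0, 1],
    "donor_age":    [28, 45, 33, 24, 50, 30],
})
don["measured_tox_score"] = don[drug_flags].sum(axis=1)  # count of positives

cph = CoxPHFitter(penalizer=0.1)
cph.fit(don[["years", "death", "measured_tox_score", "donor_age"]],
        duration_col="years", event_col="death")
cph.print_summary()
```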


2021 ◽  
Vol 39 (15_suppl) ◽  
pp. 4578-4578
Author(s):  
Bradley Alexander McGregor ◽  
Daniel M. Geynisman ◽  
Mauricio Burotto ◽  
Camillo Porta ◽  
Cristina Suarez Rodriguez ◽  
...  

Background: Nivolumab in combination with cabozantinib (N+C) demonstrated significantly improved progression-free survival (PFS), objective response rate (ORR), and overall survival (OS) compared with sunitinib as first-line (1L) treatment for aRCC in the phase 3 CheckMate (CM) 9ER trial. As there are no head-to-head trials comparing N+C with pembrolizumab in combination with axitinib (P+A), this study compared the efficacy of N+C with P+A as 1L treatment for aRCC. Methods: A matching-adjusted indirect comparison (MAIC) was conducted using individual patient data on N+C (N = 323) from the CM 9ER trial (median follow-up: 23.5 months) and published data on P+A (N = 432) from the KEYNOTE (KN)-426 trial (median follow-up: 30.6 months). Individual patients in the CM 9ER trial population were reweighted to match the key patient characteristics published for the KN-426 trial, including age, gender, previous nephrectomy, International Metastatic RCC Database Consortium risk score, and sites of metastasis. After weighting, hazard ratios (HRs) for PFS, duration of response (DoR), and OS comparing N+C vs. P+A were estimated using weighted Cox proportional hazards models, and ORR was compared using a weighted Wald test. All comparisons were conducted using the corresponding sunitinib arms as an anchor. Results: After weighting, patient characteristics in the CM 9ER trial were comparable to those in the KN-426 trial. In the weighted population, N+C had a median PFS of 19.3 months (95% CI: 15.2, 22.4) compared to 15.7 months (95% CI: 13.7, 20.6) for P+A. Using sunitinib as an anchor arm, N+C was associated with a 30% reduction in the risk of progression or death compared to P+A (HR: 0.70, 95% CI: 0.53, 0.93; P = 0.015; Table). In addition, N+C was associated with a numerically, although not statistically significantly, greater improvement in ORR vs. sunitinib (difference: 8.4%, 95% CI: -1.7%, 18.4%; P = 0.105) and improved DoR (HR: 0.79; 95% CI: 0.47, 1.31; P = 0.359). Similar OS outcomes were observed for N+C and P+A (HR: 0.99; 95% CI: 0.67, 1.44; P = 0.940). Conclusions: After adjusting for cross-trial differences, N+C had a more favorable efficacy profile than P+A, including statistically significant PFS benefits, numerically improved ORR and DoR, and similar OS. [Table: see text]
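The MAIC reweighting step can be sketched with the standard method-of-moments approach (Signorovitch et al.): center the individual patient data at the comparator trial's published means and solve for weights of the form exp(x'beta). The matching variables and values below are hypothetical:

```python
import numpy as np
from scipy.optimize import minimize

def maic_weights(X_ipd, target_means):
    """Weights w_i = exp(x_c_i @ beta) with covariates centered at the
    aggregate target means; minimising sum(exp(X_c @ beta)) forces the
    weighted IPD means to equal the target means."""
    X_c = X_ipd - target_means
    res = minimize(lambda b: np.exp(X_c @ b).sum(),
                   np.zeros(X_c.shape[1]), method="BFGS")
    return np.exp(X_c @ res.x)

# Hypothetical matching variables: age and prior nephrectomy (0/1).
X = np.array([[60.0, 1.0], [55.0, 0.0], [68.0, 1.0], [62.0, 1.0],
              [58.0, 0.0], [66.0, 1.0]])
w = maic_weights(X, target_means=np.array([61.0, 0.7]))
print((X * w[:, None]).sum(axis=0) / w.sum())  # ~ [61.0, 0.7]
```

The resulting weights would then be passed to a weighted Cox model (e.g. a weights column in lifelines) for the anchored comparison.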


2019 ◽  
Vol 35 (3) ◽  
pp. 488-495 ◽  
Author(s):  
Thijs T Jansz ◽  
Marlies Noordzij ◽  
Anneke Kramer ◽  
Eric Laruelle ◽  
Cécile Couchoud ◽  
...  

Abstract Background Previous US studies have indicated that haemodialysis with ≥6-h sessions [extended-hours haemodialysis (EHD)] may improve patient survival. However, patient characteristics and treatment practices vary between the USA and Europe. We therefore investigated the effect of EHD three times weekly on survival compared with conventional haemodialysis (CHD) among European patients. Methods We included patients who were treated with haemodialysis between 2010 and 2017 from eight countries providing data to the European Renal Association–European Dialysis and Transplant Association Registry. Haemodialysis session duration and frequency were recorded once every year or at every change of haemodialysis prescription and were categorized into three groups: CHD (three times weekly, 3.5–4 h/treatment), EHD (three times weekly, ≥6 h/treatment) or other. In the primary analyses we attributed death to the treatment at the time of death and in secondary analyses to EHD if ever initiated. We compared mortality risk for EHD with CHD by causal inference from marginal structural models, using Cox proportional hazards models weighted for the inverse probability of treatment and censoring and adjusted for potential confounders. Results From a total of 142,460 patients, 1,338 patients were ever treated with EHD (three times weekly, 7.1 ± 0.8 h/treatment) and 89,819 patients were treated exclusively with CHD (three times weekly, 3.9 ± 0.2 h/treatment). Crude mortality rates were 6.0 and 13.5 per 100 person-years, respectively. In the primary analyses, patients treated with EHD had an adjusted hazard ratio (HR) of 0.73 [95% confidence interval (CI) 0.62–0.85] compared with patients treated with CHD. When we attributed all deaths to EHD after initiation, the HR for EHD was comparable to the primary analyses [HR 0.80 (95% CI 0.71–0.90)]. Conclusions EHD is associated with better survival in European patients treated with haemodialysis three times weekly.
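A minimal sketch of an inverse-probability-of-treatment-weighted Cox fit of the kind used in marginal structural models, with a logistic propensity model; the registry fields here are hypothetical (lifelines and scikit-learn assumed), and censoring weights are omitted for brevity:

```python
import pandas as pd
from lifelines import CoxPHFitter
from sklearn.linear_model import LogisticRegression

# Hypothetical registry extract: ehd = 1 extended-hours, 0 conventional.
df = pd.DataFrame({
    "years":    [2.0, 5.5, 3.0, 7.0, 1.5, 6.0, 4.0, 8.0],
    "death":    [1, 0, 1, 0, 1, 0, 1, 0],
    "ehd":      [0, 1, 0, 1, 0, 1, 0, 1],
    "age":      [70, 64, 60, 56, 74, 68, 62, 58],
    "diabetes": [1, 0, 1, 0, 1, 0, 0, 0],
})

# Unstabilised IPTW from a logistic propensity-score model.
ps = LogisticRegression().fit(df[["age", "diabetes"]], df["ehd"])
p = ps.predict_proba(df[["age", "diabetes"]])[:, 1]
df["iptw"] = df["ehd"] / p + (1 - df["ehd"]) / (1 - p)

# Weighted Cox model with robust (sandwich) standard errors.
cph = CoxPHFitter(penalizer=0.1)
cph.fit(df[["years", "death", "ehd", "iptw"]], duration_col="years",
        event_col="death", weights_col="iptw", robust=True)
cph.print_summary()  # exp(coef) for ehd ~ the adjusted HR
```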


2011 ◽  
Vol 29 (12) ◽  
pp. 1599-1606 ◽  
Author(s):  
Kimmie Ng ◽  
Daniel J. Sargent ◽  
Richard M. Goldberg ◽  
Jeffrey A. Meyerhardt ◽  
Erin M. Green ◽  
...  

Purpose Previous studies have suggested that higher plasma 25-hydroxyvitamin D3 [25(OH)D] levels are associated with decreased colorectal cancer risk and improved survival, but the prevalence of vitamin D deficiency in advanced colorectal cancer and its influence on outcomes are unknown. Patients and Methods We prospectively measured plasma 25(OH)D levels in 515 patients with stage IV colorectal cancer participating in a randomized trial of chemotherapy. Vitamin D deficiency was defined as 25(OH)D lower than 20 ng/mL, insufficiency as 20 to 29 ng/mL, and sufficiency as ≥30 ng/mL. We examined the association between baseline 25(OH)D level and selected patient characteristics. Cox proportional hazards models were used to calculate hazard ratios (HRs) for death, disease progression, and tumor response, adjusted for prognostic factors. Results Among 515 eligible patients, 50% were vitamin D deficient and 82% were vitamin D insufficient. Plasma 25(OH)D levels were lower in black patients than in white patients and patients of other races (median, 10.7 v 21.1 v 19.3 ng/mL, respectively; P < .001) and in females than in males (median, 18.3 v 21.7 ng/mL; P = .0005). Baseline plasma 25(OH)D levels were not associated with patient outcome, although given the distribution of plasma levels in this cohort, statistical power for the survival analyses was limited. Conclusion Vitamin D deficiency is highly prevalent among patients with stage IV colorectal cancer receiving first-line chemotherapy, particularly among black and female patients.
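The three-level exposure in the Methods is a simple binning of the continuous 25(OH)D measurement; a sketch with hypothetical values, using the study's cutoffs:

```python
import pandas as pd

# Hypothetical plasma 25(OH)D values in ng/mL, binned with the study cutoffs:
# deficient < 20, insufficient 20-29, sufficient >= 30.
d = pd.Series([12.0, 24.5, 33.1, 18.7, 29.9, 41.0])
status = pd.cut(d, bins=[0, 20, 30, float("inf")], right=False,
                labels=["deficient", "insufficient", "sufficient"])
print(status.value_counts())

# Indicator columns for use as Cox covariates (deficient as the reference).
dummies = pd.get_dummies(status, drop_first=True)
```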


2021 ◽  
Vol 5 (Supplement_2) ◽  
pp. 1110-1110
Author(s):  
Dong Zhen ◽  
John Richie Jr ◽
Xiang Gao ◽  
Biyi Shen ◽  
David Orentreich

Abstract Objectives Increasing evidence from animal models and humans suggests that diets high in sulfur-containing amino acids (SAA) could be associated with a greater risk of type 2 diabetes (T2D). However, data from longitudinal human studies linking dietary SAA intake with T2D are lacking. The present study aimed to examine the association between long-term dietary intake of SAA, including total SAAs, methionine, and cysteine, and incident T2D in participants of the Framingham Heart Study (FHS). Methods Adult participants were selected from two prospective FHS cohorts: the Offspring Cohort (followed from 1991 to 2015, n = 3,799) and the Third Generation Cohort (followed from 2002 to 2011, n = 4,096). Individuals identified as diabetes patients before baseline, with missing diet or covariate data, or who reported extreme daily energy intakes were excluded. Energy-adjusted intake of dietary SAAs was calculated from responses to a 131-item food frequency questionnaire. Cox proportional hazards models were used to evaluate associations between intakes of SAAs (in quintiles) and risk of T2D in each cohort. A combined analysis was also performed pooling subjects from both cohorts. Results Overall, we documented 471 T2D events during 9–23 years of follow-up. In both cohorts, higher SAA intake was associated with a higher risk of T2D after adjustment for demographics, traditional risk factors, and related nutrients. Comparing participants in the highest quintile of intake with those in the lowest, adjusted hazard ratios (95% CI) were 1.98 (1.15–3.41) for total intake (P for trend = 0.04) in the Offspring Cohort and 4.37 (1.40–13.67) (P for trend = 0.01) in the Third Generation Cohort. In the combined analysis of the two cohorts, adjusted hazard ratios (95% CI) were 1.98 (1.23–3.21) for total intake, 2.21 (1.38–3.53) for methionine, and 1.79 (1.12–2.87) for cysteine (P for trends < 0.03). Conclusions Higher long-term SAA intake was associated with a higher risk of T2D in humans, suggesting that dietary patterns emphasizing low SAA intake may protect against the development of T2D. Funding Sources No funding.
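Energy-adjusted nutrient intake is typically computed with the residual method (regress the nutrient on total energy, keep the residual, re-center at the mean) before forming quintiles; a sketch under that assumption, with hypothetical FFQ-derived values:

```python
import numpy as np
import pandas as pd

def energy_adjust(nutrient, energy):
    """Residual-method energy adjustment: remove the linear dependence of a
    nutrient on total energy intake, then re-center at the mean intake."""
    slope, intercept = np.polyfit(energy, nutrient, 1)
    residual = nutrient - (intercept + slope * energy)
    return residual + nutrient.mean()

# Hypothetical intakes: total SAA (g/day) and total energy (kcal/day).
df = pd.DataFrame({
    "saa_g": [2.1, 3.4, 2.8, 4.0, 1.9, 3.1, 2.5, 3.8, 2.2, 3.0],
    "kcal":  [1800, 2600, 2100, 2900, 1700, 2400, 2000, 2800, 1900, 2300],
})
df["saa_adj"] = energy_adjust(df["saa_g"], df["kcal"])
df["saa_quintile"] = pd.qcut(df["saa_adj"], 5, labels=False) + 1  # 1..5
```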


2021 ◽  
pp. 1-14 ◽  
Author(s):  
Olga Mitelman ◽  
Hoda Z. Abdel-Hamid ◽  
Barry J. Byrne ◽  
Anne M. Connolly ◽  
Peter Heydemann ◽  
...  

Background: Studies 4658-201/202 (201/202) evaluated treatment effects of eteplirsen over 4 years in patients with Duchenne muscular dystrophy and confirmed exon-51 amenable genetic mutations. Chart review Study 4658-405 (405) further followed these patients while they received eteplirsen during usual clinical care. Objective: To compare long-term clinical outcomes of eteplirsen-treated patients from Studies 201/202/405 with those of external controls. Methods: Median total follow-up time was approximately 6 years of eteplirsen treatment. Outcomes included loss of ambulation (LOA) and percent-predicted forced vital capacity (FVC%p). Time to LOA was compared between eteplirsen-treated patients and standard of care (SOC) external controls and was measured from eteplirsen initiation in 201/202 or, in the SOC group, from the first study visit. Comparisons were conducted using univariate Kaplan-Meier analyses and log-rank tests, and multivariate Cox proportional hazards models with regression adjustment for baseline characteristics. Annual change in FVC%p was compared between eteplirsen-treated patients and natural history study patients using linear mixed models with repeated measures. Results: Data were included from all 12 patients in Studies 201/202 and the 10 patients with available data from Study 405. Median age at LOA was 15.16 years. Eteplirsen-treated patients experienced a statistically significantly longer median time to LOA, by 2.09 years (5.09 vs. 3.00 years, p < 0.01), and significantly attenuated rates of pulmonary decline vs. natural history patients (FVC%p change: –3.3 vs. –6.0 percentage points annually, p < 0.0001). Conclusions: Study 405 highlights the functional benefits of eteplirsen on ambulatory and pulmonary function outcomes up to 7 years of follow-up in comparison to external controls.
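The FVC%p comparison uses a linear mixed model with repeated measures; a minimal sketch with statsmodels and hypothetical longitudinal data, where the years:etep interaction estimates the group difference in annual slope (random intercept per patient for simplicity):

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical longitudinal data: one row per patient-visit.
long = pd.DataFrame({
    "pid":    [1, 1, 1, 2, 2, 2, 3, 3, 3, 4, 4, 4],
    "years":  [0, 1, 2] * 4,                      # years since baseline
    "etep":   [1, 1, 1, 1, 1, 1, 0, 0, 0, 0, 0, 0],
    "fvc_pp": [85, 82, 78, 80, 77, 74, 88, 82, 76, 83, 77, 71],
})

# Random intercept per patient; the years:etep coefficient is how much
# slower FVC%p declines per year in the treated group.
m = smf.mixedlm("fvc_pp ~ years * etep", long, groups="pid").fit()
print(m.summary())
```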

