Mortality Rates Do Not Differ among Patients Prescribed Various Vitamin D Agents

2015 ◽  
Vol 35 (1) ◽  
pp. 62-69 ◽  
Author(s):  
T. Christopher Bond ◽  
Steve Wilson ◽  
John Moran ◽  
Mahesh Krishnan

Background: Limited well-controlled research exists examining the impact of different formulations of oral vitamin D on clinical outcomes in dialysis patients, specifically those on peritoneal dialysis. For this retrospective mortality analysis, we compared mortality rates of patients on 3 of the most commonly prescribed vitamin D agents. Methods: We examined 2 years (7/1/2008 to 6/30/2010) of oral medication records of peritoneal dialysis patients from a large US dialysis organization. Patients were identified whose physicians prescribed a single form of vitamin D (calcitriol, paricalcitol, or doxercalciferol) for ≥ 90% of all patient-months. We excluded incident patients (< 90 days on dialysis) and patients whose physicians treated < 5 peritoneal dialysis patients at a dialysis facility, and we assessed mortality. Results: The analysis inclusion criteria identified 1,707 patients. The subset in this analysis included 12.6% of all prevalent peritoneal dialysis patients and 11.8% of prevalent patient-months. Patients whose physicians predominantly prescribed calcitriol had a lower mortality rate, 9.33 (confidence interval [CI] 7.06–11.60) deaths per 100 patient-years, than the doxercalciferol (12.20; CI 9.34–15.06) or paricalcitol (12.27; CI 9.27–15.28) groups. However, these differences were not statistically significant. A Cox proportional hazards model adjusting for differences in age, vintage, gender, race, body mass index, and comorbidities also showed no significant differences. Conclusions: For this peritoneal dialysis population, instrumental variable analyses showed no significant difference in mortality among patients taking the most common oral vitamin D formulations (calcitriol, doxercalciferol, paricalcitol).
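
A mortality rate per 100 patient-years with a 95% CI of the kind quoted above can be computed from raw death counts and total follow-up time. This is a minimal Python sketch using an exact Poisson (Garwood) interval; the counts are hypothetical, since the abstract reports only the resulting rates, and the paper's own interval method may differ:

```python
from scipy.stats import chi2

def rate_per_100py(deaths, patient_years, alpha=0.05):
    """Death rate per 100 patient-years with an exact (Garwood) Poisson CI."""
    rate = 100 * deaths / patient_years
    lo = 100 * chi2.ppf(alpha / 2, 2 * deaths) / (2 * patient_years) if deaths > 0 else 0.0
    hi = 100 * chi2.ppf(1 - alpha / 2, 2 * (deaths + 1)) / (2 * patient_years)
    return rate, lo, hi

# Hypothetical counts for illustration only
for group, deaths, py in [("calcitriol", 62, 664), ("doxercalciferol", 70, 574)]:
    r, lo, hi = rate_per_100py(deaths, py)
    print(f"{group}: {r:.2f} deaths/100 PY (95% CI {lo:.2f}-{hi:.2f})")
```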

2014 ◽  
Vol 34 (3) ◽  
pp. 289-298 ◽  
Author(s):  
Jernej Pajek ◽  
Alastair J. Hutchison ◽  
Shiv Bhutani ◽  
Paul E.C. Brenchley ◽  
Helen Hurst ◽  
...  

Background: We performed a review of a large incident peritoneal dialysis cohort to establish the impact of current practice and that of switching to hemodialysis. Methods: Patients starting peritoneal dialysis between 2004 and 2010 were included and clinical data at the start of dialysis recorded. Competing risk analysis and a Cox proportional hazards model with a time-varying covariate (technique failure) were used. Results: Of 286 patients (median age 57 years) followed for a median of 24.2 months, 76 were transplanted and 102 died. Outcome probabilities at 3 and 5 years, respectively, were 0.69 and 0.53 for patient survival (or transplantation) and 0.33 and 0.42 for technique failure. Peritonitis caused technique failure in 42%, but ultrafiltration failure accounted for only 6.3%. Davies comorbidity grade, creatinine, and obesity (but not residual renal function or age) predicted technique failure. Owing to peritonitis deaths, technique failure was an independent predictor of death hazard. When successful switch to hemodialysis (surviving more than 60 days after technique failure) and its timing were analyzed, no adverse impact on survival was found in adjusted analysis. However, hemodialysis via a central venous line was associated with an elevated death hazard compared with staying on peritoneal dialysis or hemodialysis through a fistula (adjusted hazard ratio 1.97 [1.02–3.80]). Conclusions: Once patients survive the first 60 days after technique failure, the switch to hemodialysis does not adversely affect patient outcomes. The nature of vascular access has a significant impact on outcome after peritoneal dialysis failure.
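
Competing-risks probabilities like the technique-failure estimates above are obtained with a cumulative incidence estimator rather than 1 minus Kaplan-Meier, because death and transplantation compete with technique failure. This is a minimal sketch with lifelines' Aalen-Johansen estimator on synthetic data (event codes invented for illustration; the time-varying covariate part of the analysis would use lifelines' CoxTimeVaryingFitter in the same spirit):

```python
import numpy as np
from lifelines import AalenJohansenFitter

rng = np.random.default_rng(0)
n = 286
durations = rng.exponential(30, n)                 # months of follow-up, synthetic
# 0 = censored, 1 = technique failure, 2 = death/transplant (competing event)
events = rng.choice([0, 1, 2], size=n, p=[0.3, 0.4, 0.3])

ajf = AalenJohansenFitter()
ajf.fit(durations, events, event_of_interest=1)
cif = ajf.cumulative_density_    # cumulative incidence of technique failure
print(cif.tail())                # probability of technique failure over time
```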


2019 ◽  
Vol 40 (Supplement_1) ◽  
Author(s):  
M Fukunaga ◽  
K Hirose ◽  
A Isotani ◽  
T Morinaga ◽  
K Ando

Abstract Background The relationship between atrial fibrillation (AF) and heart failure (HF) is often compared to the proverbial question of which came first, the chicken or the egg. Some patients with AF on admission for HF are restored to sinus rhythm (SR) by discharge. Whether restoration of SR during hospitalization prevents rehospitalization has not been well elucidated. Purpose To investigate the impact of restoration of SR during hospitalization on the readmission rate of HF patients with AF. Methods We enrolled 640 consecutive HF patients hospitalized from January 2015 to December 2015. Patient data were retrospectively collected from medical records. Patients with AF on admission that had never been recognized before were defined as “incident AF”; patients with AF diagnosed before admission were defined as “prevalent AF”. The primary endpoint was a composite of death from cardiovascular disease or hospitalization for worsening heart failure. Secondary endpoints were death from cardiovascular disease, unplanned hospitalization related to heart failure, and any hospitalization. Results During a mean follow-up of 19 months, 139 patients (22%) were categorized as incident AF and 145 patients (23%) as prevalent AF. Among the 239 patients with AF on admission, 44 were discharged in SR (39 with incident AF and 5 with prevalent AF). Among incident AF patients, the primary composite endpoint occurred significantly less often in those discharged in SR (19% vs. 42% at 1 year; 23% vs. 53% at 2 years of follow-up, p=0.005). When risk factors for readmission due to HF were compared with a Cox proportional hazards model, AF only during hospitalization (hazard ratio [HR]=0.37, p<0.01) and prevalent AF (HR=1.67, p=0.04) were significantly associated. There was no significant difference depending on LVEF. Conclusion Newly diagnosed AF with restoration of SR during hospitalization was a good marker for predicting future prognosis.
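
Hazard ratios like those above come from a Cox regression with group indicators entered as covariates. This is a hypothetical lifelines sketch of that setup on synthetic data (column names are illustrative, not from the study):

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 640
df = pd.DataFrame({
    "time_months": rng.exponential(19, n),           # follow-up time
    "readmitted": rng.integers(0, 2, n),             # 1 = HF readmission
    "af_only_during_hosp": rng.integers(0, 2, n),    # discharged in SR
    "prevalent_af": rng.integers(0, 2, n),
    "age": rng.normal(75, 10, n),
})

cph = CoxPHFitter()
cph.fit(df, duration_col="time_months", event_col="readmitted")
# HR with 95% CI for each covariate
print(cph.summary[["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%", "p"]])
```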


Crisis ◽  
2018 ◽  
Vol 39 (1) ◽  
pp. 27-36 ◽  
Author(s):  
Kuan-Ying Lee ◽  
Chung-Yi Li ◽  
Kun-Chia Chang ◽  
Tsung-Hsueh Lu ◽  
Ying-Yeh Chen

Abstract. Background: We investigated the age at exposure to parental suicide and the risk of subsequent suicide completion in young people. The impact of parental and offspring sex was also examined. Method: Using a cohort study design, we linked Taiwan's Birth Registry (1978–1997) with Taiwan's Death Registry (1985–2009) and identified 40,249 children who had experienced maternal suicide (n = 14,431), paternal suicide (n = 26,887), or the suicide of both parents (n = 281). Each exposed child was matched to 10 children of the same sex and birth year whose parents were still alive. This yielded a total of 398,081 children for our non-exposed cohort. A Cox proportional hazards model was used to compare the suicide risk of the exposed and non-exposed groups. Results: Compared with the non-exposed group, offspring who were exposed to parental suicide were 3.91 times (95% confidence interval [CI] = 3.10–4.92) more likely to die by suicide after adjusting for baseline characteristics. The risk of suicide seemed to be lower in older male offspring (HR = 3.94, 95% CI = 2.57–6.06), but higher in older female offspring (HR = 5.30, 95% CI = 3.05–9.22). Stratified analyses based on parental sex revealed similar patterns to the combined analysis. Limitations: As only register-based data were used, we were not able to explore the impact of variables not contained in the data set, such as the role of mental illness. Conclusion: Our findings suggest a prominent elevation in the risk of suicide among offspring who lost their parents to suicide. The risk elevation differed according to the sex of the afflicted offspring as well as their age at exposure.
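
The 10:1 matching on sex and birth year described above is a simple exact-matching step. This is a hypothetical pandas sketch of one way to build such a non-exposed cohort (column names assumed, not from the registry):

```python
import pandas as pd

def match_controls(exposed, pool, k=10, keys=("sex", "birth_year"), seed=42):
    """Draw up to k unexposed controls per exposed child, exactly matched
    on the key columns (same sex and birth year)."""
    keys = list(keys)
    out = []
    for _, grp in exposed.groupby(keys):
        # all pool rows sharing this stratum's sex and birth year
        candidates = pool.merge(grp[keys].drop_duplicates(), on=keys)
        take = min(len(candidates), k * len(grp))   # fewer if the stratum is small
        out.append(candidates.sample(n=take, random_state=seed))
    return pd.concat(out, ignore_index=True)
```

Strata are disjoint on the key columns, so no control is reused across strata; a stratum with too few candidates simply contributes fewer than k controls per exposed child, which is consistent with the cohort totaling 398,081 rather than a full 402,490.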


2020 ◽  
Vol 41 (Supplement_2) ◽  
Author(s):  
A Funada ◽  
Y Goto ◽  
T Maeda ◽  
H Okada ◽  
M Takamura

Abstract Background/Introduction A shockable rhythm after cardiac arrest is strongly expected after early initiation of bystander cardiopulmonary resuscitation (CPR) owing to increased coronary perfusion. However, the relationship between bystander CPR and initial shockable rhythm in patients with out-of-hospital cardiac arrest (OHCA) remains unclear. We hypothesized that chest-compression-only CPR (CC-CPR) before emergency medical service (EMS) arrival has an effect on the likelihood of an initial shockable rhythm equivalent to that of standard CPR (chest compression plus rescue breathing [S-CPR]). Purpose We aimed to examine the rate of initial shockable rhythm and 1-month outcomes in patients who received bystander CPR after OHCA. Methods The study included 59,688 patients (age ≥18 years) who received bystander CPR after an OHCA of presumed cardiac origin witnessed by a layperson, drawn from a prospectively recorded Japanese nationwide Utstein-style database from 2013 to 2017. Patients who received public-access defibrillation before arrival of EMS personnel were excluded. The patients were divided into CC-CPR (n=51,520) and S-CPR (n=8168) groups according to the type of bystander CPR received. The primary endpoint was an initial shockable rhythm recorded by EMS personnel just after arrival at the site. The secondary endpoint was the 1-month outcomes (survival and neurologically intact survival) after OHCA. In the statistical analyses, a Cox proportional hazards model was applied to reflect the different bystander CPR durations before/after propensity score (PS) matching. Results The crude rate of initial shockable rhythm in the CC-CPR group (21.3%, 10,946/51,520) was significantly higher than that in the S-CPR group (17.6%, 1441/8168, p<0.0001) before PS matching. However, no significant difference in the rate of initial shockable rhythm was found between the 2 groups after PS matching (18.3% [1493/8168] vs 17.6% [1441/8168], p=0.30). In the Cox proportional hazards model, CC-CPR was negatively associated with initial shockable rhythm compared with S-CPR before PS matching (unadjusted hazard ratio [HR], 0.97; 95% confidence interval [CI], 0.94–0.99; p=0.012; adjusted HR, 0.92; 95% CI, 0.89–0.94; p<0.0001). After PS matching, however, no significant difference was found between the 2 groups (adjusted HR of CC-CPR compared with S-CPR, 0.97; 95% CI, 0.94–1.00; p=0.09). No significant differences were found between CC-CPR and S-CPR in the 1-month outcomes after PS matching: survival, 8.5% vs 10.1% (adjusted odds ratio, 0.89; 95% CI, 0.79–1.00; p=0.07); cerebral performance category 1 or 2, 5.5% vs 6.9% (adjusted odds ratio, 0.86; 95% CI, 0.74–1.00; p=0.052). Conclusions Compared with S-CPR, CC-CPR before EMS arrival had an equivalent multivariable-adjusted association with the likelihood of an initial shockable rhythm in patients with OHCA of presumed cardiac cause witnessed by a layperson. Funding Acknowledgement Type of funding source: None
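
Propensity score matching of the kind used here is commonly implemented as a logistic model for treatment assignment followed by nearest-neighbor matching on the estimated score. This is a minimal, hypothetical Python sketch (covariates invented for illustration; the study's actual PS model is not described in the abstract):

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(2)
n = 2000
df = pd.DataFrame({
    "cc_cpr": rng.integers(0, 2, n),              # 1 = compression-only CPR
    "age": rng.normal(70, 12, n),
    "male": rng.integers(0, 2, n),
    "collapse_to_cpr_min": rng.exponential(3, n),
})

X = df[["age", "male", "collapse_to_cpr_min"]]
df["ps"] = LogisticRegression(max_iter=1000).fit(X, df["cc_cpr"]).predict_proba(X)[:, 1]

treated, control = df[df.cc_cpr == 1], df[df.cc_cpr == 0]
nn = NearestNeighbors(n_neighbors=1).fit(control[["ps"]])
_, idx = nn.kneighbors(treated[["ps"]])
matched_controls = control.iloc[idx.ravel()]      # 1:1 match, with replacement
print(len(treated), "treated matched to", len(matched_controls), "controls")
```

A caliper on the score distance and matching without replacement are common refinements; the choice affects how many pairs survive, as in the 8168-pair matched sample above.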


Author(s):  
Anwar Santoso ◽  
Yulianto Yulianto ◽  
Hendra Simarmata ◽  
Abhirama Nofandra Putra ◽  
Erlin Listiyaningsih

Abstract Major adverse cardio-cerebrovascular events (MACCE) in ST-segment elevation myocardial infarction (STEMI) remain frequent despite advances in pharmacology and interventional procedures. Proprotein convertase subtilisin/kexin type 9 (PCSK9) is a serine protease regulating lipid metabolism that is associated with inflammation in acute coronary syndrome, and MACCE may be related to polymorphisms in PCSK9. A prospective cohort observational study was designed to confirm the association between the E670G and R46L polymorphisms in the PCSK9 gene and MACCE in STEMI. A Cox proportional hazards model and Spearman correlation were utilized in the study. PCSK9 genotyping and ELISA assays were performed. Sixty-five of 423 STEMI patients experienced MACCE within 6 months. The E670G polymorphism in PCSK9 was associated with MACCE (hazard ratio = 45.40; 95% confidence interval: 5.30–390.30; p = 0.00). There was a significant difference in plasma PCSK9 levels between patients with previous statin consumption (310 [220–1,220] pg/mL) and those free of any statins (280 [190–1,520] pg/mL) (p = 0.001). The E670G polymorphism of PCSK9 was associated with MACCE in STEMI within a 6-month follow-up. Plasma PCSK9 levels were higher in statin users.
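
The abstract compares median PCSK9 levels between statin users and non-users without naming the test; skewed biomarker distributions reported as medians with ranges are commonly compared with a nonparametric test such as Mann-Whitney U. This is a hypothetical SciPy sketch on synthetic values, not the study's data or necessarily its method:

```python
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(3)
# Synthetic right-skewed PCSK9-like values (pg/mL)
statin_users = rng.lognormal(mean=5.75, sigma=0.3, size=200)
statin_free = rng.lognormal(mean=5.65, sigma=0.3, size=223)

stat, p = mannwhitneyu(statin_users, statin_free, alternative="two-sided")
print(f"median users={np.median(statin_users):.0f}, "
      f"median free={np.median(statin_free):.0f}, Mann-Whitney p={p:.3f}")
```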


Circulation ◽  
2020 ◽  
Vol 142 (Suppl_3) ◽  
Author(s):  
Jian-jun Li ◽  
Yexuan Cao ◽  
Hui-Wen Zhang ◽  
Jing-Lu Jin ◽  
Yan Zhang ◽  
...  

Introduction: Recent guidelines have underlined the atherogenicity of residual cholesterol (RC), which has been linked to coronary artery disease (CAD), especially in patients with diabetes mellitus (DM). Hypothesis: This study aimed to examine the prognostic value of plasma RC, clinically presented as triglyceride-rich lipoprotein cholesterol (TRL-C) or remnant-like lipoprotein particle cholesterol (RLP-C), in CAD patients with different glucose metabolism statuses. Methods: Fasting plasma TRL-C and RLP-C levels were directly calculated or measured in 4331 patients with CAD. Patients were followed for incident MACEs for up to 8.6 years and categorized according to both glucose metabolism status [DM, pre-DM, normal glycaemia regulation (NGR)] and RC levels. A Cox proportional hazards model was used to calculate hazard ratios (HRs) with 95% confidence intervals. Results: During a mean follow-up of 5.1 years, 541 (12.5%) MACEs occurred. The risk of MACEs was significantly higher in patients with elevated RC levels after adjustment for potential confounders. No significant difference in MACEs was observed between the pre-DM and NGR groups (p>0.05). When stratified by glucose metabolism status and RC levels, the highest levels of RLP-C, calculated TRL-C, and measured TRL-C were significant and independent predictors of developing MACEs in pre-DM (HR: 2.10, 1.98, and 1.92, respectively; all p<0.05) and DM (HR: 2.25, 2.00, and 2.16, respectively; all p<0.05). Conclusions: In this large cohort study with long-term follow-up, the data demonstrated for the first time that higher RC levels were significantly associated with worse prognosis in DM and pre-DM patients with CAD, suggesting that RC might be a treatment target for patients with impaired glucose metabolism.
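
A stratified analysis of this kind amounts to entering one indicator per combined glucose-status/RC-level cell into the Cox model and reading off HRs against a reference cell. This is a hypothetical lifelines sketch on synthetic data (variable names and tertile cut are assumptions, not the paper's exact categorization):

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(4)
n = 4331
df = pd.DataFrame({
    "years": rng.exponential(5.1, n),
    "mace": rng.binomial(1, 0.125, n),
    "glucose": rng.choice(["NGR", "preDM", "DM"], n),
    "rlp_c": rng.lognormal(0, 0.5, n),
})
df["rc_level"] = pd.qcut(df["rlp_c"], 3, labels=["low", "mid", "high"])
df["group"] = df["glucose"] + "_" + df["rc_level"].astype(str)

# One indicator per glucose-status x RC-level cell; the first category in
# sorted order is absorbed as the reference
X = pd.get_dummies(df["group"], drop_first=True).astype(float)
X[["years", "mace"]] = df[["years", "mace"]]

cph = CoxPHFitter()
cph.fit(X, duration_col="years", event_col="mace")
print(cph.summary["exp(coef)"].round(2))   # HR for each cell vs. reference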


Circulation ◽  
2014 ◽  
Vol 130 (suppl_2) ◽  
Author(s):  
Tomonori Akasaka ◽  
Seiji Hokimoto ◽  
Noriaki Tabata ◽  
Kenji Sakamoto ◽  
Kenichi Tsujita ◽  
...  

Background: The 2011 ACCF/AHA/SCAI PCI guideline recommends that PCI be performed at hospitals with onsite cardiac surgery. However, recent data suggest that there is no significant difference in clinical outcomes following primary or elective PCI between hospitals with and without onsite cardiac surgery. PCI centers without onsite cardiac surgery comprise more than half of all PCI centers in Japan. We examined the impact of the presence or absence of onsite cardiac surgery on clinical outcomes following PCI for ACS. Methods: From August 2008 to March 2011, subjects (n=2288) were enrolled from the Kumamoto Intervention Conference Study (KICS), a multicenter registry enrolling consecutive patients undergoing PCI at 15 centers in Japan. Patients were assigned to two groups treated in hospitals with (n=1954) or without (n=334) onsite cardiac surgery. Clinical events were followed up for 12 months. The primary endpoint was a composite of in-hospital death, cardiovascular death, myocardial infarction, and stroke. We also monitored other events: non-cardiovascular death, bleeding complications, revascularization, and emergent CABG. Results: There was no overall significant difference in the primary endpoint between hospitals with and without onsite cardiac surgery (9.6% vs 9.5%; P=0.737). There was also no significant difference when the components of the primary endpoint were considered separately. Among the other events, only revascularization was seen more frequently in hospitals with onsite cardiac surgery (22.1% vs 12.9%; P<0.001). Kaplan-Meier analysis for the primary endpoint showed no significant difference between the two groups (log-rank P=0.943). In a Cox proportional hazards model analysis, absence of onsite cardiac surgery was not a predictive factor for the primary endpoint (HR 0.969, 95% CI 0.704–1.333; P=0.845). We performed a propensity score matching analysis to correct for the disparate patient numbers between the two groups, and there was again no significant difference in the primary endpoint (6.9% vs 8.0%; P=0.544). Conclusions: There is no significant difference in clinical outcomes following PCI for ACS between hospitals with and without onsite cardiac surgery backup in Japan.
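
The Kaplan-Meier curve and log-rank comparison reported above can be reproduced in outline with lifelines. This is a minimal sketch on synthetic two-group data (event probabilities loosely mirror the reported 9.6% vs 9.5%; everything else is invented):

```python
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(5)
t_surg = rng.exponential(120, 1954)        # months to event, with-surgery group
t_nosurg = rng.exponential(120, 334)       # without-surgery group
e_surg = rng.binomial(1, 0.096, 1954)      # 1 = primary endpoint observed
e_nosurg = rng.binomial(1, 0.095, 334)

kmf = KaplanMeierFitter()
kmf.fit(t_surg, e_surg, label="onsite surgery")
print(kmf.survival_function_.tail())

res = logrank_test(t_surg, t_nosurg,
                   event_observed_A=e_surg, event_observed_B=e_nosurg)
print(f"log-rank p = {res.p_value:.3f}")
```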


2021 ◽  
Vol 11 ◽  
Author(s):  
Jason C. Sanders ◽  
Donald A. Muller ◽  
Sunil W. Dutta ◽  
Taylor J. Corriher ◽  
Kari L. Ring ◽  
...  

Objectives: To investigate the safety and outcomes of elective para-aortic (PA) nodal irradiation utilizing modern treatment techniques for patients with node-positive cervical cancer. Methods: Patients with pelvic lymph node-positive cervical cancer who received radiation were included. All patients received radiation therapy (RT) to either a traditional pelvic field or an extended field to electively cover the PA nodes. Factors associated with survival were identified using a Cox proportional hazards model, and toxicities between groups were compared with a chi-square test. Results: 96 patients were identified, with a mean follow-up of 40 months. The incidence of acute grade ≥ 2 toxicity was 31% in the elective PA nodal RT group and 15% in the pelvic field group (chi-square p = 0.067). There was no significant difference in rates of grade ≥ 3 acute or late toxicities between the two groups (p > 0.05). The Kaplan-Meier estimated 5-year overall survival was not statistically different for those receiving elective PA nodal irradiation compared to a pelvic-only field, 54% vs. 73% respectively (log-rank p = 0.11). Conclusions: Elective PA nodal RT can be delivered safely utilizing modern planning techniques without a significant increase in severe (grade ≥ 3) acute or late toxicities, at the cost of a possible small increase in non-severe (grade 2) acute toxicities. In this series there was no survival benefit observed with receipt of elective PA nodal RT; however, this benefit may have been obscured by the higher-risk features of this population. Prospective randomized trials utilizing a risk-adapted approach to elective PA nodal coverage are the only way to fully evaluate its benefit, but such trials are unlikely to be performed; instead we must rely on interpreting the results of risk-adapted approaches like those used in ongoing clinical trials, together with retrospective data.
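
The toxicity comparison above is a chi-square test on a 2x2 table of counts. This is a short SciPy sketch with hypothetical counts chosen to mirror the reported 31% vs 15% rates (the abstract does not give the per-group denominators):

```python
from scipy.stats import chi2_contingency

# rows: extended field, pelvic field
# cols: grade >= 2 acute toxicity, no grade >= 2 acute toxicity
table = [[15, 33],   # 15/48 ~ 31%, hypothetical
         [ 7, 41]]   # 7/48  ~ 15%, hypothetical

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, p = {p:.3f}")
```

With counts this small, a borderline p-value like the reported 0.067 is sensitive to the continuity correction (the `correction` argument), which is one reason such results are best read as suggestive.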


2021 ◽  
Author(s):  
Miguel I. Paredes ◽  
Stephanie Lunn ◽  
Michael Famulare ◽  
Lauren A. Frisbie ◽  
Ian Painter ◽  
...  

Background: The COVID-19 pandemic is now dominated by variant lineages; the resulting impact on disease severity remains unclear. Using a retrospective cohort study, we assessed the risk of hospitalization following infection with nine variants of concern or interest (VOC/VOI). Methods: Our study includes individuals with a positive SARS-CoV-2 RT-PCR in the Washington Disease Reporting System and with available viral genome data, from December 1, 2020 to July 30, 2021. The main analysis was restricted to cases with specimens collected through sentinel surveillance. Using a Cox proportional hazards model with mixed effects, we estimated hazard ratios (HR) for the risk of hospitalization following infection with a VOC/VOI, adjusting for age, sex, and vaccination status. Findings: Of the 27,814 cases, 23,170 (83.3%) were sequenced through sentinel surveillance, of which 726 (3.1%) were hospitalized due to COVID-19. A higher hospitalization risk was found for infections with Gamma (HR 3.17, 95% CI 2.15–4.67), Beta (HR 2.97, 95% CI 1.65–5.35), Delta (HR 2.30, 95% CI 1.69–3.15), and Alpha (HR 1.59, 95% CI 1.26–1.99) compared to infections with an ancestral lineage. Following VOC infection, unvaccinated patients show a similarly elevated hospitalization risk, while vaccinated patients show no significant difference in risk, both when compared to unvaccinated, ancestral-lineage cases. Interpretation: Infection with a VOC results in a higher hospitalization risk, with vaccination attenuating that risk. Our findings support promoting hospital preparedness, vaccination, and robust genomic surveillance.
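
A Cox model with mixed effects (a frailty term for a grouping factor) is usually fit in R with coxme; Python's lifelines has no random-effects term, but a rough stand-in is a fixed-effects fit with cluster-robust standard errors. This is a hypothetical sketch under that substitution (the grouping column and covariates are invented; this is not the authors' model):

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(6)
n = 5000
df = pd.DataFrame({
    "days_to_hosp": rng.exponential(14, n),
    "hospitalized": rng.binomial(1, 0.031, n),
    "delta": rng.integers(0, 2, n),           # 1 = Delta vs ancestral lineage
    "age": rng.normal(45, 18, n),
    "vaccinated": rng.integers(0, 2, n),
    "county": rng.integers(0, 39, n),         # assumed grouping factor
})

cph = CoxPHFitter()
# cluster_col requests a sandwich (robust) variance clustered by county --
# an approximation to the random-effects structure, not an equivalent model
cph.fit(df, duration_col="days_to_hosp", event_col="hospitalized",
        cluster_col="county")
print(cph.summary[["exp(coef)", "p"]])
```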


2021 ◽  
Vol 39 (15_suppl) ◽  
pp. e18689-e18689
Author(s):  
Leah Wells ◽  
Michael Cerniglia ◽  
Audrey C. Jost ◽  
Gregory Joseph Britt

e18689 Background: While guidelines exist for the appropriate use of chemotherapy in the metastatic setting based on performance status, such recommendations are less readily available for immune checkpoint inhibitors (ICIs). We sought to determine whether there is a relationship between Eastern Cooperative Oncology Group (ECOG) performance status and outcomes on immunotherapy in patients treated for metastatic disease at our community-based oncology practice. Methods: 253 patients were identified as receiving nivolumab or pembrolizumab for stage IV malignancy at Cancer Centers of Colorado-SCL Health between June 2018 and November 2020. Patients initiated on therapy after May 2020 were excluded from analysis due to insufficient (less than 6 months) follow-up time. The remaining 183 patients were included in a retrospective cohort study comparing patients with ECOG 0, ECOG 1, and ECOG 2-4. Sex, age, type of cancer, and line of therapy were collected, and time on therapy was calculated. Best response to therapy was determined (disease control or progressive disease). These baseline factors and outcomes were compared using ANOVA for numeric variables and chi-square tests of association for categorical variables. Time from initiation of ICI to death or hospice was also investigated and compared using a log-rank test. In addition, a multivariate Cox proportional hazards model was developed for the outcome time to death/hospice, with ECOG status, age, gender, and line of therapy as predictors. Hazard ratios (HR) and 95% confidence intervals (CI) were estimated. Results: Of the 183 patients included in the analysis, 31.7% had an ECOG of 0, 48.6% an ECOG of 1, and 19.7% an ECOG of 2-4. Non-small cell lung cancer and melanoma represented the majority of patients in each group. Gender and line of therapy did not differ between groups. There was a significant difference in age (p = 0.02), with mean ages of 62, 66, and 70 in ECOG 0, 1, and 2-4, respectively. 54.6% of patients remained on therapy for at least 6 months (182 days), and there was no significant difference between groups in the ability to complete 6 months of therapy (p = 0.32). For ECOG 0, 1, and 2-4, disease control was achieved in 67.2%, 59.6%, and 41.7%, respectively (p = 0.048). Analysis of time to death/hospice with a log-rank test and Kaplan-Meier plot showed a significant difference between groups (p < 0.001). The multivariate Cox proportional hazards model revealed that patients with ECOG 0 had a significantly longer time to death/hospice compared to patients in both other groups, after controlling for age, gender, and line of therapy (ECOG 1 vs. 0: HR 2.5, CI 1.27-4.9; ECOG 2-4 vs. 0: HR 2.83, CI 1.31-6.13). Conclusions: In this single-institution retrospective study of patients receiving nivolumab or pembrolizumab for metastatic cancer, ECOG 0 was associated with disease control and increased time before death or transition to hospice.
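
A multivariate Cox model with a three-level ECOG factor, as described above, is typically set up by dummy-coding the factor against ECOG 0. This is a hypothetical lifelines sketch on synthetic data (column names and distributions are illustrative):

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(7)
n = 183
df = pd.DataFrame({
    "days": rng.exponential(300, n),
    "death_or_hospice": rng.integers(0, 2, n),
    "ecog": rng.choice(["0", "1", "2-4"], n, p=[0.32, 0.49, 0.19]),
    "age": rng.normal(66, 10, n),
    "male": rng.integers(0, 2, n),
    "line_of_therapy": rng.integers(1, 4, n),
})

# Dummy-code ECOG; dropping the first (sorted) level makes ECOG 0 the reference
X = pd.get_dummies(df, columns=["ecog"], drop_first=True).astype(float)

cph = CoxPHFitter()
cph.fit(X, duration_col="days", event_col="death_or_hospice")
print(cph.summary[["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%"]])
```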

