The Effect of Glycated Hemoglobin and Albumin-Corrected Glycated Serum Protein on Mortality in Diabetic Patients Receiving Continuous Peritoneal Dialysis

2015 ◽  
Vol 35 (5) ◽  
pp. 566-575 ◽  
Author(s):  
Fenfen Peng ◽  
Xi Xia ◽  
Feng He ◽  
Zhijian Li ◽  
Fengxian Huang ◽  
...  

Objective To explore the effect of glycated hemoglobin (HbA1c) and albumin-corrected glycated serum protein (Alb-GSP) on the mortality of diabetic patients receiving continuous peritoneal dialysis (PD). Methods In this single-center retrospective cohort study, incident diabetic PD patients from January 1, 2006, to December 31, 2010, were recruited and followed up until December 31, 2011. The effect of HbA1c and Alb-GSP on mortality was evaluated by Cox proportional hazards models. Results A total of 200 patients (60% male, mean age 60.3 ± 10.6 years) with a mean follow-up of 29.0 months (range: 4.3 - 71.5 months) were recruited. Sixty-four patients died during the follow-up period, of whom 21 died of cardiovascular disease (CVD). Mean values for HbA1c, GSP, and Alb-GSP were 6.7% (range: 4.1 - 12.5%), 202 μmol/L (range: 69 - 459 μmol/L), and 5.78 μmol/g (range: 2.16 - 14.98 μmol/g), respectively. The concentrations of GSP and Alb-GSP were closely correlated with HbA1c (r = 0.41, p < 0.001 and r = 0.45, p < 0.001, respectively). In multivariate Cox proportional hazards models, HbA1c ≥8% was associated with increased risk of all-cause mortality (hazard ratio [HR] = 2.29, 95% confidence interval [CI]: 1.06 - 4.96, p = 0.04), whereas no increased mortality was observed in patients with 6.0% ≤ HbA1c ≤ 7.9%. Patients with Alb-GSP ≤4.50 μmol/g had increased all-cause and non-cardiovascular mortality (HR = 2.42, 95% CI: 1.13 - 5.19, p = 0.02; and HR = 2.98, 95% CI: 1.05 - 8.48, p = 0.04, respectively). Conclusions Increased HbA1c and decreased Alb-GSP may be associated with poorer survival in diabetic PD patients; a non-significant trend toward poorer survival was also observed at the highest Alb-GSP levels.
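The albumin correction follows directly from the reported units (GSP in μmol/L, Alb-GSP in μmol/g): dividing GSP by serum albumin in g/L yields micromoles per gram of albumin. A minimal sketch of that calculation and of the HbA1c strata used in the analysis, with hypothetical function names:

```python
def alb_corrected_gsp(gsp_umol_per_l: float, albumin_g_per_l: float) -> float:
    """Albumin-corrected GSP (umol/g) = GSP (umol/L) / serum albumin (g/L)."""
    return gsp_umol_per_l / albumin_g_per_l


def hba1c_group(hba1c_pct: float) -> str:
    """Assign an HbA1c value (%) to the strata compared in the Cox models."""
    if hba1c_pct >= 8.0:
        return ">=8%"
    if hba1c_pct >= 6.0:
        return "6.0-7.9%"
    return "<6.0%"
```

For example, the cohort's mean GSP of 202 μmol/L with a serum albumin of 35 g/L would give an Alb-GSP of about 5.77 μmol/g, close to the reported cohort mean of 5.78 μmol/g.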

Stroke ◽  
2020 ◽  
Vol 51 (Suppl_1) ◽  
Author(s):  
Adam H de Havenon ◽  
Ka-Ho Wong ◽  
Eva Mistry ◽  
Mohammad Anadani ◽  
Shadi Yaghi ◽  
...  

Background: Increased blood pressure variability (BPV) has been associated with stroke risk, but never specifically in patients with diabetes. Methods: This is a secondary analysis of the Action to Control Cardiovascular Risk in Diabetes Follow-On Study (ACCORDION), the long-term follow-up extension of ACCORD. Visit-to-visit BPV was analyzed using all BP readings during the first 36 months. The primary outcome was incident ischemic or hemorrhagic stroke after 36 months. Differences in mean BPV were tested with Student's t-test. We fit Cox proportional hazards models to estimate the adjusted risk of stroke across the lowest vs. highest quintile of BPV and report hazard ratios along with 95% confidence intervals (CI). Results: Our analysis included 9,241 patients with a mean (SD) age of 62.7 (6.6) years; 61.7% were male. Mean (SD) follow-up was 5.7 (2.4) years, and the mean (SD) number of BP readings per patient was 12.0 (4.3). Systolic, but not diastolic, BPV was higher in patients who developed stroke (Table 1). The highest quintile of SBP SD was associated with increased risk of incident stroke, independent of mean blood pressure and other potential confounders (Table 2, Figure 1). There was no interaction between SBP SD and treatment arm assignment, although the interaction for glucose approached significance (Table 2). Conclusion: Higher systolic BPV was associated with incident stroke in a large cohort of diabetic patients. Future trials of stroke prevention may benefit from interventions targeting BPV reduction.
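Visit-to-visit BPV in analyses of this kind is typically the standard deviation (SD) of each patient's readings, with patients then ranked into cohort quintiles. A minimal sketch under that assumption (the study may also have examined other variability indices, such as coefficient of variation):

```python
from statistics import stdev


def visit_to_visit_bpv(sbp_readings):
    """Visit-to-visit variability: sample SD of one patient's systolic BP readings."""
    return stdev(sbp_readings)


def quintile_of(value, cohort_values):
    """1-based cohort quintile of `value` (5 = most variable fifth)."""
    below = sum(v < value for v in cohort_values)
    return min(5, below * 5 // len(cohort_values) + 1)
```

The Cox models would then compare, for example, quintile 5 against quintile 1 of SBP SD, adjusted for mean blood pressure and other confounders.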


2015 ◽  
Vol 40 (2) ◽  
pp. 160-166 ◽  
Author(s):  
Liping Xiong ◽  
Li Fan ◽  
Qingdong Xu ◽  
Qian Zhou ◽  
Huiyan Li ◽  
...  

Background: There are limited data regarding the relationship between transport status and mortality in anuric continuous ambulatory peritoneal dialysis (CAPD) patients. Methods: According to the dialysate-to-plasma creatinine ratio (D/P Cr), 292 anuric CAPD patients were stratified into faster (D/P Cr ≥0.65) and slower (D/P Cr <0.65) transport groups. Cox proportional hazards models were used to evaluate the association of transport status with mortality. Results: During a median follow-up of 22.1 months, 24% of patients died, 61.4% of them due to cardiovascular disease (CVD). Faster transport was associated with an increased risk of all-cause mortality (HR (95% CI) = 2.16 (1.09-4.26)), but not cardiovascular mortality, after adjustment for confounders. Faster transporters with pre-existing CVD had a greater risk of death than those without any history of CVD. Conclusion: Faster transport was independently associated with higher all-cause mortality in anuric CAPD patients. This association was strengthened in patients with pre-existing CVD.
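The transport stratification is a single threshold on the D/P Cr ratio; a one-line sketch with an assumed function name:

```python
def transport_group(d_to_p_cr: float) -> str:
    """Stratify peritoneal transport status by the D/P creatinine ratio."""
    return "faster" if d_to_p_cr >= 0.65 else "slower"
```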


Author(s):  
Ma Cherrysse Ulsa ◽  
Xi Zheng ◽  
Peng Li ◽  
Arlen Gaba ◽  
Patricia M Wong ◽  
...  

Abstract Background Delirium is a distressing neurocognitive disorder recently linked to sleep disturbances. However, the longitudinal relationship between sleep and delirium remains unclear. This study assessed the associations of poor sleep burden, and its trajectory, with delirium risk during hospitalization. Methods 321,818 participants from the UK Biobank (mean age 58±8y [SD]; range 37-74y) reported (2006-2010) sleep traits (sleep duration, excessive daytime sleepiness, insomnia-type complaints, napping, and chronotype, a closely related circadian measure of sleep timing), aggregated into a sleep burden score (0-9). New-onset delirium (n=4,775) was obtained from hospitalization records during a 12y median follow-up. 42,291 participants (mean age 64±8y; range 44-83y) had a repeat sleep assessment on average 8y after their first. Results In the baseline cohort, Cox proportional hazards models showed that the moderate (aggregate scores 4-5) and severe (scores 6-9) poor sleep burden groups were 18% (hazard ratio 1.18 [95% confidence interval 1.08-1.28], p<0.001) and 57% (1.57 [1.38-1.80], p<0.001) more likely to develop delirium, respectively. The latter risk magnitude is equivalent to two additional cardiovascular risks. These findings appeared robust when restricted to postoperative delirium and after exclusion of underlying dementia. Higher sleep burden was also associated with delirium in the follow-up cohort. Worsening sleep burden (score increase ≥2 vs. no change) further increased the risk for delirium (1.79 [1.23-2.62], p=0.002), independent of baseline sleep score and time lag. The risk was highest in those under 65y at baseline (p for interaction <0.001). Conclusion Poor sleep burden and a worsening trajectory were associated with increased risk for delirium; promotion of sleep health may be important for those at higher risk.
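The risk-group cutoffs reported above translate into a simple mapping on the aggregate score. A minimal sketch, assuming scores below 4 form the reference ("low") group; the per-trait weighting that builds the 0-9 score is defined in the paper and not reproduced here:

```python
def sleep_burden_group(score: int) -> str:
    """Map an aggregate sleep burden score (0-9) to the groups in the abstract."""
    if not 0 <= score <= 9:
        raise ValueError("sleep burden score must be between 0 and 9")
    if score >= 6:
        return "severe"
    if score >= 4:
        return "moderate"
    return "low"
```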


Author(s):  
Thomas J Littlejohns ◽  
Shabina Hayat ◽  
Robert Luben ◽  
Carol Brayne ◽  
Megan Conroy ◽  
...  

Abstract Visual impairment has emerged as a potential modifiable risk factor for dementia. However, there is a lack of large studies with objective measures of vision and with more than ten years of follow-up. We investigated whether visual impairment is associated with an increased risk of incident dementia in UK Biobank and EPIC-Norfolk. In both cohorts, visual acuity was measured using a “logarithm of the minimum angle of resolution” (LogMAR) chart and categorised as no (≤0.30 LogMAR), mild (>0.30 - ≤0.50 LogMAR), and moderate to severe (>0.50 LogMAR) impairment. Dementia was ascertained through linkage to electronic medical records. After restricting to those aged ≥60 years, without prevalent dementia and with eye measures available, the analytic samples consisted of 62,206 UK Biobank and 7,337 EPIC-Norfolk participants, respectively. In UK Biobank and EPIC-Norfolk, respectively, 1,113 and 517 participants developed dementia over 11 and 15 years of follow-up. Using multivariable Cox proportional-hazards models, the hazard ratios for mild and moderate to severe visual impairment were 1.26 (95% Confidence Interval [CI] 0.92-1.72) and 2.16 (95% CI 1.37-3.40) in UK Biobank, and 1.05 (95% CI 0.72-1.53) and 1.93 (95% CI 1.05-3.56) in EPIC-Norfolk, compared to no visual impairment. When excluding participants censored within 5 years of follow-up or with prevalent poor or fair self-reported health, the direction of the associations remained similar for moderate impairment but was not statistically significant. Our findings suggest visual impairment might be a promising target for dementia prevention; however, the possibility of reverse causation cannot be excluded.
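The LogMAR cutoffs translate directly into a three-level categorizer; a minimal sketch with an assumed function name:

```python
def visual_impairment_category(logmar: float) -> str:
    """Categorize visual acuity using the LogMAR cutoffs applied in both cohorts."""
    if logmar <= 0.30:
        return "none"
    if logmar <= 0.50:
        return "mild"
    return "moderate-to-severe"
```

Note that higher LogMAR means worse acuity, so the "moderate-to-severe" group (>0.50) has the poorest vision.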


EP Europace ◽  
2020 ◽  
Vol 22 (Supplement_1) ◽  
Author(s):  
P Krisai ◽  
O Streicher ◽  
P Meyre ◽  
P Haemmerle ◽  
F Steiner ◽  
...  

Abstract Background Atrial fibrillation (AF) is a common finding in patients undergoing cavotricuspid isthmus ablation for isthmus-dependent right atrial flutter (RAF). Little is known about the timing of its occurrence. Purpose We aimed to investigate the incidence of AF early after RAF ablation in a well-defined, prospective cohort. Methods A total of 255 participants with RAF ablation from 5 centers and at least one completed follow-up were included. Structured clinical follow-up was performed at 3, 6 and 12 months, including a 24-hour Holter-ECG. The endpoint was incidence of AF detected clinically or by Holter-ECG. Risk factors associated with the occurrence of AF were assessed using separate, univariate Cox proportional-hazards models. Results Mean age was 67 years, 80% were male, and previous episodes of AF were known in 40%. Over a mean follow-up of 7.4 (±4.4) months, AF was detected in 35 (13.7%) participants after RAF ablation (Figure A). After 3, 6 and 12 months, AF was detected in 18 (7.1%), 30 (11.7%) and 34 (13.3%) patients, respectively. No difference in the incidence of AF after RAF ablation was found comparing patients with and without a history of AF (log-rank p value = 0.44) (Figure B). Comparing patients with and without AF during follow-up, there was no difference in age (68 vs 66 years, p = 0.36), sex (69 vs 81% male, p = 0.08), prior heart failure (29 vs 19%, p = 0.20), hypertension (43 vs 38%, p = 0.56) or left atrial volume (46.6 vs 39.6 ml, p = 0.10), but patients with previous AF had a lower left ventricular ejection fraction (LVEF) (45.7 vs 52.3%, p = 0.02). In separate, univariate Cox proportional-hazards models, only increasing LVEF (hazard ratio 0.97, 95% confidence interval 0.95-0.99, p = 0.02) was associated with a lower risk of incident AF after RAF ablation; no other risk factor was. Conclusions AF occurred in 13.7% of patients early after cavotricuspid isthmus ablation for RAF. There was no difference in the occurrence of AF between patients with and without previously known episodes of AF. Only impaired LVEF was associated with AF occurrence.


2007 ◽  
Vol 25 (18_suppl) ◽  
pp. 1018-1018 ◽  
Author(s):  
M. C. Pinder ◽  
H. Chang ◽  
K. R. Broglio ◽  
L. B. Michaud ◽  
R. L. Theriault ◽  
...  

1018 Background: In the era of trastuzumab, HER2-positive breast cancer confers an increased risk of central nervous system (CNS) metastases. While several studies have examined CNS metastases in trastuzumab-treated patients, data are sparse regarding CNS metastases in trastuzumab-naïve HER2-positive patients. We evaluated time to CNS metastasis, death, and death subsequent to brain metastasis in relation to trastuzumab treatment. Methods: The study population included 750 patients diagnosed with HER2-positive metastatic breast cancer (HER2+ MBC) between June 1977 and January 2006. The associations between trastuzumab treatment and the outcomes of time to CNS metastasis and time to death following CNS metastasis were determined using Cox proportional hazards models that included trastuzumab treatment as a time-dependent covariate. Multivariable Cox proportional hazards models were fit to determine the association between trastuzumab treatment and outcomes after adjustment for known prognostic factors. Patients with HER2+ MBC treated at our institution before trastuzumab was available served as our control group. Results: Of the 750 patients included, 689 patients received trastuzumab during the follow-up period while 61 patients were not treated with trastuzumab. Median follow-up was 32 months. A total of 251 patients developed CNS metastases. After adjusting for other prognostic variables including age, ER status, PR status, pathological stage, and site of initial metastasis, patients who received trastuzumab had 2.84 times the risk of CNS metastases (95% CI = 1.87, 4.30; p < 0.0001) compared to patients who did not receive trastuzumab. Time to death following brain metastasis did not differ significantly between trastuzumab-treated and -untreated patients. Conclusions: In our large series, patients with HER2+ MBC treated with trastuzumab were at significantly increased risk of developing CNS metastases compared to patients who did not receive trastuzumab. 
This finding warrants further investigation into biological mechanisms that may account for this difference. No significant financial relationships to disclose.


2009 ◽  
Vol 27 (15_suppl) ◽  
pp. 9625-9625
Author(s):  
J. A. Berlin ◽  
P. J. Bowers ◽  
S. Rao ◽  
S. Sun ◽  
K. Liu ◽  
...  

9625 Background: When cancer patients (pts) with chemotherapy-induced anemia (CIA) respond to erythropoiesis-stimulating agents (ESAs), hemoglobin (Hb) typically increases within 4-8 weeks. This exploratory analysis examined whether mortality differs depending on Hb response after 4 or 8 weeks of epoetin alfa (EPO) treatment or depending on transfusion. Methods: Pt-level data were analyzed from 31 randomized studies (7,215 pts) of epoetin alfa vs non-EPO (15 studies) or placebo (16 studies) in pts with CIA. A landmark analysis was used; Hb change was assessed at a specific landmark time (4 or 8 weeks) and subsequent survival was examined separately for EPO and placebo. Pts were categorized as “Hb increased” (increase >0.5 g/dL), “Hb decreased” (decrease >0.5 g/dL), or “Hb stable” (within ±0.5 g/dL) compared to baseline. Hb stable was compared to the other Hb change categories with Cox proportional hazards models, stratified by study and adjusted for potential confounders. Results: The hazard ratio (HR) for Hb decreased versus Hb stable at 4 weeks was 1.44 for EPO (95% CI: 1.04, 1.99), indicating worse survival for pts with a decline in Hb. This association was weaker for placebo (HR: 1.12; 95% CI: 0.74, 1.67). The increased risk with declining Hb in EPO-treated pts was most pronounced in studies that maintained Hb ≥12 g/dL or treated pts for >12-16 weeks (1,876 pts). Patterns were similar using the 8-week landmark. In both EPO-treated and placebo pts, transfusion increased the rate of on-study death approximately 3.5-fold (treating transfusion as a time-dependent variable). Conclusions: These exploratory findings suggest that both decreased Hb after 4 or 8 weeks of EPO treatment and transfusion are associated with increased risk of death. In spite of adjustment for other prognostic factors, it is likely that this association reflects the poorer underlying prognosis of pts whose Hb fails to respond. ESAs should be discontinued in the absence of a Hb response. [Table: see text]
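A landmark analysis fixes each patient's exposure category at the landmark visit (here, Hb change at 4 or 8 weeks) and then compares survival only from that point forward, avoiding immortal-time bias. A minimal sketch of the classification step, with hypothetical names:

```python
def hb_change_category(baseline_hb: float, landmark_hb: float) -> str:
    """Classify Hb change at the landmark (4 or 8 weeks) relative to baseline.

    Categories follow the abstract: increase >0.5 g/dL, decrease >0.5 g/dL,
    or stable within +/-0.5 g/dL.
    """
    delta = landmark_hb - baseline_hb
    if delta > 0.5:
        return "Hb increased"
    if delta < -0.5:
        return "Hb decreased"
    return "Hb stable"
```

Patients who die or are censored before the landmark are excluded from that landmark's comparison, which is what makes the subsequent Cox models well-defined.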


2016 ◽  
Vol 43 (2) ◽  
pp. 104-111 ◽  
Author(s):  
Dandara N. Spigolon ◽  
Thyago P. de Moraes ◽  
Ana E. Figueiredo ◽  
Ana Paula Modesto ◽  
Pasqual Barretti ◽  
...  

Background: Structured pre-dialysis care is associated with an increase in peritoneal dialysis (PD) utilization, but its association with peritonitis risk, technique survival, and patient survival is not established. This study aimed at analyzing the impact of pre-dialysis care on these outcomes. Methods: All incident patients starting PD between 2004 and 2011 in a Brazilian prospective cohort were included in this analysis. Patients were divided into 2 groups: early pre-dialysis care (at least 90 days of follow-up by a nephrology team) and late pre-dialysis care (absent or less than 90 days of follow-up). The socio-demographic, clinical and biochemical characteristics of the 2 groups were compared. Risk factors for time to the first peritonitis episode, technique failure and mortality were analyzed using Cox proportional hazards models. Results: A total of 4,107 patients were included. Compared to late care, patients with early pre-dialysis care differed in gender (female: 47.0 vs. 51.1%, p = 0.01), race (white: 63.8 vs. 71.7%, p < 0.01), and education (<4 years: 61.9 vs. 71.0%, p < 0.01). Patients with early pre-dialysis care presented a higher prevalence of comorbidities; lower levels of creatinine, phosphorus, and glucose; and significantly better control of hemoglobin and potassium serum levels. There was no impact of pre-dialysis care on peritonitis rates (hazard ratio (HR) 0.88; 95% CI 0.77-1.01) or technique survival (HR 1.12; 95% CI 0.92-1.36). Patient survival (HR 1.20; 95% CI 1.03-1.41) was better in the early pre-dialysis care group. Conclusion: Earlier pre-dialysis care was associated with improved patient survival, but did not influence time to the first peritonitis or technique survival in this national PD cohort.


RMD Open ◽  
2018 ◽  
Vol 4 (2) ◽  
pp. e000670 ◽  
Author(s):  
Isabelle A Vallerand ◽  
Ryan T Lewinson ◽  
Alexandra D Frolkis ◽  
Mark W Lowerison ◽  
Gilaad G Kaplan ◽  
...  

Objectives Major depressive disorder (MDD) is associated with increased levels of systemic proinflammatory cytokines, including tumour necrosis factor alpha. As these cytokines are pathogenic in autoimmune diseases such as rheumatoid arthritis (RA), our aim was to explore, on a population level, whether MDD increases the risk of developing RA. Methods A retrospective cohort study was conducted using The Health Improvement Network (THIN) database (from 1986 to 2012). Observation time was recorded for both the MDD and referent cohorts until patients developed RA or were censored. Cox proportional hazards models were used to determine the risk of developing RA among patients with MDD, accounting for age, sex, medical comorbidities, smoking, body mass index and antidepressant use. Results A cohort of 403 932 patients with MDD and a referent cohort of 5 339 399 patients without MDD were identified in THIN. Cox proportional hazards models revealed a 31% increased risk of developing RA among those with MDD in an unadjusted model (HR=1.31, 95% CI 1.25 to 1.36, p<0.0001). When adjusting for all covariates, the risk remained significantly increased among those with MDD (HR=1.38, 95% CI 1.31 to 1.46, p<0.0001). Antidepressant use demonstrated a confounding effect that was protective on the association between MDD and RA. Conclusion MDD increased the risk of developing RA by 38%, and antidepressants may decrease this risk in these patients. Future research is necessary to confirm the underlying mechanism of MDD on the pathogenesis of RA.


2017 ◽  
Vol 45 (1-3) ◽  
pp. 28-35
Author(s):  
Rong Rong ◽  
Qian Zhou ◽  
Jianxiong Lin ◽  
Naya Huang ◽  
Wei Li ◽  
...  

Background: The association between folic acid (FA) supplementation and mortality in continuous ambulatory peritoneal dialysis (CAPD) patients is unclear. Methods: FA exposure was calculated as the percentage of cumulative duration of drug usage relative to total follow-up duration (FA%). A total of 1,358 patients were classified by a cutoff value of FA%. The association of FA with mortality was evaluated using Cox proportional hazards models. Results: The cutoff value of FA% for predicting mortality was 34% at a median follow-up of 40.7 months. FA ≥34% was associated with decreased risk of all-cause mortality (adjusted hazard ratio [HR] 0.64, 95% CI 0.48-0.85) and cardiovascular mortality (adjusted HR 0.67, 95% CI 0.47-0.97). Moreover, the adjusted HRs per 10% higher FA% for all-cause and cardiovascular mortality were 0.925 (95% CI 0.879-0.973) and 0.926 (95% CI 0.869-0.988), respectively. Conclusions: A longer period of FA supplementation was associated with reduced risk of both all-cause and cardiovascular mortality in CAPD patients.
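FA% as defined in the methods is simply cumulative time on folic acid as a share of total follow-up, dichotomized at the 34% cutoff. A minimal sketch with assumed function names:

```python
def fa_exposure_pct(days_on_fa: float, total_followup_days: float) -> float:
    """FA%: cumulative duration of folic acid use as a percentage of follow-up."""
    if total_followup_days <= 0:
        raise ValueError("total follow-up duration must be positive")
    return 100.0 * days_on_fa / total_followup_days


def fa_group(fa_pct: float) -> str:
    """Dichotomize FA% at the mortality-predicting cutoff of 34%."""
    return "FA >=34%" if fa_pct >= 34.0 else "FA <34%"
```

For example, a patient on folic acid for 170 of 500 follow-up days has FA% = 34 and falls in the higher-exposure (lower-risk) group.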

