MO547 ASSOCIATION BETWEEN SERUM INDICES OF IRON METABOLISM AND CARDIOVASCULAR MORBIDITY IN PATIENTS WITH PREDIALYSIS CHRONIC KIDNEY DISEASE

2021, Vol 36 (Supplement_1)
Author(s): Takeshi Hasegawa, Takahiro Imaizumi, Kenta Murotani, Takayuki Hamano, Masafumi Fukagawa

Abstract Background and Aims Patients with predialysis chronic kidney disease (CKD) have a greater risk of cardiovascular disease (CVD) events than the general population. Anaemia is the most frequent comorbidity in predialysis CKD patients and is associated with an increase in CVD events. Iron deficiency is the most frequent cause of anaemia resistant to erythropoiesis-stimulating agents (ESAs) in CKD patients and is modifiable by therapeutic intervention. However, the optimal ranges of iron markers in predialysis CKD patients are uncertain. Therefore, we aimed to investigate the association between serum indices of iron metabolism and the incidence of CVD events in patients with predialysis CKD using data from the CKD-Japan Cohort (CKD-JAC). Method We prospectively followed 1550 CKD patients aged 20-75 years with an estimated glomerular filtration rate (eGFR) <60 mL/min/1.73 m2 for a mean of 4.21 years. Serum transferrin saturation (TSAT) and ferritin levels were the main exposures. The main outcome measure was any CVD event, including fatal or non-fatal myocardial infarction, congestive heart failure (CHF), angina pectoris, arrhythmia, aortic dissection, cerebrovascular disorder, and peripheral artery disease, identified at each facility and adjudicated by an independent cardiac function evaluation committee. Multivariable Cox proportional hazards regression models were used to examine the association of serum TSAT or ferritin levels with time to event. Death was treated as a competing risk using the Fine and Gray model. All models were stratified by facility and adjusted for the following potential confounders: age, sex, systolic blood pressure, diabetes mellitus, history of CHF, haemoglobin, serum calcium, serum phosphorus, intact parathyroid hormone, eGFR, proteinuria, ESA use, iron supplementation, renin-angiotensin system inhibitors, and beta-blockers.
We also applied the multivariable fractional polynomial interaction (MFPI) approach to investigate whether TSAT levels modify the association between iron supplementation and the outcomes. Results In the overall cohort, 208 (13.4%) patients developed CVD events (including 97 CHF events) during the follow-up period (26.6 events/1000 person-years). The incidence rate of CVD events was highest in the TSAT <20% category (33.0 events/1000 person-years). Compared with patients in the TSAT >40% category, those in the TSAT <20% category had an increased risk of CVD events (adjusted hazard ratio (AHR): 1.86, 95% confidence interval (CI): 1.06-3.26) and of CHF events (AHR: 2.82, 95% CI: 1.15-6.89). Meanwhile, there was no association between serum ferritin levels and the risk of developing CVD or CHF events. MFPI analyses showed a reduced risk of CVD with iron supplementation only in patients with TSAT <20% (P for interaction=0.02). Conclusion Maintaining TSAT >20% could be effective in reducing the risk of CVD events (especially CHF) in patients with predialysis CKD. Our analyses also suggest that iron-deficient patients with predialysis CKD may benefit from iron supplementation through a reduced risk of CVD events.
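The competing-risk handling described in the Method section (CVD events with death as a competing risk) can be illustrated with a minimal nonparametric cumulative incidence estimator. This is an Aalen-Johansen-style sketch on invented toy data, not the study's Fine and Gray regression; the function name and the event coding (0 = censored, 1 = event of interest, 2 = competing death) are illustrative choices.

```python
from itertools import groupby

def cumulative_incidence(times, events):
    """Cumulative incidence of event type 1, treating event type 2
    (e.g. death) as a competing risk.
    events: 0 = censored, 1 = event of interest, 2 = competing event."""
    data = sorted(zip(times, events))
    n = len(data)
    surv = 1.0   # all-cause event-free survival just before time t
    cif = 0.0    # cumulative incidence of the event of interest
    curve = []
    seen = 0
    for t, grp in groupby(data, key=lambda x: x[0]):
        grp = list(grp)
        at_risk = n - seen
        d_interest = sum(1 for _, e in grp if e == 1)
        d_any = sum(1 for _, e in grp if e != 0)
        cif += surv * d_interest / at_risk   # probability mass moved to event 1
        surv *= 1 - d_any / at_risk          # survival drops for any event
        seen += len(grp)
        curve.append((t, cif))
    return curve

# Toy follow-up: CVD event at t=1 and t=3, death (competing) at t=2,
# censored at t=4.
curve = cumulative_incidence([1, 2, 3, 4], [1, 2, 1, 0])
```

Unlike naively censoring deaths (which overstates event risk), the competing death permanently removes its share of probability mass, so the cumulative incidence here plateaus below 1 - (Kaplan-Meier survival).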

2019, Vol 12 (1), pp. 77-82
Author(s): Ewa Kwiatkowska, Martyna Opara, Sebastian Kwiatkowski, Leszek Domański, Małgorzata Marchelek-Myśliwiec, ...

Background: According to the currently applicable KDIGO 2012 and ERBP 2013 guidelines, iron metabolism in patients with Chronic Kidney Disease (CKD) is assessed using parameters such as ferritin concentration and Transferrin Saturation (TSAT), and decisions on iron substitution are based on their values. Patients with Stage 5 CKD on maintenance hemodialysis commonly suffer from malnutrition and inflammation, one marker of which is a low transferrin concentration. Our study aimed to establish what percentage of patients this applied to and whether the transferrin saturation figure was artificially inflated in such cases. Materials and Methods: The study group included 66 patients with Stage 5 CKD on maintenance hemodialysis. The data analyzed included complete blood count, iron and ferritin concentrations, and Transferrin Saturation (TSAT), as well as age, sex, time since first hemodialysis, and dialysis quality over the preceding six months (average Kt/V). Results: Only 12% of the study group had transferrin concentrations above the lower limit of normal. TSAT correlated negatively with transferrin concentration. Transferrin concentration correlated negatively with time since first hemodialysis and with ferritin concentration, and positively with body weight. Normal transferrin concentrations were seen only in patients with ferritin concentrations of up to 400 μg/L. The group was divided by transferrin concentration (<1.5 g/L vs >1.5 g/L); these subgroups differed significantly in ferritin concentration and transferrin saturation (p = 0.0005 and p = 0.004, respectively). The 1.5 g/L transferrin cut-off separates mild from moderate malnutrition; it is also the minimum transferrin concentration necessary to achieve hemoglobin values ≥10 g/dL, as determined by ROC curve analysis.
Conclusion: Low transferrin concentrations cause abnormally high TSAT values. In most patients on maintenance hemodialysis, this marker is therefore not useful for assessing the availability of iron for erythropoiesis.
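TSAT is computed from serum iron and total iron-binding capacity (TIBC), and TIBC is roughly proportional to transferrin, which is why a low transferrin mechanically inflates TSAT at a fixed serum iron. A small sketch; the x1.25 TIBC-from-transferrin conversion is a common approximation rather than this study's assay, and the example values are invented:

```python
def tsat_percent(serum_iron_ug_dl, transferrin_mg_dl):
    """Transferrin saturation (%) from serum iron (ug/dL) and transferrin (mg/dL).
    TIBC is approximated as transferrin (mg/dL) * 1.25 (ug iron/dL)."""
    tibc_ug_dl = transferrin_mg_dl * 1.25
    return 100.0 * serum_iron_ug_dl / tibc_ug_dl

# Same serum iron, different transferrin (note 1 g/L = 100 mg/dL):
normal_tf = tsat_percent(70, 250)  # transferrin 2.5 g/L -> TSAT ~22%
low_tf = tsat_percent(70, 130)     # transferrin 1.3 g/L -> TSAT ~43%
```

With transferrin below the 1.5 g/L cut-off discussed above, the same serum iron yields a TSAT roughly twice as high, which is exactly the inflation the authors describe.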


2020, Vol 21 (1)
Author(s): Suneela Zaigham, Anders Christensson, Per Wollmer, Gunnar Engström

Abstract Background Although the prevalence of kidney disease is higher in those with reduced lung function, the longitudinal relationship between low lung function and future risk of chronic kidney disease (CKD) has not been widely explored. Methods Baseline lung function was assessed in 20,700 men and 7325 women from 1974 to 1992. Mean age was 43.4 (±6.6) years for men and 47.5 (±7.9) years for women. Sex-specific quartiles of FEV1 and FVC (L) were created (Q4: highest, reference), and the cohort was also divided by the FEV1/FVC ratio (≥0.70 or <0.70). Cox proportional hazards regression was used to determine the risk of incident CKD events (inpatient or outpatient hospital diagnosis of CKD) in relation to baseline lung function after adjustment for various confounding factors. Results Over 41 years of follow-up there were 710 and 165 incident CKD events (main diagnosis) in men and women, respectively. Low FEV1 was strongly associated with future risk of CKD in men (Q1 vs Q4 adjusted HR: 1.46 [CI: 1.14-1.89], p-trend 0.002). Similar findings were observed for FVC in men (1.51 [CI: 1.16-1.95], p-trend 0.001). The adjusted risks were not significant in women, for either FEV1 or FVC. FEV1/FVC <0.70 was not associated with increased incidence of CKD in men or women. Conclusion Low FEV1 and FVC levels at baseline are a risk factor for future incident CKD in men. Monitoring kidney function in those with reduced vital capacity in early life could help identify those at increased risk of future CKD.
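The sex-specific quartile exposure in the Methods amounts to a simple percentile split computed separately within men and women; the sketch below uses an approximate cut rule and invented FEV1 values (in mL, to keep the arithmetic exact), not the study's exact procedure:

```python
def quartile_cutpoints(values):
    """Approximate 25th/50th/75th percentile boundaries of a sample."""
    s = sorted(values)
    n = len(s)
    return [s[n // 4], s[n // 2], s[(3 * n) // 4]]

def assign_quartile(value, cutpoints):
    """Q1..Q4 membership: count how many boundaries the value reaches."""
    return 1 + sum(value >= c for c in cutpoints)

# Sex-specific: compute cutpoints within men and women separately, then
# model Q1..Q3 against the Q4 (highest lung function) reference group.
fev1_men_ml = [2000 + 20 * i for i in range(100)]  # invented FEV1 values (mL)
cuts = quartile_cutpoints(fev1_men_ml)
q_low = assign_quartile(2100, cuts)   # a low-FEV1 man lands in Q1
q_high = assign_quartile(3600, cuts)  # a high-FEV1 man lands in Q4
```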


2020, Vol 35 (Supplement_3)
Author(s): Xin Li, Kristin Danielson, Innas Forsal, Ken Iseri, Lu Dai, ...

Abstract Background and Aims Transferrin saturation (TSAT) is an indicator of iron deficiency or overload, but its relationship with mortality in patients at different stages of chronic kidney disease (CKD) is unclear. We investigated the association of TSAT with mortality in CKD patients. Method In 479 CKD patients (97 CKD3-4 patients, 298 CKD5 non-dialysis patients and 84 peritoneal dialysis patients; median age 58 years, 67% males, 33% cardiovascular disease (CVD), 29% diabetes), biomarkers of iron status (plasma iron, TSAT, transferrin and ferritin), systemic inflammation (high-sensitivity C-reactive protein, hsCRP, and interleukin-6, IL-6) and nutritional status were assessed. During a median follow-up of 35.6 months, 139 (29%) patients died and 176 (37%) underwent renal transplantation. Patients were stratified into low (n=157) and middle/high (n=322) TSAT tertile groups. All-cause and CVD mortality risks were analyzed with competing risk regression, with renal transplantation as the competing risk. Results TSAT [median 23% (IQR 17-30%)] was negatively associated with the presence of diabetes and CVD, body mass index, hsCRP, IL-6, Framingham CVD risk score (FRS), erythropoietin resistance index (ERI) and iron supplementation, and positively associated with hemoglobin, ferritin and s-albumin. In competing risk analysis, the low TSAT tertile was independently associated with increased all-cause mortality risk (sHR=1.50, 95% CI 1.05-2.14) after adjusting for CKD stage, 1-SD of FRS, 1-SD of hemoglobin, 1-SD of hsCRP, 1-SD of ESA dose and iron supplementation (Figure 1). Conclusion TSAT was inversely associated with mortality risk in CKD patients. Iron status, with TSAT as a predictive marker, should be considered when evaluating clinical outcomes of CKD patients.


2020, Vol 35 (Supplement_3)
Author(s): Pablo Díez, Andres Fernández Ramos, Patricia Muñoz Ramos, Marta Sanz Sainz, Begoña Santos Sánchez-Rey, ...

Abstract Background and Aims Acute Kidney Injury (AKI) is one of the most frequent causes of hospitalization, and many factors have been associated with its prognosis and recovery. The role of iron in AKI pathophysiology and its influence on outcomes is not well known. Recent studies have shown that elevated levels of catalytic iron are associated with higher mortality in patients with AKI; however, catalytic iron is not available in usual clinical practice. Ferritin, especially abundant in the liver, is the primary intracellular iron storage protein. A small amount is secreted into the circulation and is an indirect marker of total body iron deposits. In this study we analyzed the influence of iron status, assessed by ferritin values, on the prognosis of AKI. Method We performed a retrospective, single-center study that enrolled patients with AKI hospitalized in our center between 2013 and 2014 with iron metabolism values available within the first 72 hours after admission. At baseline, we collected demographic information, comorbidities, reason for admission and iron metabolism values (ferritin, transferrin, transferrin saturation index and serum iron). We analyzed variables associated with low and high ferritin values and their impact on long-term AKI prognosis using univariate and multivariable Cox regression. Results Of the 1731 analyzed patients, 833 (48.1%) had ferritin records. The mean age was 78±14 years and 48% of the patients were women. The most frequent comorbidity was hypertension (76%), followed by chronic kidney disease (46%), dyslipidemia (44%), heart failure (31%), diabetes mellitus (29%) and atrial fibrillation (27%). The most frequent reason for admission was infection (35%), followed by AKI (19%). Ferritin values differed significantly according to age (p<0.0001), sex (p=0.024), diabetes (p=0.012), hypertension (p=0.002), neoplasia (p=0.016), reason for admission (p=0.018), baseline CKD-EPI (p=0.012) and lactate at admission (p<0.0001). During the hospitalization, 165 (20%) patients died.
Factors associated with mortality were ferritin >500 ng/mL (p=0.013), lactate at admission (p<0.001), age (p=0.045), hypertension (p=0.014), dyslipidemia (p<0.001), ischemic heart disease (p=0.006), chronic kidney disease (p=0.001), baseline CKD-EPI (p=0.01), atrial fibrillation (p=0.005), neoplasia (p=0.023), Barthel index (p<0.001), and hemoglobin (p=0.006) and bicarbonate (p=0.012) at admission. Multivariable Cox regression demonstrated that a ferritin level over 500 ng/mL was an independent predictor of mortality (HR 1.6 [95% CI 1.1-2.3], p=0.013). Conclusion Ferritin values higher than 500 ng/mL independently predict mortality in patients admitted with AKI.


Author(s): Carl P Walther, Wolfgang C Winkelmayer, Peter A Richardson, Salim S Virani, Sankar D Navaneethan

Abstract Background Treatment with renin-angiotensin system inhibitors (RASIs), i.e. angiotensin-converting enzyme inhibitors (ACEIs) and angiotensin II receptor blockers (ARBs), is the standard of care for those with chronic kidney disease (CKD) and albuminuria. However, ACEI/ARB treatment is often discontinued for various reasons. We investigated the association of ACEI/ARB discontinuation with outcomes among US veterans with non-dialysis-dependent CKD. Methods We performed a retrospective cohort study of patients in the Veterans Affairs healthcare system with non-dialysis-dependent CKD who were subsequently started on ACEI/ARB therapy (new-user design). Discontinuation events were defined as a gap in ACEI/ARB therapy of ≥14 days and were classified further by duration (14-30, 31-60, 61-90, 91-180 and >180 days). Discontinuation was treated as a time-varying risk factor in Cox proportional hazards models for the outcomes of death and incident end-stage kidney disease (ESKD), adjusted for relevant confounders. Results We identified 141 252 people with CKD and incident ACEI/ARB use who met the inclusion criteria; they were followed for a mean of 4.87 years. There were 135 356 discontinuation events, 68 699 deaths and 6152 incident ESKD events. Discontinuation of ACEI/ARB was associated with a higher risk of death [hazard ratio (HR) 2.3, 2.0, 1.99, 1.92 and 1.74 for those discontinued for 14-30, 31-60, 61-90, 91-180 and >180 days, respectively]. Similar associations were noted between ACEI/ARB discontinuation and ESKD (HR 1.64, 1.47, 1.54, 1.65 and 1.59 for the same duration categories, respectively). Conclusions In a cohort of predominantly male veterans with CKD Stages 3 and 4, ACEI/ARB discontinuation was independently associated with an increased risk of subsequent death and ESKD. This may be due to severity-of-illness factors that drive the decision to discontinue therapy.
Further investigations to determine the causes of discontinuation and to provide an evidence base for discontinuation decisions are needed.
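Treating discontinuation as a time-varying exposure typically means splitting each patient's follow-up into counting-process (start, stop, on-therapy) rows before fitting the Cox model. A hedged sketch of that data preparation; the function name and interval conventions are assumptions for illustration, not the authors' code:

```python
def counting_process_rows(follow_up_end, therapy_periods):
    """Split [0, follow_up_end) into (start, stop, on_therapy) rows,
    given a sorted list of non-overlapping on-therapy (start, stop) intervals."""
    rows = []
    cursor = 0
    for start, stop in therapy_periods:
        if cursor >= follow_up_end:
            break
        if start > cursor:                       # off-therapy gap
            rows.append((cursor, min(start, follow_up_end), 0))
            cursor = min(start, follow_up_end)
        if cursor < follow_up_end and stop > cursor:
            rows.append((cursor, min(stop, follow_up_end), 1))
            cursor = min(stop, follow_up_end)
    if cursor < follow_up_end:                   # trailing off-therapy time
        rows.append((cursor, follow_up_end, 0))
    return rows

# 100 days of follow-up: on therapy days 0-30, then a 14-day gap (a
# discontinuation event under the >=14-day definition above), then back
# on therapy for days 44-80.
rows = counting_process_rows(100, [(0, 30), (44, 80)])
```

Each row then enters the Cox partial likelihood only for the interval it covers, so the hazard ratio compares on-therapy and off-therapy person-time within the same patients.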


2021, Vol 22 (1)
Author(s): Ensieh Memarian, Peter M. Nilsson, Isac Zia, Anders Christensson, Gunnar Engström

Abstract Background It has been shown that individuals with obesity have a higher risk of chronic kidney disease (CKD). However, it is unclear which measure of obesity is most useful for predicting CKD in the general population. The aim of this large prospective study was to explore the association between several anthropometric measures of obesity, i.e., body mass index (BMI), waist circumference (WC), waist-to-height ratio (WHtR), waist-to-hip ratio (WHR), percentage of body fat (BF%), weight and height, and the incidence of hospitalizations due to CKD, in a population-based cohort study. Methods The Malmö Diet and Cancer Study (MDCS) cohort in Sweden was examined during 1991 to 1996. A total of 28,449 subjects underwent measurement of anthropometric measures and blood pressure and filled out a questionnaire. Incidence of in- and outpatient hospital visits for CKD was monitored from the baseline examination over a mean follow-up of 18 years. Cox proportional hazards regression was used to explore the association between anthropometric measures and incidence of CKD, with adjustment for risk factors. Results The final study population included 26,723 subjects, 45-73 years old at baseline. Higher values of BMI, WC, WHR, WHtR and weight were associated with an increased risk of developing CKD in both men and women; higher BF% was associated with higher CKD risk in women only. Comparing the 4th vs 1st quartile of each obesity measure, the highest hazard ratios (HR) for CKD in men were observed for BMI (HR 1.51, 95% CI: 1.18-1.94) and weight (HR 1.52, 95% CI: 1.19-1.94). For women, the highest HR for CKD was observed for BF% (HR 2.01, 95% CI: 1.45-2.78). Conclusions In this large prospective study, all anthropometric measures of obesity were associated with a substantially increased incidence of CKD, except for BF% in men. Some measures, such as BMI and weight in men and BF% in women, were slightly more predictive of CKD risk than others.
In daily clinical practice, all anthropometric measures of obesity may be about equally useful for assessing the risk of developing CKD. This study adds to the strong evidence for an association between obesity and CKD.
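The anthropometric indices compared above are simple ratios of routine measurements; a minimal sketch with standard definitions (the function names and the example subject are invented):

```python
def bmi(weight_kg, height_m):
    """Body mass index, kg/m^2."""
    return weight_kg / height_m ** 2

def waist_to_height_ratio(waist_cm, height_cm):
    """WHtR: waist circumference divided by height (same units)."""
    return waist_cm / height_cm

def waist_to_hip_ratio(waist_cm, hip_cm):
    """WHR: waist circumference divided by hip circumference."""
    return waist_cm / hip_cm

# One invented subject: 81 kg, 1.80 m, waist 90 cm, hip 100 cm.
subject_bmi = bmi(81.0, 1.80)
subject_whtr = waist_to_height_ratio(90, 180)
subject_whr = waist_to_hip_ratio(90, 100)
```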


2021, Vol 11
Author(s): Chen-Yi Liao, Chi-Hsiang Chung, Kuo-Cheng Lu, Cheng-Yi Cheng, Sung-Sen Yang, ...

Background: Sleep disorders have been associated with chronic kidney disease (CKD); however, the correlation between sleeping pill use and CKD has not yet been investigated in depth. This study elucidated the potential association of sleeping pill use with the risk of CKD and of CKD progression to end-stage renal disease (ESRD) requiring dialysis. Methods: This study was based on a population-based cohort that included 209,755 sleeping pill users among 989,753 individuals. After applying the exclusion criteria, 186,654 sleeping pill users and 373,308 nonusers were enrolled to monitor the occurrence of CKD. Using cumulative daily dose, we analyzed the types of sleeping pills related to the risk of CKD and ESRD. Propensity score matching and Cox proportional hazards regression were performed with adjustment for sex, age, and comorbidities. Results: Sleeping pill use was related to increased CKD risk after adjusting for underlying comorbidities (adjusted hazard ratio [aHR] = 1.806, 95% confidence interval [CI]: 1.617-2.105, p < 0.001). With the exception of hyperlipidemia, most comorbidities correlated with an increased risk of CKD. Persistent use of sleeping pills after CKD diagnosis increased the risk of concurrent ESRD (aHR = 7.542; 95% CI: 4.267-10.156; p < 0.001). In the subgroup analysis by sleeping pill type, brotizolam (p = 0.046), chlordiazepoxide (p < 0.001), clonazepam (p < 0.001), diazepam (p < 0.001), dormicum (p < 0.001), estazolam (p < 0.001), fludiazepam (p < 0.001), flunitrazepam (p < 0.001), nitrazepam (p < 0.001), trazodone (p < 0.001), zolpidem (p < 0.001) and zopiclone (p < 0.001) showed significant correlations with increased CKD risk. Conclusion: Sleeping pill use was related to an increased risk of CKD and ESRD. Further studies are necessary to corroborate these findings.
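The 1:2 propensity score matching implied by the cohort sizes (186,654 users vs 373,308 nonusers) can be sketched as greedy nearest-neighbor matching without replacement on the estimated score. The function name, caliper value, and toy scores below are assumptions for illustration, not the study's actual procedure:

```python
def greedy_match(treated_scores, control_scores, ratio=2, caliper=0.05):
    """Greedy 1:ratio nearest-neighbor propensity score matching
    without replacement. Returns {treated_index: [control_indices]}."""
    pool = list(enumerate(control_scores))   # (index, score); shrinks as used
    matches = {}
    for t_idx, t_ps in enumerate(treated_scores):
        chosen = []
        for _ in range(ratio):
            if not pool:
                break
            k = min(range(len(pool)), key=lambda j: abs(pool[j][1] - t_ps))
            c_idx, c_ps = pool[k]
            if abs(c_ps - t_ps) > caliper:   # nearest control is too far away
                break
            chosen.append(c_idx)
            pool.pop(k)
        matches[t_idx] = chosen
    return matches

# One treated subject (score 0.30) matched to its two nearest controls.
m = greedy_match([0.30], [0.10, 0.29, 0.31, 0.80], ratio=2)
```

After matching, the Cox model is fit on the matched sample, so users and nonusers being compared have similar baseline covariate profiles.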


2020
Author(s): Xin Li, Kristin Danielson, Innas Forsal, Ken Iseri, Lu Dai, ...

Abstract Background: Transferrin saturation (TSAT) is an indicator of iron deficiency or overload, but its relationship with mortality in patients at different stages of chronic kidney disease (CKD) is unclear. We investigated the association of TSAT with mortality in CKD patients. Methods: In 479 CKD patients (97 CKD3-4 patients, 298 CKD5 non-dialysis patients and 84 peritoneal dialysis patients; median age 58 years, 67% males, 33% cardiovascular disease (CVD), 29% diabetes), biomarkers of iron status (plasma iron, TSAT, transferrin and ferritin), systemic inflammation (high-sensitivity C-reactive protein, hsCRP, and interleukin-6, IL-6) and nutritional status were assessed. During a median follow-up of 35.6 months, 139 (29%) patients died and 176 (37%) underwent renal transplantation. Patients were stratified into low (n=157) and middle/high (n=322) TSAT tertile groups. All-cause and CVD mortality risks were analyzed by competing risk regression with renal transplantation as the competing risk. Results: TSAT (median 23%; interquartile range 17-30%) was negatively associated with the presence of diabetes and CVD, body mass index, hsCRP, IL-6, erythropoiesis-stimulating agent (ESA) dose, erythropoietin resistance index (ERI) and iron supplementation, and positively associated with hemoglobin, ferritin and s-albumin. In competing risk analysis, the low TSAT tertile was independently associated with increased all-cause mortality risk (sHR=1.74, 95% CI 1.30-2.54) and CVD mortality risk (sHR=1.80, 95% CI 1.02-3.16) after full adjustment for age, sex, CKD stage, 1-standard deviation (SD) of hemoglobin, 1-SD of ferritin, 1-SD of hsCRP, 1-SD of ESA dose and iron supplementation. Conclusions: Lower TSAT, indicating iron deficiency, was independently associated with increased mortality risk in CKD patients, underlining that iron status should be considered when evaluating clinical outcomes of CKD patients.


2020, Vol 35 (Supplement_3)
Author(s): Pablo Molina, Mariola D Molina, Luis M Pallardó, Ramón López-Menchero, Francisco Javier Torralba, ...

Abstract Background and Aims Abnormalities of bone mineral parameters are associated with increased mortality in patients on dialysis, but their effects and the optimal ranges of these biomarkers are less well characterized in non-dialysis chronic kidney disease (CKD). Method PECERA (Collaborative Study Project in Patients with Advanced Chronic Kidney Disease) is a 3-year, multicentre, open-cohort, prospective study carried out in 995 adult patients with CKD stages 4-5 not on dialysis, enrolled in 2007-09 from 12 centres in Spain. Associations between levels of serum calcium (corrected for serum albumin), phosphate and intact parathyroid hormone (iPTH) and all-cause mortality were examined using time-dependent Cox proportional hazards models and penalized spline analysis, adjusted for demographics, comorbidities, treatments and biochemical parameters collected at baseline and every 6 months for 3 years. Results After a median follow-up of 30 months (IQR: 14-37 months) there were 180 deaths (18%). The associations of calcium and phosphate with all-cause mortality were U-shaped (Figure). The serum values associated with the minimum mortality risk were 9.35 mg/dL for calcium and 3.56 mg/dL for phosphate, with the lowest-risk ranges being 7.4-10.7 mg/dL for calcium and 2.3-4.6 mg/dL for phosphate. For iPTH, the association was J-shaped, with an increased risk of all-cause mortality at levels >110 pg/mL. Conclusion As previously reported in dialysis patients, PECERA provides evidence of an association of serum calcium, phosphate and iPTH levels with all-cause mortality in stage 4 and 5 CKD patients, suggesting potential survival benefits of controlling bone mineral parameters in this population.
Whereas the calcium and phosphate ranges associated with the lowest mortality in this study were consistent with current KDIGO guidelines, our results suggest that the threshold for considering anti-parathyroid treatment might be lower than currently recommended.


2021, pp. 1-12
Author(s): Kuang-Yu Wei, Chen-Yi Liao, Chi-Hsiang Chung, Fu-Huang Lin, Chang-Huei Tsao, ...

Introduction: Patients with carbon monoxide poisoning (COP) commonly have long-term morbidities. However, it is not known whether patients with COP exhibit an increased risk of developing chronic kidney disease (CKD), or whether hyperbaric oxygen therapy (HBOT) alters this risk. Methods: This study identified 8,618 patients who survived COP and 34,464 propensity score-matched non-COP patients from 2000 to 2013 in a nationwide administrative registry. The primary outcome was the development of CKD. The association between COP and the risk of developing CKD was estimated using a Cox proportional hazards regression model; the cumulative incidence of CKD among patients stratified by HBOT was evaluated using Kaplan-Meier analysis. Results: After adjusting for covariates, the risk of CKD was 6.15-fold higher in COP patients than in non-COP controls. In subgroup analyses, the COP cohort exhibited an increased risk of developing CKD compared with the controls regardless of demographic characteristics, environmental factors, and comorbidities. The cumulative incidence of CKD in COP patients did not differ between the HBOT and non-HBOT groups (p = 0.188). Conclusions: COP might be an independent risk factor for developing CKD. Thus, clinicians should enhance post-discharge follow-up of kidney function among COP patients.
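The Kaplan-Meier comparison of HBOT vs non-HBOT groups rests on the product-limit estimator, which can be written in a few lines. This is a generic sketch on invented toy data, not the study's registry analysis:

```python
from itertools import groupby

def kaplan_meier(times, events):
    """Product-limit (Kaplan-Meier) survival estimate.
    events: 1 = event (e.g. incident CKD), 0 = censored.
    Returns [(time, survival)] at each distinct event time."""
    data = sorted(zip(times, events))
    n = len(data)
    surv = 1.0
    curve = []
    seen = 0
    for t, grp in groupby(data, key=lambda x: x[0]):
        grp = list(grp)
        at_risk = n - seen
        d = sum(e for _, e in grp)
        if d:
            surv *= 1 - d / at_risk   # multiply conditional survival at t
            curve.append((t, surv))
        seen += len(grp)
    return curve

# Toy cohort: events at t=1, 3, 4; one subject censored at t=2.
km_curve = kaplan_meier([1, 2, 3, 4], [1, 0, 1, 1])
```

In the study, one such curve per group (HBOT vs non-HBOT) would be compared, typically with a log-rank test, which is where the p = 0.188 above comes from.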

