Serum Potassium Levels at Hospital Discharge and One-Year Mortality among Hospitalized Patients

Medicina ◽  
2020 ◽  
Vol 56 (5) ◽  
pp. 236 ◽  
Author(s):  
Charat Thongprayoon ◽  
Wisit Cheungpasitporn ◽  
Sorkko Thirunavukkarasu ◽  
Tananchai Petnak ◽  
Api Chewcharat ◽  
...  

Background and Objectives: The optimal range of serum potassium at hospital discharge is unclear. The aim of this study was to assess the relationship between discharge serum potassium levels and one-year mortality in hospitalized patients. Materials and Methods: All adult hospital survivors between 2011 and 2013 at a tertiary referral hospital, who had available admission and discharge serum potassium data, were enrolled. End-stage kidney disease patients were excluded. Discharge serum potassium was defined as the last serum potassium level measured within 48 h prior to hospital discharge and categorized into ≤2.9, 3.0–3.4, 3.5–3.9, 4.0–4.4, 4.5–4.9, 5.0–5.4 and ≥5.5 mEq/L. A Cox proportional hazards analysis was performed to assess the independent association between discharge serum potassium and one-year mortality after hospital discharge, using the discharge potassium range of 4.0–4.4 mEq/L as the reference group. Results: Among 57,874 eligible patients (mean discharge serum potassium 4.1 ± 0.4 mEq/L), the estimated one-year mortality rate after discharge was 13.2%. A U-shaped association was observed between discharge serum potassium and one-year mortality, with the nadir mortality in the discharge serum potassium range of 4.0–4.4 mEq/L. After adjusting for clinical characteristics, including admission serum potassium, both discharge serum potassium ≤3.9 mEq/L and ≥4.5 mEq/L were significantly associated with increased one-year mortality, compared with the discharge serum potassium of 4.0–4.4 mEq/L. Stratified analysis based on admission serum potassium showed similar results, except that there was no increased risk of one-year mortality when discharge serum potassium was ≤3.9 mEq/L in patients with an admission serum potassium of ≥5.0 mEq/L. Conclusion: The association between discharge serum potassium and one-year mortality after hospital discharge had a U-shaped distribution and was independent of admission serum potassium. 
Favorable survival outcomes occurred when discharge serum potassium was strictly within the range of 4.0–4.4 mEq/L.
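The study's categorization scheme can be sketched as a small helper. The bin edges come from the abstract; the function name and label strings are illustrative, and the code assumes laboratory values reported to one decimal place, as in the bins themselves.

```python
def potassium_category(k_meq_l: float) -> str:
    """Map a discharge serum potassium value (mEq/L) to the study's bins.

    Bin edges follow the abstract: <=2.9, 3.0-3.4, 3.5-3.9, 4.0-4.4,
    4.5-4.9, 5.0-5.4, and >=5.5 mEq/L. Assumes values are reported to
    one decimal place; labels are illustrative, not from the paper.
    """
    if k_meq_l <= 2.9:
        return "<=2.9"
    if k_meq_l <= 3.4:
        return "3.0-3.4"
    if k_meq_l <= 3.9:
        return "3.5-3.9"
    if k_meq_l <= 4.4:
        return "4.0-4.4"  # reference group, with the nadir of mortality
    if k_meq_l <= 4.9:
        return "4.5-4.9"
    if k_meq_l <= 5.4:
        return "5.0-5.4"
    return ">=5.5"
```

For example, the cohort mean of 4.1 mEq/L falls in the reference bin, while both tails of the distribution map to the bins the study associated with increased one-year mortality.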

Medicines ◽  
2019 ◽  
Vol 7 (1) ◽  
pp. 2 ◽  
Author(s):  
Charat Thongprayoon ◽  
Wisit Cheungpasitporn ◽  
Panupong Hansrivijit ◽  
Michael A. Mao ◽  
Juan Medaura ◽  
...  

Background: The aim of this study was to assess the relationship between admission serum potassium and one-year mortality in all adult hospitalized patients. Methods: All adult hospitalized patients who had an admission serum potassium level between the years 2011 and 2013 at a tertiary referral hospital were enrolled. End-stage kidney disease patients were excluded. Admission serum potassium was categorized into levels of ≤2.9, 3.0–3.4, 3.5–3.9, 4.0–4.4, 4.5–4.9, 5.0–5.4, and ≥5.5 mEq/L. Cox proportional hazard analysis was performed to assess the independent association between admission serum potassium and one-year mortality after hospital admission, using an admission potassium level of 4.0–4.4 mEq/L as the reference group. Results: A total of 73,983 patients with mean admission potassium of 4.2 ± 0.5 mEq/L were studied. Of these, 12.6% died within a year after hospital admission, with the lowest one-year mortality associated with an admission serum potassium of 4.0–4.4 mEq/L. After adjustment for age, sex, race, estimated glomerular filtration rate (eGFR), principal diagnosis, comorbidities, medications, acute kidney injury, mechanical ventilation, and other electrolytes at hospital admission, both a low admission serum potassium ≤3.9 mEq/L and elevated admission potassium ≥5.0 mEq/L were significantly associated with an increased risk of one-year mortality, when compared with an admission serum potassium of 4.0–4.4 mEq/L. Subgroup analysis of chronic kidney disease and cardiovascular disease patients showed similar results. Conclusion: This study demonstrated that hypokalemia ≤3.9 mEq/L and hyperkalemia ≥5.0 mEq/L at the time of hospital admission were associated with higher one-year mortality.
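Both potassium studies above estimate hazard ratios with Cox proportional hazards regression. As a minimal sketch of what that model maximizes, the partial log-likelihood for a single covariate (Breslow convention, assuming no tied event times) can be written directly; the function, its arguments, and the toy data in the usage note are illustrative, not from the studies.

```python
import math


def cox_partial_loglik(beta, times, events, x):
    """Cox partial log-likelihood for one covariate (no tie handling).

    beta   -- log hazard ratio for the covariate
    times  -- follow-up time for each subject
    events -- 1 if the subject died, 0 if censored
    x      -- covariate value per subject (e.g., an indicator for a
              potassium category); all names here are illustrative
    """
    ll = 0.0
    for i, (t_i, d_i) in enumerate(zip(times, events)):
        if not d_i:
            continue  # censored subjects contribute only to risk sets
        # Risk set: everyone still under observation at time t_i.
        risk = sum(math.exp(beta * x_j)
                   for t_j, x_j in zip(times, x) if t_j >= t_i)
        ll += beta * x[i] - math.log(risk)
    return ll
```

At beta = 0 (hazard ratio 1), each event simply contributes minus the log of its risk-set size; maximizing this quantity over beta is what yields the adjusted hazard ratios reported in the abstracts, with categorical exposures entering as indicator covariates against the reference bin.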


2020 ◽  
Vol 8 (2) ◽  
pp. 22 ◽  
Author(s):  
Tananchai Petnak ◽  
Charat Thongprayoon ◽  
Wisit Cheungpasitporn ◽  
Tarun Bathini ◽  
Saraschandra Vallabhajosyula ◽  
...  

This study aimed to assess one-year mortality risk based on discharge serum chloride among hospital survivors. We analyzed a cohort of adult hospital survivors at a tertiary referral hospital from 2011 through 2013. We categorized discharge serum chloride as ≤96, 97–99, 100–102, 103–105, 106–108, and ≥109 mmol/L. We performed Cox proportional hazards analysis to assess the association of discharge serum chloride with one-year mortality after hospital discharge, using discharge serum chloride of 103–105 mmol/L as the reference group. Of 56,907 eligible patients, 9%, 14%, 26%, 28%, 16%, and 7% had discharge serum chloride of ≤96, 97–99, 100–102, 103–105, 106–108, and ≥109 mmol/L, respectively. We observed a U-shaped association of discharge serum chloride with one-year mortality, with nadir mortality at a discharge serum chloride of 103–105 mmol/L. After adjusting for potential confounders, including discharge serum sodium, discharge serum bicarbonate, and admission serum chloride, one-year mortality was significantly higher with both discharge serum chloride ≤99 mmol/L (hazard ratio [HR]: 1.45 and 1.94 for discharge serum chloride of 97–99 and ≤96 mmol/L, respectively; p < 0.001) and ≥109 mmol/L (HR: 1.41; p < 0.001), compared with discharge serum chloride of 103–105 mmol/L. The mortality risk did not differ when discharge serum chloride ranged from 100 to 108 mmol/L. Of note, there was a significant interaction between admission and discharge serum chloride on one-year mortality. Serum chloride at hospital discharge in the optimal range of 100–108 mmol/L predicted favorable survival. Both hypochloremia and hyperchloremia at discharge were associated with an increased risk of one-year mortality, independent of admission serum chloride, discharge serum sodium, and serum bicarbonate.


2021 ◽  
Vol 39 (6_suppl) ◽  
pp. 94-94
Author(s):  
Maha H. A. Hussain ◽  
Cora N. Sternberg ◽  
Eleni Efstathiou ◽  
Karim Fizazi ◽  
Qi Shen ◽  
...  

Background: The PROSPER trial demonstrated prolonged MFS and OS for men with nmCRPC and rapidly rising PSA treated with ENZA vs placebo, both in combination with androgen deprivation therapy (ADT). The final survival analysis of PROSPER (Sternberg et al. NEJM 2020) recently reported a median OS of 67.0 months (95% CI, 64.0 to not reached) with ENZA and 56.3 months (95% CI, 54.4 to 63.0) with placebo (hazard ratio [HR] for death, 0.73; 95% CI, 0.61 to 0.89; P = .001). Post hoc analyses of PROSPER evaluating PSA dynamics have demonstrated longer MFS with greater PSA decline (Hussain et al. ESMO Sept 19-21, 2020. Poster 685P) and increased risk of metastases in patients with even modest PSA progression vs those without (Saad et al. Eur Urol 2020). Here we further explored the relationship between PSA dynamics and outcomes in PROSPER using uniquely defined subgroups of PSA decline. Methods: Eligible men in PROSPER had nmCRPC, a PSA level ≥ 2 ng/mL at baseline, and a PSA doubling time ≤ 10 months. Men continued ADT, were randomized 2:1 to ENZA 160 mg once daily vs placebo, and had PSA evaluation at week 17 and every 16 weeks thereafter. This post hoc analysis evaluated OS and MFS for 4 mutually exclusive subgroups defined by PSA nadir, using men with PSA reduction < 50% as the reference group. The HR is based on an unstratified Cox proportional hazards model. Results: 1401 men were enrolled in PROSPER; 933 were treated with ENZA and PSA data were available for 905. Measured at nadir, 38% of these men achieved PSA reduction ≥ 90% (actual nadir < 0.2 ng/mL), and another 27% achieved PSA reduction ≥ 90% (actual nadir ≥ 0.2 ng/mL). Among men in the placebo arm of PROSPER, only 3/457 reported PSA reduction ≥ 90%. Median OS and MFS increased with increasing depth of PSA decline (Table). Conclusions: In men with nmCRPC and rapidly rising PSA treated with ADT plus ENZA, there was a close relationship between the degree of PSA decline and survival outcomes. 
Defining PSA response by both percent decline and actual decline below 0.2 ng/mL revealed a previously under-appreciated relationship between these PSA metrics and highlighted the importance of PSA nadir as an intermediate biomarker in nmCRPC. Clinical trial information: NCT02003924. [Table: see text]


PLoS ONE ◽  
2021 ◽  
Vol 16 (9) ◽  
pp. e0256899
Author(s):  
Samuel K. Ayeh ◽  
Enoch J. Abbey ◽  
Banda A. A. Khalifa ◽  
Richard D. Nudotor ◽  
Albert Danso Osei ◽  
...  

Background There is an urgent need for novel therapeutic strategies for reversing COVID-19-related lung inflammation. Recent evidence has demonstrated that the cholesterol-lowering agents, statins, are associated with reduced mortality in patients with various respiratory infections. We sought to investigate the relationship between statin use and COVID-19 disease severity in hospitalized patients. Methods A retrospective analysis of COVID-19 patients admitted to the Johns Hopkins Medical Institutions between March 1, 2020 and June 30, 2020 was performed. The outcomes of interest were mortality and severe COVID-19 infection, as defined by prolonged hospital stay (≥ 7 days) and/or invasive mechanical ventilation. Logistic regression, Cox proportional hazards regression and propensity score matching were used to obtain both univariable and multivariable associations between covariates and outcomes in addition to the average treatment effect of statin use. Results Of the 4,447 patients who met our inclusion criteria, 594 (13.4%) patients were exposed to statins on admission, of whom 340 (57.2%) were male. The mean age was higher in statin users compared to non-users [64.9 ± 13.4 vs. 45.5 ± 16.6 years, p <0.001]. The average treatment effect of statin use on COVID-19-related mortality was RR = 1.00 (95% CI: 0.99–1.01, p = 0.928), while its effect on severe COVID-19 infection was RR = 1.18 (95% CI: 1.11–1.27, p <0.001). Conclusion Statin use was not associated with altered mortality, but with an 18% increased risk of severe COVID-19 infection.


Circulation ◽  
2016 ◽  
Vol 133 (suppl_1) ◽  
Author(s):  
Xi Zhang ◽  
Jin Xia ◽  
Liana C. Del Gobbo ◽  
Adela Hruby ◽  
Ka He ◽  
...  

Introduction: Low magnesium (Mg) intake and/or status has been associated with increased risk of chronic disease, including cardiovascular disease (CVD) and cancer. However, whether and to what extent low serum Mg levels are associated with all-cause or cause-specific mortality in the general population is uncertain. Hypothesis: We aimed to quantify the dose-response associations between low concentrations of serum Mg and mortality from all causes, cancer, CVD, and stroke in the general US population. Methods: We analyzed prospective data on 14,353 participants aged 25-74 years with baseline measures of serum Mg concentrations from the National Health and Nutrition Examination Survey Epidemiologic Follow-up Study 1971-2006. We estimated mortality hazard ratios (HRs) for participants within predefined and clinically meaningful categories of serum Mg, including <0.7, 0.7-0.74, 0.75-0.79, 0.8-0.9 (normal reference), 0.9-0.94, 0.95-0.99, and ≥1.0 mmol/L, using Cox proportional hazards models. Restricted cubic spline models were applied to examine potentially nonlinear relationships between serum Mg and mortality. Results: During a mean follow-up of 27.6 years, 7,072 deaths occurred, including 3,310 (47%) CVD deaths, 1,533 (22%) cancer deaths, and 281 (4%) stroke deaths. Twenty-one percent of all participants had low serum Mg (<0.8 mmol/L) and 1.5% had extremely low serum Mg (<0.7 mmol/L). Age-adjusted all-cause mortality rates were 3845, 3491, 3471, 3400 (normal reference), 3531, 3525, and 3836 per 100,000 person-years for increasing categories of serum Mg; the corresponding HRs and 95% confidence intervals were 1.32 (1.02-1.72), 0.93 (0.74-1.16), 1.06 (0.96-1.18), 1.07 (0.97-1.18), 0.94 (0.77-1.13), and 0.93 (0.72-1.21), compared to the reference group (0.8-0.9 mmol/L). An L-shaped association between serum Mg concentrations and all-cause mortality was observed after adjusting for potential confounders (Figure). 
No statistically significant associations were observed between serum Mg and cancer, CVD, or stroke mortality. Conclusions: Very low serum Mg levels were significantly associated with all-cause mortality in the general US population. Our findings support the hypothesis that Mg deficiency as defined by very low serum Mg may have an important influence on mortality.
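The nonlinear Mg-mortality curve above was modeled with restricted cubic splines. A minimal sketch of the spline basis in Harrell's truncated-power form, whose defining constraint is linearity beyond the boundary knots (which is what lets a fitted curve flatten smoothly at the tails); the knot placements in the usage note are illustrative, not from the paper.

```python
def rcs_basis(x, knots):
    """Restricted cubic spline basis terms for a scalar x (Harrell's
    parameterization, unscaled). With k knots this yields k-2 nonlinear
    terms; the regression also includes x itself as a linear term.
    The truncated-power construction cancels the cubic and quadratic
    pieces beyond the last knot, so the spline is linear there.
    """
    t = sorted(knots)
    k = len(t)

    def pos3(u):
        # Truncated cubic (u)_+^3: zero below the knot, cubic above it.
        return max(u, 0.0) ** 3

    terms = []
    den = t[k - 1] - t[k - 2]
    for j in range(k - 2):
        terms.append(
            pos3(x - t[j])
            - pos3(x - t[k - 2]) * (t[k - 1] - t[j]) / den
            + pos3(x - t[k - 1]) * (t[k - 2] - t[j]) / den
        )
    return terms
```

With four illustrative knots at 0.7, 0.8, 0.9, and 1.0 mmol/L, every basis term is exactly zero below the first knot and exactly linear above the last, so the fitted hazard can curve only between the knots.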


2020 ◽  
Author(s):  
Jingyi Lu ◽  
Chunfang Wang ◽  
Yun Shen ◽  
Lei Chen ◽  
Lei Zhang ◽  
...  

Objective: There is growing evidence linking time in range (TIR), an emerging metric for assessing glycemic control, to diabetes-related outcomes. We aimed to investigate the association between TIR and mortality in patients with type 2 diabetes. Research design and methods: A total of 6225 adult patients with type 2 diabetes were included from January 2005 to December 2015 from a single center in Shanghai, China. TIR was measured with continuous glucose monitoring at baseline, and the participants were stratified into 4 groups by TIR: >85%, 71-85%, 51-70%, and ≤50%. Cox proportional hazards regression models were used to estimate the association between different levels of TIR and the risks of all-cause and cardiovascular disease (CVD) mortality. Results: The mean age of the participants was 61.7 years at baseline. During a median follow-up of 6.9 years, 838 deaths were identified, 287 of which were due to CVD. The multivariable-adjusted hazard ratios associated with different levels of TIR (>85% [reference group], 71-85%, 51-70%, and ≤50%) were 1.00, 1.23 (95% CI, 0.98-1.55), 1.30 (95% CI, 1.04-1.63), and 1.83 (95% CI, 1.48-2.28) for all-cause mortality (P for trend <0.001), and 1.00, 1.35 (95% CI, 0.90-2.04), 1.47 (95% CI, 0.99-2.19), and 1.85 (95% CI, 1.25-2.72) for CVD mortality (P for trend = 0.015), respectively. Conclusions: The present study indicated an association of lower TIR with an increased risk of all-cause and CVD mortality among patients with type 2 diabetes, supporting the validity of TIR as a surrogate marker of long-term adverse clinical outcomes.
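TIR itself is straightforward to compute from continuous glucose monitoring readings. A minimal sketch, assuming the conventional 70-180 mg/dL (3.9-10.0 mmol/L) target band (the abstract does not state the band it used); the four strata cut-points are the ones from the abstract, and all names are illustrative.

```python
def time_in_range(glucose_mg_dl, low=70.0, high=180.0):
    """Percent of CGM readings within the target band.

    Assumes the conventional 70-180 mg/dL target band; the study's
    exact band is not stated in the abstract.
    """
    readings = list(glucose_mg_dl)
    in_range = sum(low <= g <= high for g in readings)
    return 100.0 * in_range / len(readings)


def tir_stratum(tir_percent):
    """Assign the study's four TIR strata: >85%, 71-85%, 51-70%, <=50%.

    Assumes TIR is reported in whole percentage points, as the strata
    boundaries imply.
    """
    if tir_percent > 85:
        return ">85%"
    if tir_percent >= 71:
        return "71-85%"
    if tir_percent >= 51:
        return "51-70%"
    return "<=50%"
```

For example, a day with 9 of 10 readings inside the band gives a TIR of 90%, which falls in the reference stratum; the hazard ratios above then attach to progressively lower strata.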


2021 ◽  
Vol 11 ◽  
Author(s):  
Chen-Yi Liao ◽  
Chi-Hsiang Chung ◽  
Kuo-Cheng Lu ◽  
Cheng-Yi Cheng ◽  
Sung-Sen Yang ◽  
...  

Background: Sleeping disorders have been associated with chronic kidney disease (CKD); however, the correlation between sleeping pill use and CKD has not yet been investigated in depth. This study elucidated the potential association of sleeping pill use with the risk of CKD and CKD progression to end-stage renal disease (ESRD) requiring dialysis. Methods: This study was based on a population-based cohort that included 209,755 sleeping pill users among 989,753 individuals. After applying the exclusion criteria, 186,654 sleeping pill users and 373,308 nonusers were enrolled to monitor the occurrence of CKD. Using a cumulative daily dose, we analyzed the types of sleeping pills related to the risk of CKD and ESRD. Propensity score matching and analysis using Cox proportional hazards regression were performed with adjustments for sex, age, and comorbidities. Results: Sleeping pill use was related to increased CKD risk after adjusting for underlying comorbidities (adjusted hazard ratio [aHR] = 1.806, 95% confidence interval [CI]: 1.617–2.105, p < 0.001). With the exception of hyperlipidemia, most comorbidities correlated with an increased risk of CKD. Persistent use of sleeping pills after CKD diagnosis increased the risk of concurrent ESRD (aHR = 7.542; 95% CI: 4.267–10.156; p < 0.001). In the subgroup analysis by sleeping pill type, brotizolam (p = 0.046), chlordiazepoxide (p < 0.001), clonazepam (p < 0.001), diazepam (p < 0.001), dormicum (p < 0.001), estazolam (p < 0.001), fludiazepam (p < 0.001), flunitrazepam (p < 0.001), nitrazepam (p < 0.001), trazodone (p < 0.001), zolpidem (p < 0.001), and zopiclone (p < 0.001) were found to have a significant correlation with increased CKD risk. Conclusion: Sleeping pill use was related to an increased risk of CKD and ESRD. Further studies are necessary to corroborate these findings.


Author(s):  
Peir‐Haur Hung ◽  
Chih‐Ching Yeh ◽  
Chih‐Yen Hsiao ◽  
Chih‐Hsin Muo ◽  
Kuan‐Yu Hung ◽  
...  

Background Targeting higher hemoglobin levels with erythropoietin to treat anemia in patients with chronic kidney disease is associated with increased cardiovascular risk, including that of stroke. The risks of the subtypes of stroke, ischemic, hemorrhagic, and unspecified, following the administration of erythropoietin in patients with end‐stage renal disease receiving hemodialysis remain unclear. Methods and results Overall, 12,948 adult patients with end‐stage renal disease treated during 1999 to 2010 who had undergone hemodialysis were included. The study end points were the incidences of stroke and its subtypes. We used Cox proportional hazards regression models to estimate hazard ratios (HRs) of stroke and its subtypes in erythropoietin recipients compared with nonrecipients. Patients in the erythropoietin cohort did not have an increased risk of stroke compared with those in the nonerythropoietin cohort (adjusted HR, 1.03; 95% CI, 0.92–1.15). Compared with patients in the nonerythropoietin cohort, the risks of ischemic, hemorrhagic, or unspecified stroke were not higher in patients in the erythropoietin cohort (adjusted HRs, 1.08 [95% CI, 0.93–1.26], 0.96 [95% CI, 0.78–1.18], and 1.03 [95% CI, 0.80–1.32], respectively). Increased risks of stroke and its subtypes were not observed with even large annual defined daily doses of erythropoietin (>201). Conclusions Erythropoietin in patients receiving hemodialysis is not associated with increased risk of stroke or any of its subtypes.


Circulation ◽  
2014 ◽  
Vol 129 (suppl_1) ◽  
Author(s):  
Lyanne M Kieneker ◽  
Ron T Gansevoort ◽  
Edith J Feskens ◽  
Johanna M Geleijnse ◽  
Gerjan Navis ◽  
...  

Background: Potassium supplementation lowers blood pressure (BP) in randomized controlled trials, but the long-term effect of dietary potassium intake on risk of hypertension has not yet been established. Objective: To examine the association of 24h urinary excretion of potassium, reflecting dietary intake, with risk of hypertension. Methods: We used data from the Prevention of Renal and Vascular End-Stage Disease (PREVEND) study, a prospective, community-based, observational cohort of Dutch men and women aged 28-75 years. Potassium excretion was measured at baseline (1997-98) and during follow-up (2001-03) in two consecutive 24h urine specimens. Risk of hypertension (defined as BP ≥140/90 mmHg, or initiation of BP-lowering drugs) was studied in 5,511 normotensive subjects not using BP-lowering drugs at baseline. We used Cox proportional hazards regression analysis with time-dependent covariates. Results: Baseline median potassium excretion was 72 mmol/24h (Q1-Q3: 57-85 mmol/24h). During a median follow-up of 7.6 years (Q1-Q3: 5.0-9.3 years), 1172 subjects developed hypertension. We observed a nonlinear association between potassium excretion and risk of hypertension (P=0.005; Figure). Specifically, the lowest sex-specific tertile of potassium excretion (men: <68 mmol/24h; women: <58 mmol/24h) was associated with an increased risk of hypertension (hazard ratio [HR], 1.22; 95% confidence interval [CI], 1.08-1.37) after adjustment for age and sex, compared to the upper two tertiles. Further adjustment for body mass index, smoking status, alcohol intake, parental history of hypertension (HR, 1.25; 95% CI, 1.11-1.41), and additionally for 24h urinary excretions of sodium, magnesium, and calcium (HR, 1.23; 95% CI, 1.08-1.40) did not materially affect the association. Conclusions: In this population-based cohort, low potassium excretion was associated with an increased risk of developing hypertension. 
Figure: Association between 24h urinary potassium excretion and risk of hypertension.


Author(s):  
Chun-Gu Cheng ◽  
Hsin Chu ◽  
Jiunn-Tay Lee ◽  
Wu-Chien Chien ◽  
Chun-An Cheng

(1) Background: Patients with benign prostatic hyperplasia (BPH) frequently report impaired quality of life and sleep. Most BPH patients were treated with alpha-1 adrenergic receptor antagonists, which could improve cerebral blood flow for 1–2 months. Patients with ischemic stroke (IS) could experience cerebral autoregulation impairment for six months. The relationship between BPH and recurrent IS remains unclear. The aim of this study was to determine the risk of one-year recurrent IS conferred by BPH. (2) Methods: We used data from the Taiwanese National Health Insurance Database to identify newly diagnosed IS cases entered from 1 January 2008 to 31 December 2008. Patients were followed until the recurrent IS event or 365 days after the first hospitalization. The risk factors associated with one-year recurrent IS were assessed using Cox proportional hazards regression. (3) Results: Patients with BPH had a higher risk of recurrent IS (12.11% versus 8.15%) (adjusted hazard ratio (HR): 1.352; 95% confidence interval (CI): 1.028–1.78, p = 0.031). Other risk factors included hyperlipidemia (adjusted HR: 1.338; 95% CI: 1.022–1.751, p = 0.034), coronary artery disease (adjusted HR: 1.487; 95% CI: 1.128–1.961, p = 0.005), chronic obstructive pulmonary disease (adjusted HR: 1.499; 95% CI: 1.075–2.091, p = 0.017), and chronic kidney disease (adjusted HR: 1.523; 95% CI: 1.033–2.244, p = 0.033). (4) Conclusion: Patients with BPH who had these risk factors had an increased risk of one-year recurrent IS. The modification of risk factors may prevent recurrent IS.

