The effect of dietary phosphorus load and food matrix on postprandial serum phosphate in hemodialysis patients: a pilot study

2021 ◽  
Vol 4 ◽  
pp. 119
Author(s):  
Fiona Byrne ◽  
Barbara Gillman ◽  
Brendan Palmer ◽  
Mairead Kiely ◽  
Joseph Eustace ◽  
...  

Background: Potential dietary strategies for controlling hyperphosphataemia include the use of protein sources with lower phosphorus bioavailability such as pulses and nuts, a focus on phosphorus-to-protein ratios, and the avoidance of all phosphate additives. Methods: We conducted a controlled crossover feeding study in 8 haemodialysis (HD) patients to investigate the acute postprandial effect of a modified versus a standard low-phosphorus diet, each followed for one day, on serum phosphate, potassium and intact parathyroid hormone levels in prevalent HD patients. Each participant consumed the modified diet on one day and the standard diet on a second day one week apart. The modified diet included beef and less dairy, giving a lower phosphorus-to-protein ratio, as well as plant-based protein, whole grains, pulses and nuts containing phytates, which reduce phosphorus bioavailability. Both diets were tailored to provide each participant with 1.1 g protein/kg ideal body weight. Participants provided fasting bloods before breakfast, a pre-prandial sample before the lunchtime main meal, and samples at one-hour intervals for the four hours after the lunchtime main meal, for analysis of phosphate, potassium and intact parathyroid hormone (iPTH). Results: Four hours after the lunchtime main meal, serum phosphate on the modified diet was 0.30 mmol/l lower than on the standard diet (p-value = 0.015, 95% confidence interval [CI] -0.57, -0.04). The corresponding change in serum potassium at four hours was a decrease of 0.675 mmol/l (p-value = 0.011, CI -1.25, -0.10). Conclusions: Decreases in both serum phosphate and serum potassium on the modified low-phosphorus diet encourage further, larger studies to explore the possibility of greater food choice and healthier plant-based diets in HD patients. ClinicalTrials.gov registration: NCT04845724 (15/04/2021)

2020 ◽  
Vol 35 (Supplement_3) ◽  
Author(s):  
Song Wang ◽  
Xinkui Tian ◽  
Xinhong Lu ◽  
Hanmin Liu ◽  
YUE WANG

Abstract Background and Aims: To investigate the effect of changing the dietary phosphorus-to-protein ratio on the nutritional status of maintenance hemodialysis patients. Method: Hemodialysis patients with average serum phosphorus ≥1.78 mmol/L for three consecutive months were enrolled in this self-controlled trial. Patients received low-phosphorus diet instruction for 4 weeks as baseline, after which their staple foods were replaced with the same amount of low-protein rice for 10 weeks. The difference in protein intake between the low-protein rice and the original staple foods was made up with low-phosphorus whey. The patients then reverted to their usual staple foods for 8 weeks. Serum phosphorus, albumin and nutritional status before and after the dietary changes were observed and analyzed. Results: 29 patients completed the study. At baseline the dietary phosphorus-to-protein ratio was 15.85±3.29 mg/g, and serum phosphorus was 2.05±0.32 mmol/L. After switching to low-protein rice and low-phosphorus whey for 10 weeks, the dietary phosphorus-to-protein ratio decreased to 12.18±2.45 mg/g (p<0.001). Simultaneously, serum phosphorus decreased to 1.87±0.44 mmol/L (p=0.048). Nutritional status evaluation showed that the patients' serum albumin increased significantly compared with baseline (42.29±3.51 vs. 39.84±3.23 g/L, p<0.001), as did dry body weight (65.09±15.30 vs. 64.71±15.07 kg, p=0.030), upper arm muscle circumference (22.57±2.83 vs. 22.00±3.03 cm, p=0.013), and grip strength (27.89±7.82 vs. 26.54±7.90 kg, p=0.032). Subjective global assessment (SGA) and serum lipid levels did not change significantly after the food change. Conclusion: For hemodialysis patients, changing the dietary phosphorus-to-protein ratio through a combination of low-protein rice and low-phosphorus whey can effectively decrease serum phosphorus and improve serum albumin, muscle mass and muscle strength.


2021 ◽  
Vol 9 (1) ◽  
pp. 71-86
Author(s):  
Armiyani Armiyani ◽  
Susyani Susyani ◽  
Imelda Telisa

Chronic kidney disease (CKD) causes metabolic disorders such as hyperphosphatemia. Limiting phosphorus and protein intake is one option; snack bars made from egg whites and brown rice are another. This study aimed to compare CKD patients' mean blood urea levels before and after the intervention at Prabumulih City Hospital in South Sumatra. The study employed a two-stage quasi-experimental design. In the first stage, the snack bars were developed and evaluated with organoleptic tests analyzed using the Friedman test. In Phase II, patients were given the snack bars and their blood urea levels were measured, analyzed with dependent (paired) t-tests. Purposive sampling was used to select up to 13 CKD patients. The best formula for the brown rice snack bar was formula 3, with 225 kcal energy, 3.46 g protein, 12.68 g fat, 24.26 g carbohydrate, 38.92 mg phosphorus, and a phosphorus-to-protein ratio of 11.24 mg/g. The mean blood urea levels of CKD patients differed significantly before and after the intervention (p<0.001). With its low protein content and low phosphorus-to-protein ratio, the snack bar formula can lower blood urea levels in CKD patients, so it can be used as a CKD diet food.
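The reported ratio can be checked directly from the abstract's own figures: a phosphorus-to-protein ratio is simply milligrams of phosphorus per gram of protein. A minimal sketch of that arithmetic (the function name is illustrative, not from the study):

```python
def phosphorus_protein_ratio(phosphorus_mg: float, protein_g: float) -> float:
    """Phosphorus-to-protein ratio in mg of phosphorus per g of protein."""
    return phosphorus_mg / protein_g

# Formula 3 snack bar: 38.92 mg phosphorus and 3.46 g protein
# give a ratio of about 11.2 mg/g, matching the reported 11.24 mg/g.
ratio = phosphorus_protein_ratio(38.92, 3.46)
```

This check is also why the phosphorus content must be in milligrams: 38.92 g of phosphorus with 3.46 g of protein would give a ratio three orders of magnitude larger than the one reported.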


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
David Ray Chang ◽  
Hung-Chieh Yeh ◽  
I-Wen Ting ◽  
Chen-Yuan Lin ◽  
Han-Chun Huang ◽  
...  

Abstract The role of the difference and the ratio of albuminuria (urine albumin-to-creatinine ratio, uACR) and proteinuria (urine protein-to-creatinine ratio, uPCR) in all-cause mortality has not been systematically evaluated. We retrospectively analyzed 2904 patients with concurrently measured uACR and uPCR from the same urine specimen in a tertiary hospital in Taiwan. The urinary albumin-to-protein ratio (uAPR) was derived by dividing uACR by uPCR, whereas urinary non-albumin protein (uNAP) was calculated by subtracting uACR from uPCR. Conventional severity categories of uACR and uPCR were also used to establish a concordance matrix and develop a corresponding risk matrix. The median age at enrollment was 58.6 years (interquartile range 45.4–70.8). During the 12,391 person-years of follow-up, 657 deaths occurred. For each doubling increase in uPCR, uACR, and uNAP, the adjusted hazard ratios (aHRs) of all-cause mortality were 1.29 (95% confidence interval [CI] 1.24–1.35), 1.12 (1.09–1.16), and 1.41 (1.34–1.49), respectively. For each 10% increase in uAPR, it was 1.02 (95% CI 0.98–1.06). A linear dose–response association with all-cause mortality was observed only with uPCR and uNAP. The 3 × 3 risk matrix revealed that patients with severe proteinuria and normal albuminuria had the highest risk of all-cause mortality (aHR 5.25, 95% CI 1.88–14.63). uNAP significantly improved discriminative performance compared with uPCR (c statistics: 0.834 vs. 0.828, p-value = 0.032). Our findings advocate simultaneous measurement of uPCR and uACR in daily practice to derive uAPR and uNAP, which can provide a better mortality prognostic assessment.
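The two derived quantities are simple arithmetic on concurrently measured uACR and uPCR (same urine specimen, same units). A minimal sketch with hypothetical values, not data from the study:

```python
def uapr(uacr: float, upcr: float) -> float:
    """Urinary albumin-to-protein ratio: uACR divided by uPCR (same units)."""
    return uacr / upcr

def unap(uacr: float, upcr: float) -> float:
    """Urinary non-albumin protein: uPCR minus uACR."""
    return upcr - uacr

# Hypothetical specimen: uPCR 500 mg/g and uACR 150 mg/g creatinine.
a = uapr(150, 500)  # 0.3, i.e. 30% of the measured urinary protein is albumin
n = unap(150, 500)  # 350 mg/g of non-albumin protein
```

A uAPR near 1 means the proteinuria is mostly albumin; a large uNAP with a low uAPR corresponds to the "severe proteinuria with normal albuminuria" pattern the risk matrix flags as highest risk.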


2018 ◽  
Vol 8 (1) ◽  
Author(s):  
Wan-Chuan Tsai ◽  
Yu-Sen Peng ◽  
Hon-Yen Wu ◽  
Shih-Ping Hsu ◽  
Yen-Ling Chiu ◽  
...  

Author(s):  
Gianmarco Lombardi ◽  
Giovanni Gambaro ◽  
Pietro Manuel Ferraro

Introduction: Electrolyte disorders are common findings in kidney diseases and might represent a useful biomarker preceding kidney injury. Serum potassium [K+] imbalance is still poorly investigated for its association with acute kidney injury (AKI), and most evidence comes from intensive care units (ICUs). The aim of our study was to comprehensively investigate this association in a large, unselected cohort of hospitalized patients. Methods: We performed a retrospective observational cohort study of the inpatient population admitted to Fondazione Policlinico Universitario A. Gemelli IRCCS between January 1, 2010 and December 31, 2014, including adult patients with at least 2 [K+] and 3 serum creatinine (sCr) measurements who did not develop AKI during an initial 10-day window. The outcome of interest was in-hospital AKI. The exposures of interest were [K+] fluctuations and hypokalemia (HoK) and hyperkalemia (HerK). [K+] variability was evaluated using the coefficient of variation (CV). Cox proportional hazards regression models were used to obtain hazard ratios (HRs) and 95% confidence intervals (CIs) for the association between the exposures of interest and development of AKI. Results: 21,830 hospital admissions from 18,836 patients were included in our study. During a median follow-up of 5 (interquartile range [IQR] 7) days, AKI was observed in 555 hospital admissions (2.9%); the median time to AKI development was 5 (IQR 7) days. Higher [K+] variability was independently associated with an increased risk of AKI, with a statistically significant linear trend across groups (p-value = 0.012). A significantly higher incidence of AKI was documented in patients with HerK compared with normokalemia. No statistically significant difference was observed between HoK and HerK (p-value = 0.92). Conclusion: [K+] abnormalities, including fluctuations even within the normal range, are associated with the development of AKI.
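The variability measure named in the abstract, the coefficient of variation, is the standard deviation of a patient's [K+] series divided by its mean. A minimal sketch with made-up values; the abstract does not say whether the population or sample standard deviation was used, so the population form is assumed here:

```python
from statistics import mean, pstdev


def coefficient_of_variation(values: list[float]) -> float:
    """CV = standard deviation / mean (population SD assumed here)."""
    return pstdev(values) / mean(values)


# Hypothetical inpatient [K+] series in mmol/L; all values are within the
# normal range, yet the series still yields a nonzero CV.
cv = coefficient_of_variation([4.0, 4.4, 3.8, 4.6])
```

This is exactly the point of the conclusion: fluctuation within the normal range still produces a measurable CV, which the study found to be independently associated with AKI risk.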


2017 ◽  
Vol 4 (2) ◽  
pp. 457
Author(s):  
Sujatha G. ◽  
Vindhya P. ◽  
Kalyan Kumar K.

Background: Approximately one million patients develop pleural effusion every year. It is a common clinical disorder and is either a manifestation or a complication of a respiratory or non-respiratory disorder, and it carries a serious prognosis if not diagnosed and treated properly. The aim was to calculate the serum effusion albumin gradient (SEAG) and Light's criteria and to compare SEAG with Light's criteria in analyzing pleural effusions. Methods: A total of one hundred patients were selected for the study. Pleural fluid was collected from patients who met the inclusion and exclusion criteria at the time of diagnostic thoracocentesis. A venous blood sample was collected along with the diagnostic thoracocentesis or within 24 hours of it. Written informed consent was obtained for thoracocentesis. Results: We compared the clinical outcome with the outcomes given by the pleural fluid/serum protein ratio (p<0.0001), the pleural fluid/serum LDH ratio (p<0.0001) and pleural fluid LDH (p<0.0001) separately, and the p-values were statistically significant. The sensitivity, specificity, PPV and NPV of Light's criteria were 77.2%, 100%, 100% and 93.9%, respectively. We compared the Light's criteria outcome with the clinical outcome and the difference was statistically significant (p<0.0001). SEAG showed 100% sensitivity, 97.43% specificity, 91.66% PPV and 100% NPV. We compared the clinical outcome with SEAG and there was a statistically significant difference (p<0.0001). We compared SEAG with Light's criteria and the difference was statistically significant (p<0.0001). We compared Light's criteria plus the pleural fluid protein gradient with SEAG and the difference was statistically significant (p<0.0001). Conclusions: SEAG is more sensitive for classifying transudates and more specific for exudates than Light's criteria.
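Both classification rules lend themselves to a direct encoding. The thresholds below are the standard published ones (an effusion is an exudate if any Light criterion is met; a serum-minus-effusion albumin gradient above about 1.2 g/dL suggests a transudate); they are not values taken from this study:

```python
def is_exudate_lights(pf_protein: float, s_protein: float,
                      pf_ldh: float, s_ldh: float, ldh_uln: float) -> bool:
    """Light's criteria: the effusion is an exudate if ANY criterion is met.

    pf_* = pleural fluid values, s_* = serum values,
    ldh_uln = upper limit of normal for serum LDH.
    """
    return (pf_protein / s_protein > 0.5
            or pf_ldh / s_ldh > 0.6
            or pf_ldh > (2 / 3) * ldh_uln)


def is_transudate_seag(s_albumin: float, pf_albumin: float,
                       cutoff: float = 1.2) -> bool:
    """SEAG: serum minus effusion albumin (g/dL); a gradient above the
    cutoff suggests a transudate."""
    return s_albumin - pf_albumin > cutoff
```

The abstract's comparison then amounts to running both classifiers on each patient and tabulating agreement against the clinical diagnosis.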


2021 ◽  
Vol 15 (9) ◽  
pp. 2952-2954
Author(s):  
Roomisa Anis ◽  
Misbah-ul- Qamar ◽  
Ayesha Shafqat ◽  
Ayesha Aftab ◽  
Zarafshan Bader ◽  
...  

Lead (Pb) is an abundant and one of the most lethal metals found in the earth's crust. Its use by humans dates back thousands of years. Even low doses of lead are responsible for the production of reactive oxygen species, which leads to an oxidative load. This oxidative stress promotes the production of malondialdehyde (MDA) and downregulates the antioxidant enzyme superoxide dismutase (SOD). Study Design: Quasi-experimental study. Place and duration of study: Department of Biochemistry, ANMCH, Islamabad, Pakistan in collaboration with NIH, Islamabad, from November 2018 to April 2019. Methodology: A total of 40 BALB/c mice were divided into two groups of 20 mice each. The control group was given a normal standard diet. The lead-treated group was given lead acetate in drinking water with a normal diet and no supplementation. At the end of the study, malondialdehyde levels were measured using thiobarbituric acid reactive substances (TBARS) and superoxide dismutase (SOD) was estimated by the xanthine oxidase method. Results: Our study showed an increase in MDA and a decrease in SOD in the lead-treated group compared with the control group. Pearson correlation was applied to assess the degree of association between the two parameters; it showed a significant negative correlation, with r = -0.96 and a p-value of 0.001. Conclusion: We conclude that the increase in MDA accompanied by a decrease in SOD indicates a strong negative correlation in lead-poisoned mice. Key words: Lead poisoning, Malondialdehyde, Oxidative Stress, Superoxide Dismutase


2020 ◽  
Vol 4 (Supplement_2) ◽  
pp. 341-341
Author(s):  
Sowmiya Muthuraju ◽  
Derek Miketinas

Abstract Objectives: Patients with liver conditions may have increased phosphorus turnover, which can increase the risk of severe hypophosphatemia and other complications. The objective of this cross-sectional study was to quantify usual phosphorus intake, assess serum phosphorus (SP) levels across levels of liver conditions, and estimate and assess the odds of having critically low phosphorus levels in adults with and without liver conditions. Methods: Data were obtained from the NHANES 2015–2016 cycle. Adults were divided into four groups based on self-reported responses to the NHANES medical history questionnaire: liver cancer (LC), unspecified current liver condition (CLC), unspecified resolved liver condition (RLC), and no liver condition. Usual intake was estimated using the NCI method, and all analyses were adjusted to account for the complex, multistage probability sampling design. Results: Usual phosphorus intake was highest in participants with RLC (1399 ± 26.5 mg) and lowest in participants with LC (1267 ± 140.7 mg). Although the percentage meeting the EAR for phosphorus was high (>95%), SP levels were lowest in participants with LC. SP levels differed slightly across liver conditions: participants with LC had an SP level of 1.0 ± 0.07 mmol/L, while participants with CLC, RLC, or no liver condition had SP levels of 1.2 ± 0.01 mmol/L, 1.2 ± 0.01 mmol/L, and 1.2 ± 0.02 mmol/L, respectively. Participants with CLC had a usual phosphorus intake of 1350 ± 49.6 mg, and those with no liver condition had a usual phosphorus intake of 1387 ± 18.5 mg. The odds of normal phosphorus levels in participants with LC were low (odds = 0.06; 95% CI: 0.01–0.45); the odds of normal SP levels were 1.6 (95% CI: 1.2–2.15) for participants with CLC, 2.2 (95% CI: 1.3–3.75) for participants with RLC, and 1.9 (95% CI: 1.71–2.14) for participants with no liver condition.
Conclusions: These results indicate that patients with liver cancer are at higher risk of hypophosphatemia and that phosphorus recommendations for patients with liver cancer may need to be adjusted. However, the variability in this subpopulation with liver cancer is high and warrants further investigation. Funding Sources: None.


Author(s):  
Eileen Suk Ying Ng ◽  
Poh Yoong Wong ◽  
Ahmad Teguh Hakiki Kamaruddin ◽  
Christopher Thiam Seong Lim ◽  
Yoke Mun Chan

Although optimal control of the serum phosphate level is imperative to avoid undesirable health outcomes, hyperphosphataemia is a highly prevalent mineral abnormality in the dialysis population. This study aimed to determine the factors associated with hyperphosphatemia among hemodialysis patients in Malaysia. Multiple linear regression analysis was used to ascertain the factors that influence serum phosphate levels. A total of 217 hemodialysis patients were recruited. Hyperphosphatemia was prevalent. Only approximately 25% of the patients were aware that optimal control of hyperphosphatemia requires the combined effort of phosphate binder medication therapy, dietary restriction, and the dialysis prescription. The presence of diabetes mellitus may affect serum phosphate levels, complicating dietary phosphorus management. Patients who were less depressed exhibited higher serum phosphate levels, suggesting intentional non-compliance. Better compliance with phosphate binders, longer sleep duration, and higher social support were associated with lower serum phosphate levels. Although sleep disturbance is one of the most prevalent and intense symptom burdens identified by hemodialysis patients, relatively few studies have addressed this issue. It is time to formulate sleep-related therapeutic interventions alongside the encouragement of strong social support, in the hope that many clinical outcomes, including hyperphosphatemia, can be better controlled among hemodialysis patients.

