Improved Growth in Young Children with Severe Chronic Renal Insufficiency Who Use Specified Nutritional Therapy

2001 ◽  
Vol 12 (11) ◽  
pp. 2418-2426 ◽  
Author(s):  
RULAN S. PAREKH ◽  
JOSEPH T. FLYNN ◽  
WILLIAM E. SMOYER ◽  
JOAN L. MILNE ◽  
DAVID B. KERSHAW ◽  
...  

Abstract. Growth in children with chronic renal failure caused by polyuric, salt-wasting diseases may be hampered if ongoing sodium and water losses are not corrected. Twenty-four children with polyuric chronic renal insufficiency (CRI; creatinine clearance <65 ml/min per 1.73 m²) were treated with low-caloric-density, high-volume, sodium-supplemented feedings. Subsequent growth was compared with that of children in two control groups: a national historic population control from the US Renal Data System database (n = 42), and a literature control (n = 12). Members of the three groups were 81 to 96% white, and 58 to 70% were boys. Obstructive uropathy and dysplasia were the cause of CRI in 92% of the treatment group, 75% of the literature control group, and 30% of the population control group. The treatment effect was assessed in a multivariate, retrospective analysis of the height standard deviation score (SDS), simultaneously controlling for the severity of disease by renal replacement therapy, primary cause of CRI, and initial height SDS. The change in SDS (ΔSDS) for height by regression analysis at 1 yr was significantly greater, by +1.37, in the treatment group versus the population control (P = 0.017). The 2-yr height ΔSDS by regression analysis adjusted for creatinine clearance was significantly greater, by +1.83, in the treatment group versus the literature control (P = 0.003). Nutritional support with sodium and water supplementation can maintain or improve the growth of children with polyuric, salt-wasting CRI. This inexpensive intervention may delay the need for renal replacement therapy, growth hormone treatment, or both in many of these children and may be used in any clinical setting.
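
The treatment effect above comes from a multivariate regression of the change in height SDS that adjusts for renal replacement therapy, primary cause of CRI, and initial height SDS. The following is a minimal sketch of that kind of adjusted model in Python with statsmodels; the file name and column names (delta_sds, treated, on_rrt, cause, baseline_sds) are illustrative assumptions, not the study's actual dataset.

```python
# Sketch of an adjusted regression of 1-yr change in height SDS on treatment,
# in the spirit of the multivariate analysis described above.
# All column names are hypothetical placeholders.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("growth_cohort.csv")  # hypothetical data file

# ΔSDS ~ treatment, controlling for renal replacement therapy,
# primary cause of CRI, and initial height SDS
model = smf.ols(
    "delta_sds ~ treated + on_rrt + C(cause) + baseline_sds",
    data=df,
).fit()

print(model.summary())
print("Adjusted treatment effect on ΔSDS:", model.params["treated"])
```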

2016 ◽  
Vol 19 (3) ◽  
pp. 123 ◽  
Author(s):  
Orhan Findik ◽  
Ufuk Aydin ◽  
Ozgur Baris ◽  
Hakan Parlar ◽  
Gokcen Atilboz Alagoz ◽  
...  

Background: Acute kidney injury is a common complication of cardiac surgery that increases morbidity and mortality. The aim of the present study was to analyze the association of preoperative serum albumin levels with acute kidney injury and the requirement for renal replacement therapy after isolated coronary artery bypass graft surgery (CABG). Methods: We retrospectively reviewed the prospectively collected data of 530 adult patients with normal renal function who underwent isolated CABG surgery. The perioperative clinical data included demographic data, laboratory data, length of stay, in-hospital complications, and mortality. The patient population was divided into two groups: group I, patients with preoperative serum albumin levels <3.5 g/dL; and group II, patients with preoperative serum albumin levels ≥3.5 g/dL. Results: There were 117 patients in group I and 413 patients in group II. Postoperative acute kidney injury (AKI) occurred in 33 patients (28.2%) in group I and in 79 patients (19.1%) in group II. Renal replacement therapy was required in 17 patients (3.2%) (8 patients from group I; 9 patients from group II; P = .018). Thirty-day mortality occurred in 18 patients (3.4%) (10 patients from group I; 8 patients from group II; P = .037). Fourteen of these patients required renal replacement therapy. Logistic regression analysis showed that a lower preoperative serum albumin level was independently associated with an increased incidence of postoperative AKI (OR: 1.661; 95% CI: 1.037-2.661; P = .035). Logistic regression analysis also revealed that diabetes mellitus (OR: 3.325; 95% CI: 2.162-5.114; P < .001) was another independent risk factor for AKI after isolated CABG. Conclusion: Low preoperative serum albumin levels are associated with more severe acute kidney injury and higher rates of renal replacement therapy and mortality after isolated CABG.
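
The key adjusted result above is an odds ratio with a 95% confidence interval from a logistic regression. Below is a minimal sketch of that kind of model using Python and statsmodels; the file name and column names (aki, low_albumin, diabetes) are illustrative assumptions, not the authors' actual data or variable names. Exponentiating the coefficients and their confidence bounds gives odds ratios in the format reported in the abstract.

```python
# Sketch of a logistic model for postoperative AKI with preoperative
# hypoalbuminemia and diabetes as predictors, reported as OR with 95% CI.
# Column names are hypothetical placeholders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("cabg_cohort.csv")  # hypothetical data file

fit = smf.logit("aki ~ low_albumin + diabetes", data=df).fit()

odds_ratios = np.exp(fit.params)
conf_int = np.exp(fit.conf_int())          # 95% CI on the odds-ratio scale
print(pd.concat([odds_ratios.rename("OR"), conf_int], axis=1))
```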


2021 ◽  
Author(s):  
Anna Buckenmayer ◽  
Lotte Dahmen ◽  
Joachim Hoyer ◽  
Sahana Kamalanabhaiah ◽  
Christian S. Haas

Abstract Background: The erythrocyte sedimentation rate (ESR) is a simple laboratory diagnostic tool for estimating systemic inflammation. It remains unclear whether renal function affects the ESR, thereby compromising its validity. This pilot study aimed to compare the prevalence and extent of ESR elevations in hospitalized patients with or without kidney disease. In addition, the impact of renal replacement therapy (RRT) modality on the ESR was determined. Methods: In this single-center, retrospective study, patients were screened for ESR values. ESR was compared in patients with and without renal disease and/or RRT. In addition, ESR was correlated with other inflammatory markers, the extent of renal insufficiency, and clinical characteristics. Results: A total of 203 patients were identified, with an overall elevated ESR in the study population (mean 51.7±34.6 mm/h). ESR was significantly increased in all patients with severe infection, active vasculitis, or cancer, respectively, independent of renal function. Interestingly, there was no difference in ESR between patients with and without kidney disease, nor in those who had received a renal transplant or were on hemodialysis. However, ESRD patients treated with peritoneal dialysis presented with a significantly higher ESR (78.3±33.1 mm/h, p<0.001), while the correlation with other inflammatory markers was not persuasive. Conclusions: We showed that the ESR: (1) does not differ between various stages of renal insufficiency; (2) may be helpful as a screening tool also in patients with renal insufficiency; and (3) is significantly increased in ESRD patients on peritoneal dialysis per se, while it seems not to be affected by hemodialysis or renal transplantation (see graphical abstract as supplementary material).
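
The comparisons described above reduce to contrasting ESR between patient groups and correlating ESR with other inflammatory markers. The sketch below shows those two steps in Python with scipy; the file name, group labels, and column names (modality, esr, crp) are assumptions for illustration, not the authors' dataset.

```python
# Sketch: compare ESR between two RRT modalities and correlate ESR with CRP.
# Column names and group labels are hypothetical placeholders.
import pandas as pd
from scipy import stats

df = pd.read_csv("esr_cohort.csv")  # hypothetical data file

pd_group = df.loc[df["modality"] == "peritoneal_dialysis", "esr"]
hd_group = df.loc[df["modality"] == "hemodialysis", "esr"]

# Two-sample comparison of ESR between PD and HD patients (Welch's t-test)
t_stat, p_value = stats.ttest_ind(pd_group, hd_group, equal_var=False)
print(f"PD vs HD ESR: t = {t_stat:.2f}, p = {p_value:.4f}")

# Correlation of ESR with another inflammatory marker (here CRP)
r, p_corr = stats.pearsonr(df["esr"], df["crp"])
print(f"ESR vs CRP: r = {r:.2f}, p = {p_corr:.4f}")
```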


2021 ◽  
Vol 38 (5) ◽  
pp. 115-122
Author(s):  
Kazim G. Gasanov ◽  
Viktor A. Zurnadzhyants ◽  
Eldar A. Kchibekov ◽  
M. I. Shikhragimov

Objective. To determine the blood serum β2-microglobulin and α2-macroglobulin concentrations in patients undergoing renal replacement therapy (programmed hemodialysis) for the diagnosis of uremic pancreatitis and/or destructive pancreatitis. Materials and methods. The study involved 52 patients admitted to the Surgical Unit of the Astrakhan "RZhD-Medicine" Hospital and City Clinical Hospital № 3. The serum β2-microglobulin and α2-macroglobulin concentrations were analyzed in patients admitted on an emergency basis with suspected uremic pancreatitis or destructive pancreatitis who were receiving renal replacement therapy (programmed hemodialysis). The control group included 50 outpatients undergoing renal replacement therapy (programmed hemodialysis). The study did not include patients with suspected pancreatitis who were not receiving renal replacement therapy. The study period was 2019–2021. Results. The serum β2-microglobulin concentration was statistically significantly higher than normal in all patients with a history of renal replacement therapy (programmed hemodialysis). The highest β2-microglobulin concentration was found in patients with uremic pancreatitis (n = 34), at 30.0 ± 2.75 mg/l, compared with 8 ± 0.51 mg/l in patients with destructive pancreatitis. The α2-macroglobulin concentration was statistically significantly lower in destructive pancreatitis (n = 18), at 615 ± 161 mg/l, than in uremic pancreatitis (980 ± 216 mg/l). In the control group of outpatients (n = 50) receiving renal replacement therapy (programmed hemodialysis), no statistically significant elevation of serum β2-microglobulin or α2-macroglobulin was found. Conclusions. A clear dependence of the β2-microglobulin and α2-macroglobulin concentrations on the severity of uremic and destructive pancreatitis was established. Statistically high β2-microglobulin concentrations were obtained in patients with uremic pancreatitis, and the α2-macroglobulin level was statistically low in destructive pancreatitis.
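
The analysis above is a between-group comparison of two serum markers reported as mean ± SD. The sketch below illustrates such a comparison with a nonparametric Mann-Whitney test, which is one reasonable choice; the abstract does not state the exact test used. The file name and column names (group, b2m, a2m) are illustrative assumptions.

```python
# Sketch: contrast serum β2-microglobulin (b2m) and α2-macroglobulin (a2m)
# between uremic and destructive pancreatitis groups.
# Column names and labels are hypothetical placeholders.
import pandas as pd
from scipy import stats

df = pd.read_csv("pancreatitis_cohort.csv")  # hypothetical data file

uremic = df[df["group"] == "uremic"]
destructive = df[df["group"] == "destructive"]

for marker in ("b2m", "a2m"):
    u, p = stats.mannwhitneyu(uremic[marker], destructive[marker])
    print(
        f"{marker}: uremic {uremic[marker].mean():.1f} ± {uremic[marker].std():.1f} "
        f"vs destructive {destructive[marker].mean():.1f} ± {destructive[marker].std():.1f}, "
        f"Mann-Whitney p = {p:.4f}"
    )
```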


2017 ◽  
Vol 37 (1) ◽  
pp. 15-25 ◽  
Author(s):  
Deborah McKenzie ◽  
Tony Xing Tan ◽  
Edward C. Fletcher ◽  
Andrea Jackson-Williams

We sought to determine whether receiving major re-selection (MRS) advising benefits undergraduate students' grade-point averages (GPAs). We used a quasi-experimental nonequivalent control group design to compare a treatment group (n = 219) of undergraduates who changed their majors after receiving MRS advising with a control group (n = 206) who changed majors without advising during the same semester as the treatment group. Findings showed that, on average, students who received MRS experienced no change in their program GPA but an increase in their semester GPA; however, the control group experienced a decrease in both program and semester GPAs. Multiple regression analysis confirmed that MRS advising had a positive effect on posttest semester GPAs (β = .33, p < .001) and program GPAs (β = .28, p < .001). Implications for student advising are discussed.
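
The regression results above are reported as standardized coefficients (β). The sketch below shows one way such standardized effects can be estimated in Python with statsmodels; the file name and column names (post_gpa, pre_gpa, advised) are illustrative assumptions, and the pretest GPA is included only as an example covariate.

```python
# Sketch: regression of posttest semester GPA on advising status with
# standardized slopes, in the spirit of the analysis above.
# Column names are hypothetical placeholders.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("advising_cohort.csv")  # hypothetical data file

# z-score the continuous variables so the slopes are standardized βs
z = df[["post_gpa", "pre_gpa"]].apply(lambda s: (s - s.mean()) / s.std())
z["advised"] = df["advised"]  # 1 = received MRS advising, 0 = did not

fit = smf.ols("post_gpa ~ advised + pre_gpa", data=z).fit()
print(fit.params, fit.pvalues, sep="\n")
```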


2020 ◽  
pp. 089719002095917
Author(s):  
Lauren Fay ◽  
Georgeanna Rechner-Neven ◽  
Drayton A. Hammond ◽  
Joshua M. DeMott ◽  
Mary Jane Sullivan

Background: The differential diagnosis for thrombocytopenia in critical illness is often extensive. This study was performed to determine the incidence of thrombocytopenia in septic patients undergoing continuous renal replacement therapy (CRRT) versus those not undergoing CRRT. Objective: The primary outcome of this study was to compare the development of thrombocytopenia, defined as a platelet count ≤ 100 × 10³/mm³, in septic patients within 5 days of time zero. Time zero was defined as the baseline platelet count upon hospital admission or CRRT initiation. Methods: An IRB-approved, retrospective cohort study was conducted evaluating thrombocytopenia development in critically ill, septic patients who were initiated on CRRT versus those who were not. Baseline and clinical characteristics were displayed using descriptive statistics. The primary outcome was evaluated overall and in subgroups of CRRT using chi-square tests. Results: One hundred sixty patients, 80 per arm, were included in the study. Thrombocytopenia development within 5 days occurred more frequently in the renal replacement therapy (RRT) group than in the control group (67.5% vs. 6.3%, p < 0.001). In the subgroup analysis of the RRT cohort, thrombocytopenia development within 5 days occurred more frequently in the continuous veno-venous hemofiltration (CVVH) group than in the accelerated veno-venous hemofiltration (AVVH) group (76% vs. 53.3%, p = 0.049). Conclusion: There is a high likelihood that septic patients initiated on CRRT will develop thrombocytopenia during their hospital stay. Patients receiving CVVH may develop thrombocytopenia more frequently than those receiving AVVH. Overall, CRRT should remain on the differential diagnosis for thrombocytopenia development in this patient population.
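
The primary outcome above is a difference in thrombocytopenia incidence evaluated with chi-square tests. The sketch below applies that test to the counts implied by the abstract (67.5% of 80 CRRT patients versus 6.3% of 80 controls, rounded to whole patients); it illustrates the kind of test used, not the authors' exact code.

```python
# Sketch: chi-square test of thrombocytopenia incidence, CRRT vs. control,
# using counts reconstructed from the percentages in the abstract.
from scipy.stats import chi2_contingency

#         thrombocytopenia, no thrombocytopenia
table = [[54, 26],   # CRRT group (≈67.5% of 80)
         [5, 75]]    # control group (≈6.3% of 80)

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.3g}")
```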


2020 ◽  
Vol 35 (Supplement_3) ◽  
Author(s):  
Małgorzata Kościelska ◽  
Joanna Matuszkiewicz-Rowińska ◽  
Krzysztof Zieniewicz ◽  
Marek Krawczyk ◽  
Dorota Giercuszkiewicz ◽  
...  

Abstract Background and Aims Orthotopic liver transplantation (OLT) is a technically complex surgical procedure associated with a major risk of hemodynamic instability and metabolic derangement, especially in patients with coexisting renal dysfunction. It remains uncertain whether intraoperative renal replacement therapy (ioRRT) during OLT could be beneficial, and in which subset of patients. Method A retrospective observational study of all adult patients undergoing intraoperative renal replacement therapy during OLT in our center from January 2010 to December 2018. IoRRT consisted of dialysis performed with the mobile Genius® single-pass batch dialysis system. Results 1319 OLTs were performed during the study period and 129 patients (9.8%) were treated with intraoperative dialysis. Seven patients with incomplete documentation were excluded from the study. The mean age of the cohort was 47.8±14.7 years, and 73 (60%) of the patients were men. The mean calculated MELD score was 33.7±12.5. Forty-one procedures (33.6%) were retransplantations and 7 patients underwent simultaneous liver and kidney transplantation. Eighteen (14.7%) recipients had fulminant liver failure. Sixty-six (54.1%) patients were admitted to the Intensive Care Unit before transplantation. The mean preoperative serum creatinine level was 2.95±1.15 mg/dl, and 95 (78%) patients were undergoing renal replacement therapy or had a creatinine level ≥2 mg/dl preoperatively. The others required emergent ioRRT because of intraoperative complications with development of significant hyperkalemia or acidosis. The mean preoperative serum potassium concentration was 4.23±0.68 mmol/l and the mean bicarbonate level was 19.8±4.5 mmol/l. Intraoperative dialysis was performed without anticoagulation. The mean surgical time was 400±90.6 minutes and the mean ioRRT duration was 308±109 minutes. Dialysis circuit clotting occurred in 9 cases (7.3%). There were no other adverse events of ioRRT. The mean arterial bicarbonate and potassium levels achieved after graft reperfusion were 18.5±3.6 and 3.8±0.95 mmol/l, respectively. In 111 cases dialysis was initiated at least 15 minutes before reperfusion; 13% of this subgroup (16 patients) experienced post-reperfusion syndrome (defined as a decrease in mean arterial pressure of more than 30% of baseline, occurring within the first 5 minutes after reperfusion of the graft and lasting at least 1 minute). Conclusion Our data suggest that intraoperative dialysis in severely ill patients with high MELD scores is safe and effective. The lower than expected occurrence of post-reperfusion syndrome needs to be confirmed in a study designed with an appropriate control group.
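
The abstract defines post-reperfusion syndrome operationally: a fall in mean arterial pressure (MAP) of more than 30% of baseline within the first 5 minutes after graft reperfusion, lasting at least 1 minute. The sketch below applies that definition to a per-minute MAP trace; the one-reading-per-minute data layout is an assumption for illustration, not the study's monitoring setup.

```python
# Sketch: flag post-reperfusion syndrome from per-minute MAP readings
# using the definition given in the abstract.
def has_post_reperfusion_syndrome(baseline_map: float,
                                  map_per_minute: list[float]) -> bool:
    """map_per_minute: MAP readings for minutes 1..5 after graft reperfusion."""
    threshold = 0.7 * baseline_map  # a drop of more than 30% of baseline
    # With one reading per minute, a single sub-threshold value already
    # represents a drop sustained for about a minute.
    return any(m < threshold for m in map_per_minute[:5])

# Example: baseline MAP 80 mmHg, MAP falls to 52 mmHg in minute 3
print(has_post_reperfusion_syndrome(80, [75, 68, 52, 60, 70]))  # True
```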


1998 ◽  
Vol 31 (4) ◽  
pp. 333-337 ◽  
Author(s):  
Pedro Paulo Chieffi ◽  
Yvoty A.S. Sens ◽  
Maria Aparecida Paschoalotti ◽  
Luiz Antonio Miorin ◽  
Hélio Gomes C. Silva ◽  
...  

The frequency of infection by Cryptosporidium parvum was determined in two groups of renal patients undergoing immunosuppression. One group consisted of 23 renal transplant recipients, and the other of 32 patients with chronic renal insufficiency periodically submitted to hemodialysis. A third group of 27 patients with systemic arterial hypertension, not immunosuppressed, was used as a control. Over a period of 18 months all the patients underwent faecal examination to detect C. parvum oocysts, with a total of 1 to 6 tests per patient. The results showed frequencies of C. parvum infection of 34.8%, 25% and 17.4% for the renal transplant group, the hemodialysis patients and the control group, respectively. Statistical analysis showed no significant differences among the three groups, even though the frequency of C. parvum infection was higher in the transplant group. However, when the number of faecal samples containing C. parvum oocysts was taken into account, a significantly higher frequency was found in the renal transplant group.
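
The per-patient group comparison above can be illustrated with a 3×2 chi-square test on infection counts implied by the reported frequencies (34.8% of 23, 25% of 32 and 17.4% of 27, rounded to whole patients). This is a sketch of that kind of test, not a reproduction of the authors' exact analysis.

```python
# Sketch: 3x2 chi-square comparison of C. parvum infection frequency
# across transplant, hemodialysis and control groups, using counts
# reconstructed from the percentages in the abstract.
from scipy.stats import chi2_contingency

#         infected, not infected
table = [[8, 15],   # renal transplant (≈34.8% of 23)
         [8, 24],   # hemodialysis (25% of 32)
         [5, 22]]   # control (≈17.4% of 27)

chi2, p, dof, _ = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")
```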


Author(s):  
L. Korol ◽  
L. Mygal ◽  
O. Burdeyna ◽  
M. Kolesnyk

The aim of the research was to study the effect of oxidative factors and of the renal replacement therapy (RRT) modality on indices of oxidative stress (OS) and the resistance of erythrocyte membranes in patients with chronic kidney disease stage V treated with dialysis (CKD VD) and anemia. Material and methods. The study involved 68 patients with CKD VD: 14 patients were treated with hemodiafiltration (HDF), 25 with hemodialysis (HD) and 29 with peritoneal dialysis (PD). The severity of anemia was assessed according to the KDIGO (2012) criteria. The control group consisted of 30 healthy people of the same age and sex. Along with standard diagnostic methods, we measured the content of malondialdehyde in serum (MDAs) and in erythrocytes (MDAe), the content of ceruloplasmin (CPs), transferrin (TRs) and SH-groups in blood serum, the oxidative stress index (OSI), catalase activity in serum (CATs), glucose-6-phosphate dehydrogenase (G-6-PDHe) and total peroxidase activity (TPAe) in erythrocytes, the peroxide resistance (PR) of red blood cells and the erythrocyte membrane permeability (EMP). Statistical analysis was performed using Microsoft Excel 7.0. Results. In the CKD VD patients, compared with the control group, the MDAs content increased 3.3-fold and MDAe 1.2-fold; the TRs content was reduced by 34%, SH-groups by 31%, TPAe by 41%, G-6-PDHe by 58% and PR by 60%; CATs activity and the OSI increased 4.6-fold; peroxide hemolysis (PH) doubled and EMP increased 1.3-fold. Analysis by RRT modality showed that patients treated with HDF typically had a 3.9-fold increase in MDAs against a background of reductions in CPs by 24%, TRs by 33%, SH-groups by 25%, TPAe by 51% and G-6-PDHe by 42%; the serum OSI increased 5.4-fold and the erythrocyte OSI 2.6-fold, PH 3.6-fold and CATs activity 3.5-fold. The HD group was characterized by the highest values of MDAe, OSI, PH and CATs, along with a more pronounced decrease in TRs, SH-groups, TPAe and G-6-PDHe activity compared with the patients on HDF. The patients treated with PD had the lowest MDAs content and the highest TPAe, a significant 1.7-fold increase in CPs, and the lowest TRs and G-6-PDHe. The PD patients also showed about half the OS activity as assessed by the OSI. Conclusion. In patients with CKD VD treated with HD, HDF or PD, anemia was associated with high OS activity and an increased degree of hemolysis. These changes depend on the RRT modality: patients receiving HDF had the lowest rates of hemolysis and the highest degree of protection of erythrocytes, whereas patients treated with HD had the highest OS.

