Blood purification with a cytokine adsorber for the elimination of myoglobin in critically ill patients with severe rhabdomyolysis

Critical Care ◽  
2021 ◽  
Vol 25 (1) ◽  
Author(s):  
Christina Scharf ◽  
Uwe Liebchen ◽  
Michael Paal ◽  
Michael Irlbeck ◽  
Michael Zoller ◽  
...  

Abstract Background Rhabdomyolysis occurs frequently in critically ill patients and carries a high risk of acute kidney injury (AKI) and potentially permanent kidney damage due to increased myoglobin levels. Extracorporeal elimination of myoglobin might be an approach to prevent AKI, but its molecular weight of 17 kDa complicates elimination with conventional dialysis membranes. The question of interest is whether myoglobin can be successfully eliminated with the cytokine adsorber Cytosorb® (CS) integrated into a high-flux dialysis system. Methods Patients treated between 10/2014 and 05/2020 were included in the study population if they had anuric renal failure requiring renal replacement therapy, if CS therapy lasted longer than 90 min and if the myoglobin level was > 5,000 ng/ml before treatment. The measurement times of the laboratory values were: d-1 = 24–36 h before CS, d0 = shortly before starting CS and d1 = 12–24 h after starting CS treatment. Statistical analyses were performed with Spearman's correlation coefficient, the Wilcoxon test for paired samples and linear regression analysis. Results Forty-three patients were included in the evaluation (median age: 56 years, 77% male patients, 32.6% ECMO therapy, median SAPS II: 80 points and in-hospital mortality: 67%). There was a significant positive correlation between creatine kinase (CK) and myoglobin at all measurement points. Furthermore, there was a significant reduction of myoglobin (p = 0.03, 95% confidence interval (CI): −9030 to −908 ng/ml) during CS treatment, with a median relative reduction of 29%. A higher median reduction of 38% was seen in patients without ongoing rhabdomyolysis (CK decreased during CS treatment, n = 21). In contrast, myoglobin levels did not change relevantly in patients with increasing CK and therefore ongoing rhabdomyolysis (n = 22, median relative reduction 4%). Moreover, there was no significant difference in myoglobin elimination between patients with and without ECMO therapy. Conclusion Blood purification with Cytosorb® during high-flux dialysis led to a significant reduction of myoglobin in patients with severe rhabdomyolysis. The effect might be obscured by sustained rhabdomyolysis, as seen in patients with rising CK during treatment. Prospective clinical trials would be useful to investigate its benefit in avoiding permanent kidney damage.
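A minimal sketch in Python of the statistics named in this abstract: Spearman's correlation between CK and myoglobin, the Wilcoxon test for paired samples applied to pre- and post-treatment myoglobin, and the per-patient relative reduction. All values and variable names are illustrative assumptions, not study data.

```python
# Illustrative sketch of the paired analysis; the numbers are made up, not study data.
import numpy as np
from scipy import stats

myoglobin_d0 = np.array([12500.0, 48000.0, 7800.0, 23000.0, 9100.0])   # ng/ml, shortly before CS
myoglobin_d1 = np.array([ 9800.0, 30500.0, 7600.0, 15200.0, 8700.0])   # ng/ml, 12-24 h after CS start
ck_d0        = np.array([ 5400.0, 21000.0, 3100.0, 11000.0, 4200.0])   # creatine kinase at d0

# Spearman correlation between CK and myoglobin at one measurement point
rho, p_rho = stats.spearmanr(ck_d0, myoglobin_d0)

# Wilcoxon signed-rank test for the paired change during CS treatment
w_stat, p_wilcoxon = stats.wilcoxon(myoglobin_d0, myoglobin_d1)

# Per-patient relative reduction in percent, summarized by the median
relative_reduction = 100 * (myoglobin_d0 - myoglobin_d1) / myoglobin_d0
print(f"Spearman rho={rho:.2f} (p={p_rho:.3f}), Wilcoxon p={p_wilcoxon:.3f}, "
      f"median relative reduction={np.median(relative_reduction):.1f}%")
```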

2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Christina Scharf ◽  
Ines Schroeder ◽  
Michael Paal ◽  
Martin Winkels ◽  
Michael Irlbeck ◽  
...  

Abstract Background A cytokine storm is life-threatening for critically ill patients and is mainly caused by sepsis or severe trauma. In combination with supportive therapy, the cytokine adsorber Cytosorb® (CS) is increasingly used for the treatment of cytokine storm. However, it is questionable whether its use is actually beneficial in these patients. Methods Patients with an interleukin-6 (IL-6) > 10,000 pg/ml were retrospectively included between October 2014 and May 2020 and were divided into two groups (group 1: CS therapy; group 2: no CS therapy). Inclusion criteria were a regularly measured IL-6 and, for patients allocated to group 1, CS therapy for at least 90 min. A propensity score (PS) matching analysis with significant baseline differences as predictors (Simplified Acute Physiology Score (SAPS) II, extracorporeal membrane oxygenation, renal replacement therapy, IL-6, lactate and norepinephrine demand) was performed to compare both groups (adjustment tolerance: < 0.05; standardization tolerance: < 10%). The Mann-Whitney U test and Fisher's exact test were used for independent samples and the Wilcoxon test was used for paired samples. Results In total, 143 patients were included in the initial evaluation (group 1: 38; group 2: 105). Nineteen comparable pairings could be formed (mean initial IL-6: 58,385 vs. 59,812 pg/ml; mean SAPS II: 77 vs. 75). There was a significant reduction in IL-6 in patients with (p < 0.001) and without CS treatment (p = 0.005). However, there was no significant difference (p = 0.708) in the median relative reduction between the two groups (89% vs. 80%). Furthermore, there was no significant difference in the relative change in C-reactive protein, lactate, or norepinephrine demand in either group, and the in-hospital mortality was similar between groups (73.7%). Conclusion Our study showed no difference in IL-6 reduction, hemodynamic stabilization, or mortality in patients with Cytosorb® treatment compared to a matched patient population.
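The propensity-score matching outlined in this abstract might look roughly like the sketch below: a logistic model estimates each patient's probability of receiving CS therapy from the baseline covariates named above, and every treated patient is paired with the nearest untreated patient within a tolerance of 0.05 on the propensity score. The DataFrame layout, column names, and matching details are assumptions for illustration, not the authors' implementation.

```python
# Illustrative 1:1 propensity-score matching; column names are assumed, not the study's.
import pandas as pd
from sklearn.linear_model import LogisticRegression

covariates = ["saps2", "ecmo", "rrt", "il6_baseline", "lactate", "norepinephrine"]

def match_pairs(df: pd.DataFrame, caliper: float = 0.05) -> pd.DataFrame:
    # Estimate the propensity score for receiving CytoSorb therapy (column "cs" coded 0/1)
    model = LogisticRegression(max_iter=1000)
    model.fit(df[covariates], df["cs"])
    df = df.assign(ps=model.predict_proba(df[covariates])[:, 1])

    treated = df[df["cs"] == 1]
    controls = df[df["cs"] == 0].copy()
    pairs = []
    for idx, row in treated.iterrows():
        # Nearest available control within the caliper, matched without replacement
        distance = (controls["ps"] - row["ps"]).abs()
        if not distance.empty and distance.min() <= caliper:
            j = distance.idxmin()
            pairs.append((idx, j))
            controls = controls.drop(index=j)
    return pd.DataFrame(pairs, columns=["treated_idx", "control_idx"])
```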


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Christina Scharf ◽  
Uwe Liebchen ◽  
Michael Paal ◽  
Andrea Becker-Pennrich ◽  
Michael Irlbeck ◽  
...  

Abstract There are different methods of artificial liver support for patients with acute liver dysfunction (ALD). However, CytoSorb (CS) might be a newly approved option for these patients. The question of interest is whether the elimination performance of CS is comparable to that of advanced organ support (ADVOS). Patients treated with CS (integrated into a high-flux dialysis system) or ADVOS who had a total bilirubin > 10 mg/dl were included. Laboratory parameters were evaluated before starting therapy (d0) and 12–24 h thereafter (d1). The Wilcoxon test for paired samples was used for statistical analysis. Thirty-nine patients (33 CS, 6 ADVOS) were included. The median bilirubin at d0 was 16.9 and 17.7 mg/dl and at d1 was 13.2 and 15.9 mg/dl in the CS and ADVOS groups, respectively. There was a significant bilirubin reduction both in the CS group (p < 0.001, median relative reduction: 22.5%) and in the ADVOS group (p = 0.028, median relative reduction: 22.8%). There was no significant difference in the relative bilirubin reduction between CS and ADVOS therapies. The use of CytoSorb and ADVOS in patients with ALD led to a significant and comparable decrease in total bilirubin. The ease of use of CS might be an advantage compared with other procedures.
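The abstract names the Wilcoxon test for paired samples for the within-group bilirubin change; the between-group comparison of relative reduction is not specified, so the sketch below uses a Mann-Whitney U test as one plausible choice. All values are illustrative assumptions, not study data.

```python
# Illustrative comparison of relative bilirubin reduction between two independent groups.
import numpy as np
from scipy import stats

# Relative reduction per patient in percent, computed as (d0 - d1) / d0 * 100
reduction_cs    = np.array([22.5, 18.0, 30.2, 12.4, 25.1, 28.7])
reduction_advos = np.array([22.8, 19.5, 26.0, 15.3, 24.2, 21.1])

u_stat, p = stats.mannwhitneyu(reduction_cs, reduction_advos, alternative="two-sided")
print(f"Mann-Whitney U p={p:.3f} for relative bilirubin reduction (CS vs. ADVOS)")
```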


Author(s):  
Donaliazarti Donaliazarti ◽  
Rismawati Yaswir ◽  
Hanifah Maani ◽  
Efrida Efrida

Metabolic acidosis is prevalent among critically ill patients, and the most common cause of metabolic acidosis in the ICU is lactic acidosis. However, not all ICUs can provide lactate measurement. The traditional method based on the Henderson-Hasselbalch equation (supplemented with base excess (BE) and anion gap (AG)) and the alternative method consisting of the Stewart approach and its modifications (BDEgap and SIG) provide the acid-base balance parameters commonly used by clinicians to determine metabolic acidosis in critically ill patients. The objective of this study was to determine the association between acid-base parameters (BE, AGobserved, AGcalculated, SIG, BDEgap) and lactate level in critically ill patients with metabolic acidosis. This was an analytical study with a cross-sectional design. Eighty-four critically ill patients hospitalized in the ICU of Dr. M. Djamil Padang Hospital were recruited from January to September 2016. Blood gas analysis and lactate measurement were performed by potentiometric and amperometric methods, while electrolytes and albumin were measured by ion-selective electrode (ISE) and colorimetric (BCG) methods. Linear regression analysis was used to evaluate the association between acid-base parameters and lactate level, with significance defined as a p-value less than 0.05. Forty-five patients (54%) were female and thirty-nine (46%) were male, with ages ranging from 18 to 81 years. Postoperative care was the most common reason for ICU admission (88%). Linear regression analysis showed that the p-values for BE, AGobserved, AGcalculated, SIG and BDEgap were 0.119, 0.967, 0.001, 0.001 and 0.689, respectively. The acid-base balance parameters most strongly associated with lactate level in critically ill patients with metabolic acidosis were AGcalculated and SIG.
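A minimal sketch of the per-parameter regressions described above: lactate is regressed on each acid-base parameter in a simple linear model and the slope p-value is collected. The DataFrame layout and column names are assumptions for illustration, not the study dataset.

```python
# Illustrative simple linear regressions of lactate on each acid-base parameter.
import pandas as pd
import statsmodels.api as sm

PARAMETERS = ["be", "ag_observed", "ag_calculated", "sig", "bde_gap"]

def lactate_regression_pvalues(df: pd.DataFrame) -> pd.Series:
    p_values = {}
    for param in PARAMETERS:
        X = sm.add_constant(df[[param]])        # intercept + single predictor
        fit = sm.OLS(df["lactate"], X).fit()    # ordinary least squares
        p_values[param] = fit.pvalues[param]    # p-value for the slope coefficient
    return pd.Series(p_values, name="p_value")
```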


2020 ◽  
Vol 2 (1) ◽  
pp. 12
Author(s):  
Gehan A. F. Atia

Context: Central venous access device (CVAD) bundles for insertion and maintenance demonstrate a reduction in the frequency of complications and bloodstream infections when implemented with compliance monitoring, consistent with the reported success of CVAD bundles. Aim: This study aimed to examine the effect of central venous catheter (CVC) care bundle implementation on outcomes of critically ill patients. Methods: A quasi-experimental (pre/post-test) design was used to achieve the aim of this study. The study was conducted at the general and surgical intensive care units affiliated to Menoufia University and teaching hospital. Two study samples were recruited: all 60 critical care nurses working at the above-mentioned ICUs, and a convenience sample of all critically ill patients available at the time of the study who were treated via a central venous catheter. Four study tools were used to collect the data: a structured interview questionnaire, a CVC nurses' knowledge assessment questionnaire, nurses' compliance assessment checklists, and patient complication assessment records. Results: The study results showed a highly statistically significant difference between pre- and post-test knowledge scores of the studied nurses regarding assisting with line insertion, removal, maintenance, care, and infection control practices. There was also a highly statistically significant difference between pre- and post-test scores of nurses' compliance with central venous catheter care practices, including assisting in CVC insertion, blood sample withdrawal, medication and fluid administration, CVP measurements, CVC removal, and the management of central venous line complications. The study also revealed a highly statistically significant difference between the study and control group patients regarding central venous catheter complications. However, signs of infection were the most frequent complication in both groups. Conclusion: The study concluded that there was a statistically significant improvement in nurses' knowledge of and compliance with the CVC care bundle after implementation. Patients' outcomes also improved significantly after implementation of the CVC care bundle compared to the controls. The study recommended adoption of the current care bundle, which should be disseminated and updated following international organizations' recommendations for implementing evidence-based practices for successful prevention of central line-associated bloodstream infections (CLABSI).


2020 ◽  
Vol 4 (4) ◽  
Author(s):  
Yannan Sun

Objective: To investigate the effectiveness of nursing risk management in the care of critically ill patients in the respiratory unit. Methods: Among the critically ill respiratory patients admitted to our hospital between May 2019 and April 2020, 78 patients were randomly selected and divided into an observation group and a control group, each consisting of 39 patients. In the observation group, a nursing risk management model was implemented, i.e., patients' clinical symptoms were observed continuously and their treatment satisfaction and the effectiveness of their care were monitored; routine care was implemented for the control group. Results: The heart rate, respiratory rate, and pH of patients in the observation group were more stable than those in the control group, and their respiratory status was better; these differences were statistically significant (P<0.05). The incidence of patient-provider disputes, unplanned extubation, and unplanned events was lower in the observation group than in the control group, and the difference was statistically significant (P<0.05). Treatment satisfaction and the total effective rate of patients in the observation group were also much higher than those of the control group, and the difference was statistically significant (P<0.05). Conclusion: The nursing risk management model has a significant therapeutic effect in the care of critically ill respiratory patients. Therefore, it is worth promoting in the clinical nursing of critically ill respiratory patients.


2021 ◽  
Vol 10 (2) ◽  
Author(s):  
Diana K. Sarkisian ◽  
Natalia V. Chebotareva ◽  
Valerie McDonnell ◽  
Armen V. Oganesyan ◽  
Tatyana N. Krasnova ◽  
...  

Background — The incidence of acute kidney injury (AKI) reaches 29% in the intensive care unit (ICU). Our study aimed to determine the prevalence, features, and main risk factors of AKI in critically ill patients with coronavirus disease 2019 (COVID-19). Material and Methods — The study included 37 patients with COVID-19. We analyzed complete blood count results, the biochemical profile panel, coagulation tests, and urine samples. Finally, we estimated markers of kidney damage and mortality. Results — All patients in the ICU had proteinuria, and 80.5% of patients had hematuria. AKI was observed in 45.9% of patients. Independent risk factors were age over 60 years, an increased C-reactive protein (CRP) level, and a decreased platelet count. Conclusion — Kidney damage was observed in most critically ill patients with COVID-19. The independent risk factors for AKI in critically ill patients were elderly age and a cytokine response with a high CRP level.
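The abstract reports independent risk factors for AKI without naming the model; a multivariable logistic regression, as sketched below, is one common way to obtain such adjusted estimates. The column names and coding are assumptions for illustration, not the study's analysis.

```python
# Illustrative multivariable logistic regression for AKI (column "aki" coded 0/1).
import numpy as np
import pandas as pd
import statsmodels.api as sm

def aki_odds_ratios(df: pd.DataFrame) -> pd.DataFrame:
    predictors = ["age_over_60", "crp", "platelets"]
    X = sm.add_constant(df[predictors])
    fit = sm.Logit(df["aki"], X).fit(disp=0)
    table = pd.DataFrame({
        "odds_ratio": np.exp(fit.params),
        "ci_low": np.exp(fit.conf_int()[0]),   # lower bound of the 95% CI
        "ci_high": np.exp(fit.conf_int()[1]),  # upper bound of the 95% CI
        "p_value": fit.pvalues,
    })
    return table.drop(index="const")
```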


2021 ◽  
Vol 20 (4) ◽  
pp. 81-94
Author(s):  
Artem V. Marukhov ◽  
Elena V. Murzina ◽  
Mikhail V. Zakharov ◽  
Genrikh A. Sofronov ◽  
Lyudmila V. Buryakova ◽  
...  

Relevance. Meropenem is a broad-spectrum carbapenem antibiotic widely used to treat patients with sepsis/septic shock. Critically ill patients are usually supported with some form of extracorporeal blood purification. However, data on the effect of various extracorporeal support techniques on the pharmacokinetics and pharmacodynamics of meropenem are insufficient or contradictory. Aim: To evaluate the effectiveness of meropenem dosage regimens in the treatment of septic patients during extracorporeal blood purification. Materials and methods. Plasma concentrations of meropenem were monitored in three critically ill patients with sepsis or septic shock. Patients were treated using various extracorporeal support techniques. Meropenem was used as empirical antibacterial mono- or combination therapy (1 g every 8 or 12 hours). Meropenem concentrations in plasma were determined by a validated assay method on an Acquity ultra-performance liquid chromatography (UPLC) H-Class system. Results. The meropenem plasma concentration in critically ill patients was shown to change significantly. It was found that the standard meropenem dosing regimens in patients with sepsis/septic shock during continuous hemodiafiltration do not ensure achievement of the PK/PD target of 100% T>MIC for susceptible strains (MIC ≤ 2 mg/L) or for pathogens of intermediate resistance (2 < MIC ≤ 8 mg/L). Continuous hemofiltration and selective adsorption of lipopolysaccharide have a less pronounced effect on the clearance of meropenem. Conclusion. To increase the effectiveness of antibacterial therapy, research is needed to develop protocols for dosing antibacterial drugs in the treatment of sepsis during extracorporeal blood purification.
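The PK/PD target discussed above, 100% T>MIC, can be illustrated with a small worked example: under first-order elimination, the time the concentration stays above the MIC follows from the peak concentration and the elimination half-life. The peak and half-life values below are illustrative assumptions, not measured patient data.

```python
# Illustrative %T>MIC calculation under one-compartment, first-order elimination.
import numpy as np

def percent_t_above_mic(c_peak: float, half_life_h: float, mic: float,
                        dosing_interval_h: float) -> float:
    """Percentage of the dosing interval with concentration above the MIC."""
    if c_peak <= mic:
        return 0.0
    k_el = np.log(2) / half_life_h                 # elimination rate constant (1/h)
    t_above = np.log(c_peak / mic) / k_el          # time until C(t) falls to the MIC
    return 100 * min(t_above, dosing_interval_h) / dosing_interval_h

# Example: assumed peak 40 mg/L, half-life 1.5 h under hemodiafiltration,
# 8-hourly dosing, intermediate-resistance pathogen (MIC 8 mg/L)
print(f"{percent_t_above_mic(40, 1.5, 8, 8):.0f}% of the dosing interval above the MIC")
```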


Nutrients ◽  
2019 ◽  
Vol 11 (5) ◽  
pp. 987
Author(s):  
Ahmad Aljada ◽  
Ghada Fahad AlGwaiz ◽  
Demah AlAyadhi ◽  
Emad Masuadi ◽  
Mahmoud Zahra ◽  
...  

Purpose: This study examined the effect of permissive underfeeding compared to target feeding and intensive insulin therapy (IIT) compared to conventional insulin therapy (CIT) on the inflammatory mediators monocyte chemoattractant protein 1 (MCP-1), soluble intercellular adhesion molecule 1 (sICAM-1), and tissue factor (TF) in critically ill patients. Methodology: This was a substudy of a 2 × 2 factorial design randomized controlled trial in which intensive care unit (ICU) patients were randomized into permissive underfeeding compared to target feeding groups and into IIT compared to CIT groups (ISRCTN96294863). In this substudy, we included 91 patients with almost equal numbers across randomization groups. Blood samples were collected at baseline and at days 3, 5, and 7 of the ICU stay. Linear mixed models were used to assess the differences in MCP-1, sICAM-1, and TF across randomization groups over time. Results: Baseline characteristics were balanced across randomization groups. Daily caloric intake was significantly higher in the target feeding than in the permissive underfeeding groups (P-value < 0.01), and the daily insulin dose was significantly higher in the IIT than in the CIT groups (P-value < 0.01). MCP-1, sICAM-1, and TF did not show any significant difference between the randomization groups, while there was a time effect for MCP-1. Baseline sequential organ failure assessment (SOFA) score and platelets had a significant effect on sICAM-1 (P-value < 0.01). For TF, there was a significant association with age (P-value < 0.01). Conclusions: Although it has been previously demonstrated that insulin inhibits MCP-1 and sICAM-1 in critically ill patients and TF in non-critically ill patients, our study demonstrated that IIT in critically ill patients did not affect these inflammatory mediators. Similarly, caloric intake had a negligible effect on the inflammatory mediators studied.
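A minimal sketch of the linear mixed model described above, with fixed effects for the randomization groups and study day and a random intercept per patient; the long-format DataFrame and column names are assumptions for illustration, not the trial dataset.

```python
# Illustrative linear mixed model for MCP-1 over time (one row per patient per study day).
import pandas as pd
import statsmodels.formula.api as smf

def fit_mcp1_model(df: pd.DataFrame):
    model = smf.mixedlm("mcp1 ~ feeding_group + insulin_group + day",
                        data=df, groups=df["patient_id"])
    return model.fit()
```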


2018 ◽  
Vol 55 (3) ◽  
pp. 283-289 ◽  
Author(s):  
Isabela Bernasconi JOSÉ ◽  
Vânia Aparecida LEANDRO-MERHI ◽  
José Luis Braga de AQUINO

ABSTRACT BACKGROUND: Enteral nutritional therapy (ENT) is the best route for the nutrition of critically ill patients and has a positive impact on their clinical treatment. OBJECTIVE: To investigate the energy and protein supply of ENT in critically ill in-patients of an Intensive Care Unit (ICU). METHODS: A prospective longitudinal study was conducted with 82 critically ill ICU in-patients receiving ENT. Anthropometric variables, laboratory tests (albumin, CRP, CRP/albumin ratio), NUTRIC score and Nutritional Risk Screening (NRS-2002), energy and protein goals, and the inadequacies and complications of ENT were assessed. Statistical analysis was performed using the chi-square or Fisher's exact tests and the Wilcoxon test. RESULTS: A total of 48.78% of patients were at high nutritional risk based on the NUTRIC score. Based on the CRP/albumin ratio, 85.37% of patients presented a high risk of complications. There was a statistically significant difference (P<0.0001) for all comparisons made between the target, the prescription and the ENT infusion, and 72% of the quantities prescribed for both calories and proteins was infused. The difference between the prescription and the infusion was 14.63% (±10.81) for calories and 14.21% (±10.5) for proteins, a statistically significant difference (P<0.0001). Regarding the relationship between prescription and infusion of calories and proteins, the only significant association was in patients at high risk by the CRP/albumin ratio, of whom almost 94% received less than 80% of the prescribed energy and protein volume (P=0.0111). CONCLUSION: The administration of ENT in severely ill patients does not meet their actual energy and protein needs. The high occurrence of infusion inadequacies, compared to the prescription and to the goals set, can generate a negative nutritional balance.
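The association reported above between a high-risk CRP/albumin ratio and receiving less than 80% of the prescription can be illustrated with a Fisher's exact test on a 2x2 table; the counts below are rough illustrative assumptions, not the study data.

```python
# Illustrative Fisher's exact test; the cell counts are assumed, not taken from the study.
from scipy import stats

#            <80% of prescription infused   >=80% infused
table = [[66,                            4],   # high CRP/albumin risk
         [ 7,                            5]]   # low CRP/albumin risk

odds_ratio, p_value = stats.fisher_exact(table)
print(f"Fisher's exact test: OR={odds_ratio:.2f}, p={p_value:.4f}")
```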


2020 ◽  
Vol 8 (1) ◽  
Author(s):  
Yukari Aoyagi ◽  
Takuo Yoshida ◽  
Shigehiko Uchino ◽  
Masanori Takinami ◽  
Shoichi Uezono

Abstract Background The choice of intravenous infusion products for critically ill patients has been studied extensively because it can affect prognosis. However, there has been little research on drug diluents in this context. The purpose of this study was to evaluate the impact of diluent choice (saline or 5% dextrose in water [D5W]) on electrolyte abnormalities, blood glucose control, incidence of acute kidney injury (AKI), and mortality. Methods This before-after, two-group comparative, retrospective study enrolled adult patients who stayed for more than 48 h in a general intensive care unit from July 2015 to December 2018. We changed the default diluent for intermittent drug sets in our electronic ordering system from D5W to saline at the end of 2016. Results We included 844 patients: 365 in the D5W period and 479 in the saline period. Drug diluents accounted for 21.4% of the total infusion volume. The incidences of hypernatremia and hyperchloremia were significantly greater in the saline group than in the D5W group (hypernatremia 27.3% vs. 14.6%, p < 0.001; hyperchloremia 36.9% vs. 20.4%, p < 0.001). Multivariate analyses confirmed these effects (hypernatremia adjusted odds ratio (OR), 2.43; 95% confidence interval (CI), 1.54–3.82; hyperchloremia adjusted OR, 2.09; 95% CI, 1.31–3.34). There was no significant difference in the incidences of hyperglycemia, AKI, or mortality between the two groups. Conclusions Changing the default diluent from D5W to saline had no effect on blood glucose control and increased the incidences of hypernatremia and hyperchloremia.
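The before-after comparison above can be illustrated by back-calculating approximate event counts from the reported incidences and computing a crude odds ratio with a Wald confidence interval; the adjusted odds ratios in the abstract come from multivariate models, so the sketch below is illustrative rather than a reproduction of that analysis.

```python
# Illustrative crude odds ratio for hypernatremia, saline vs. D5W period.
# Event counts are back-calculated from the reported incidences (assumptions).
import numpy as np
from scipy import stats

saline_events, saline_n = 131, 479   # ~27.3% hypernatremia in the saline period
d5w_events, d5w_n       = 53, 365    # ~14.6% hypernatremia in the D5W period

table = [[saline_events, saline_n - saline_events],
         [d5w_events,    d5w_n - d5w_events]]
chi2, p, dof, expected = stats.chi2_contingency(table)

# Crude odds ratio with a Wald 95% confidence interval on the log scale
a, b = saline_events, saline_n - saline_events
c, d = d5w_events, d5w_n - d5w_events
crude_or = (a * d) / (b * c)
se_log_or = np.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
ci = np.exp(np.log(crude_or) + np.array([-1.96, 1.96]) * se_log_or)

print(f"crude OR={crude_or:.2f} (95% CI {ci[0]:.2f}-{ci[1]:.2f}), chi-square p={p:.4f}")
```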

