An observational feasibility study – does early limb ergometry affect oxygen delivery and uptake in intubated critically ill patients – a comparison of two assessment methods

2021
Vol 21 (1)
Author(s): Olive M Wilkinson, Andrew Bates, Rebecca Cusack

Abstract
Background: Early rehabilitation can reduce ventilation duration and improve functional outcomes in critically ill patients. Upper limb strength is associated with ventilator weaning. Passive muscle loading may preserve muscle fibre function, help recover peripheral muscle strength and improve longer-term functional capacity after hospital discharge. However, the physiological effects of initiating rehabilitation soon after these patients stabilise can concern clinicians. This study investigated the feasibility of measuring metabolic demand, and the safety and feasibility of early upper limb passive ergometry. A comparison of the results obtained from simultaneous application of the two measurement methods is also reported.
Methods: This was an observational feasibility study undertaken in the General Intensive Care Unit of an acute teaching hospital in the United Kingdom. Twelve haemodynamically stable, mechanically ventilated patients underwent 30 minutes of arm ergometry. Cardiovascular and respiratory parameters were monitored. A metabolic cart attached to the ventilator measured oxygen uptake (VO2). VO2 was concurrently calculated by the reverse Fick method, using cardiac output from the LiDCO™ monitor and paired mixed venous and arterial samples, and the two methods were compared. Data collection began 10 minutes before ergometry and continued to recovery; paired mixed venous and arterial samples were taken every 10 minutes. A Friedman test was used to identify changes in physiological parameters.
Results: Twelve patients were studied: 9 male; median age 55 years (range 27–82); median APACHE score 18.5 (range 7–31); median fraction of inspired oxygen 42.5% (range 28–60%). Eight patients were receiving noradrenaline (mean dose 0.07 mcg/kg/min, range 0.01–0.15). Early ergometry was well tolerated. There were no clinically significant changes in respiratory, haemodynamic or metabolic variables from pre-ergometry to the end of recovery, and no significant difference between the two methods of calculating VO2 (p = 0.70).
Conclusions: We report the feasibility of using the reverse Fick method and indirect calorimetry to measure metabolic demand during early physical rehabilitation of critically ill patients. More research is needed to ascertain the most reliable method. The minimal change in metabolic demand supports the safety and feasibility of upper limb ergometry. These results will inform the design of further research into the exercise response of critically ill patients.
Trial registration: Clinicaltrials.gov NCT04383171, registered on 06 May 2020 (retrospectively registered). http://www.clinicaltrials.gov.
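The reverse Fick calculation named above can be sketched numerically. This is a minimal illustration using the standard oxygen-content and Fick formulas; the haemoglobin, saturation, blood-gas and cardiac-output values are invented for the example and are not data from the study:

```python
def arterial_o2_content(hb_g_dl, so2_frac, po2_mmhg):
    """Blood O2 content (ml O2/dl): haemoglobin-bound plus dissolved."""
    return 1.34 * hb_g_dl * so2_frac + 0.003 * po2_mmhg

def vo2_reverse_fick(cardiac_output_l_min, cao2_ml_dl, cvo2_ml_dl):
    """VO2 (ml/min) = CO x (CaO2 - CvO2); the factor 10 converts ml/dl to ml/l."""
    return cardiac_output_l_min * (cao2_ml_dl - cvo2_ml_dl) * 10

# Illustrative values only (not from the study)
cao2 = arterial_o2_content(hb_g_dl=10.0, so2_frac=0.97, po2_mmhg=90)   # arterial
cvo2 = arterial_o2_content(hb_g_dl=10.0, so2_frac=0.70, po2_mmhg=40)   # mixed venous
vo2 = vo2_reverse_fick(cardiac_output_l_min=5.0, cao2_ml_dl=cao2, cvo2_ml_dl=cvo2)
print(round(vo2), "ml O2/min")  # ~188 ml/min for these assumed values
```

In the study this value was compared against the VO2 measured directly by the metabolic cart attached to the ventilator.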

2021
Vol 11 (1)
Author(s): Christina Scharf, Ines Schroeder, Michael Paal, Martin Winkels, Michael Irlbeck, ...

Abstract
Background: A cytokine storm is life-threatening for critically ill patients and is mainly caused by sepsis or severe trauma. In combination with supportive therapy, the cytokine adsorber Cytosorb® (CS) is increasingly used to treat cytokine storm. However, it is questionable whether its use actually benefits these patients.
Methods: Patients with an interleukin-6 (IL-6) > 10,000 pg/ml were retrospectively included between October 2014 and May 2020 and were divided into two groups (group 1: CS therapy; group 2: no CS therapy). Inclusion criteria were a regularly measured IL-6 and, for patients allocated to group 1, CS therapy for at least 90 min. A propensity score (PS) matching analysis with significant baseline differences as predictors (Simplified Acute Physiology Score (SAPS) II, extracorporeal membrane oxygenation, renal replacement therapy, IL-6, lactate and norepinephrine demand) was performed to compare the groups (adjustment tolerance: < 0.05; standardization tolerance: < 10%). The Mann–Whitney U test and Fisher's exact test were used for independent variables and the Wilcoxon test for dependent variables.
Results: In total, 143 patients were included in the initial evaluation (group 1: 38; group 2: 105). Nineteen comparable pairings could be formed (mean initial IL-6: 58,385 vs. 59,812 pg/ml; mean SAPS II: 77 vs. 75). There was a significant reduction in IL-6 in patients both with (p < 0.001) and without CS treatment (p = 0.005). However, there was no significant difference (p = 0.708) in the median relative reduction between the groups (89% vs. 80%). Furthermore, there was no significant difference in the relative change in C-reactive protein, lactate, or norepinephrine demand in either group, and in-hospital mortality was similar between groups (73.7%).
Conclusion: Our study showed no difference in IL-6 reduction, hemodynamic stabilization, or mortality in patients with Cytosorb® treatment compared to a matched patient population.
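The pairing step of a propensity score analysis like the one described above can be illustrated with a toy sketch. This shows greedy 1:1 nearest-neighbour matching on an already-estimated propensity score within a caliper; the IDs, scores and caliper width are invented for illustration and do not reflect the study's actual matching algorithm or tolerances:

```python
def caliper_match(treated, control, caliper=0.05):
    """Greedy 1:1 nearest-neighbour matching on the propensity score.
    treated/control: dicts mapping patient id -> estimated propensity score.
    Each control is used at most once; treated patients with no control
    within the caliper remain unmatched."""
    pairs, used = [], set()
    for t_id, t_ps in sorted(treated.items(), key=lambda kv: kv[1]):
        best, best_d = None, caliper
        for c_id, c_ps in control.items():
            d = abs(t_ps - c_ps)
            if c_id not in used and d <= best_d:
                best, best_d = c_id, d
        if best is not None:
            used.add(best)
            pairs.append((t_id, best))
    return pairs

# Toy scores (hypothetical, not study data)
treated = {"T1": 0.30, "T2": 0.62, "T3": 0.90}
control = {"C1": 0.28, "C2": 0.60, "C3": 0.35, "C4": 0.10}
print(caliper_match(treated, control))  # [('T1', 'C1'), ('T2', 'C2')]
```

Here "T3" stays unmatched because no control score lies within the caliper, which mirrors why only 19 of the 38 CS-treated patients could be paired.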


2020
Vol 2 (1)
pp. 12
Author(s): Gehan A. F. Atia

Context: Central venous access device (CVAD) bundles for insertion and maintenance, when implemented with compliance monitoring, have been reported to reduce the frequency of complications and bloodstream infection.
Aim: This study aimed to examine the effect of implementing a central venous catheter (CVC) care bundle on the outcomes of critically ill patients.
Methods: A quasi-experimental (pre/post-test) design was used. The study was conducted at the general and surgical intensive care units affiliated to Menoufia University and teaching hospital. Two samples were recruited: all 60 critical care nurses working in these ICUs, and a convenience sample of all critically ill patients treated via a central venous catheter who were available at the time of the study. Four tools were used to collect the data: a structured interview questionnaire, a CVC nurses' knowledge assessment questionnaire, nurses' compliance assessment checklists, and patient complication assessment records.
Results: There was a highly statistically significant difference between the pre- and post-test knowledge scores of the studied nurses regarding assisting with line insertion, removal, maintenance, care, and infection control practices. There was also a highly statistically significant difference between the pre- and post-test scores of nurses' compliance with CVC care practices: assisting in CVC insertion, blood sample withdrawal, medication and fluid administration, CVP measurement, CVC removal, and the management of central venous line complications. The study also revealed a highly statistically significant difference between the study and control group patients regarding central venous catheter complications, although signs of infection were the most frequent complication in both groups.
Conclusion: There was a statistically significant difference between nurses' knowledge of, and compliance with, the CVC care bundle before and after the intervention, and patients' outcomes improved significantly after implementation of the bundle compared with controls. The study recommended adopting the current care bundle, which should be disseminated and updated following international organizations' recommendations for implementing evidence-based practices for successful prevention of central line-associated bloodstream infection (CLABSI).


2020
Vol 4 (4)
Author(s): Yannan Sun

Objective: To investigate the effectiveness of nursing risk management in the care of critically ill patients in the respiratory unit.
Methods: Among the critically ill respiratory patients admitted to our hospital between May 2019 and April 2020, 78 patients were randomly selected and divided into an observation group and a control group of 39 patients each. The observation group received a nursing risk management model, i.e., patients' clinical symptoms were observed at any time to monitor their treatment satisfaction and the effectiveness of their care, while the control group received routine care.
Results: The heart rate, respiratory rate, and pH of patients in the observation group were more stable than those in the control group, and their respiratory status was better; these differences were statistically significant (P < 0.05). The incidences of patient-provider disputes, unplanned extubation, and other unplanned events were lower in the observation group than in the control group (P < 0.05). Treatment satisfaction and the total effective rate were also much higher in the observation group than in the control group (P < 0.05).
Conclusion: The nursing risk management model has a significant beneficial effect in the care of critically ill respiratory patients and is therefore worth adopting in their clinical nursing.


Nutrients
2019
Vol 11 (5)
pp. 987
Author(s): Ahmad Aljada, Ghada Fahad AlGwaiz, Demah AlAyadhi, Emad Masuadi, Mahmoud Zahra, ...

Purpose: This study examined the effect of permissive underfeeding compared to target feeding, and of intensive insulin therapy (IIT) compared to conventional insulin therapy (CIT), on the inflammatory mediators monocyte chemoattractant protein 1 (MCP-1), soluble intercellular adhesion molecule 1 (sICAM-1), and tissue factor (TF) in critically ill patients.
Methodology: This was a substudy of a 2 × 2 factorial design randomized controlled trial in which intensive care unit (ICU) patients were randomized into permissive underfeeding versus target feeding groups and into IIT versus CIT groups (ISRCTN96294863). In this substudy, we included 91 patients with almost equal numbers across randomization groups. Blood samples were collected at baseline and on days 3, 5, and 7 of the ICU stay. Linear mixed models were used to assess differences in MCP-1, sICAM-1, and TF across randomization groups over time.
Results: Baseline characteristics were balanced across randomization groups. Daily caloric intake was significantly higher in the target feeding than in the permissive underfeeding groups (P < 0.01), and the daily insulin dose was significantly higher in the IIT than in the CIT groups (P < 0.01). MCP-1, sICAM-1, and TF did not show any significant difference between the randomization groups, although there was a time effect for MCP-1. Baseline Sequential Organ Failure Assessment (SOFA) score and platelets had a significant effect on sICAM-1 (P < 0.01). For TF, there was a significant association with age (P < 0.01).
Conclusions: Although insulin has previously been shown to inhibit MCP-1 and sICAM-1 in critically ill patients, and TF in non-critically ill patients, our study demonstrated that IIT in critically ill patients did not affect these inflammatory mediators. Similarly, caloric intake had a negligible effect on the inflammatory mediators studied.


2020
Vol 8 (1)
Author(s): Yukari Aoyagi, Takuo Yoshida, Shigehiko Uchino, Masanori Takinami, Shoichi Uezono

Abstract
Background: The choice of intravenous infusion products for critically ill patients has been studied extensively because it can affect prognosis. However, there has been little research on drug diluents in this context. The purpose of this study was to evaluate the impact of diluent choice (saline or 5% dextrose in water [D5W]) on electrolyte abnormalities, blood glucose control, incidence of acute kidney injury (AKI), and mortality.
Methods: This before-after, two-group comparative, retrospective study enrolled adult patients who stayed for more than 48 h in a general intensive care unit from July 2015 to December 2018. We changed the default diluent for intermittent drug sets in our electronic ordering system from D5W to saline at the end of 2016.
Results: We included 844 patients: 365 in the D5W period and 479 in the saline period. Drug diluents accounted for 21.4% of the total infusion volume. The incidences of hypernatremia and hyperchloremia were significantly greater in the saline group than in the D5W group (hypernatremia 27.3% vs. 14.6%, p < 0.001; hyperchloremia 36.9% vs. 20.4%, p < 0.001). Multivariate analyses confirmed these effects (hypernatremia: adjusted odds ratio (OR) 2.43, 95% confidence interval (CI) 1.54–3.82; hyperchloremia: adjusted OR 2.09, 95% CI 1.31–3.34). There were no significant differences in the incidences of hyperglycemia, AKI, or mortality between the two groups.
Conclusions: Changing the default diluent from D5W to saline had no effect on blood glucose control and increased the incidences of hypernatremia and hyperchloremia.
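For intuition about the odds ratios reported above, a crude (unadjusted) OR with a Wald 95% CI can be computed from counts reconstructed from the reported hypernatremia percentages (131 of 479 saline-period vs. 53 of 365 D5W-period patients, rounded). Because this sketch is unadjusted, it will not match the paper's multivariate-adjusted OR:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio with Wald 95% CI from a 2x2 table:
    a/b = events/non-events in group 1, c/d = events/non-events in group 2."""
    or_ = (a / b) / (c / d)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log(OR)
    lo, hi = (math.exp(math.log(or_) + s * z * se) for s in (-1, 1))
    return or_, lo, hi

# Hypernatremia counts reconstructed from the reported percentages
# (27.3% of 479 saline patients ~= 131; 14.6% of 365 D5W patients ~= 53)
or_, lo, hi = odds_ratio_ci(a=131, b=479 - 131, c=53, d=365 - 53)
print(f"crude OR {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

The crude estimate comes out close to, but not identical with, the adjusted OR of 2.43 (1.54–3.82), as expected when the adjustment covariates are roughly balanced.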


2016
Vol 50 (5)
pp. 823-830
Author(s): Patrícia de Oliveira Salgado, Ludmila Christiane Rosa da Silva, Priscila Marinho Aleixo Silva, Tânia Couto Machado Chianca

Abstract
OBJECTIVE: To evaluate the effects of physical methods of reducing body temperature (ice pack and warm compress) in critically ill patients with fever.
METHOD: A randomized clinical trial involving 102 adult patients with tympanic temperature ≥ 38.3°C of infectious origin, randomized into three groups: Intervention I, ice pack combined with an antipyretic; Intervention II, warm compress combined with an antipyretic; and Control, antipyretic alone. Tympanic temperature was measured at 15-minute intervals for 3 hours. The effect of the interventions was evaluated using the Mann-Whitney test and survival analysis, and effect sizes were calculated.
RESULTS: Patients in intervention groups I and II showed a greater reduction in body temperature. Patients receiving Intervention I reached a tympanic temperature below 38.3°C at 45 minutes of monitoring, the control group at 60 minutes, and those receiving Intervention II at 75 minutes.
CONCLUSION: No statistically significant difference was found between the interventions, but patients in intervention group I showed a greater reduction in tympanic temperature than the other groups.
Brazilian Registry of Clinical Trials: RBR-2k3kbq


2016
Vol 2016
pp. 1-8
Author(s): Pattraporn Tajarernmuang, Arintaya Phrommintikul, Atikun Limsukon, Chaicharn Pothirat, Kaweesak Chittawatanarat

Background: An increase in the mean platelet volume (MPV) has been proposed as a novel prognostic indicator in critically ill patients.
Objective: We conducted a systematic review and meta-analysis to determine whether there is an association between MPV and mortality in critically ill patients.
Methods: We performed an electronic search of Medline, Scopus, and Embase up to November 2015.
Results: Eleven observational studies, involving 3724 patients, were included. Initial MPV values did not differ between nonsurvivors and survivors, with a mean difference of 0.17 (95% confidence interval (CI): −0.04, 0.38; p = 0.112). However, after small studies were excluded in a sensitivity analysis, the pooled mean difference in MPV was 0.32 (95% CI: 0.04, 0.60; p = 0.03). In addition, MPV was significantly higher in nonsurvivor groups after the third day of admission. On subgroup analysis, although neither patient type (sepsis or mixed ICU) nor study type (prospective or retrospective) showed a significant difference between groups, the difference in MPV was significant in units with mortality of up to 30%.
Conclusions: Initial MPV values might not be usable as a prognostic marker of mortality in critically ill patients; MPV values after the third day, and values from units with lower mortality rates, might be useful. However, heterogeneity between studies is high.
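The pooled mean differences quoted above come from inverse-variance weighting of the per-study estimates. A minimal fixed-effect sketch of that pooling step, using invented toy numbers rather than the eleven studies in the review (the actual analysis may have used a random-effects model given the high heterogeneity):

```python
import math

def pool_fixed_effect(estimates, z=1.96):
    """Inverse-variance fixed-effect pooling of per-study mean differences.
    estimates: list of (mean_difference, standard_error) tuples."""
    weights = [1 / se**2 for _, se in estimates]  # weight = 1 / variance
    pooled = sum(w * d for (d, _), w in zip(estimates, weights)) / sum(weights)
    se_pooled = math.sqrt(1 / sum(weights))
    return pooled, pooled - z * se_pooled, pooled + z * se_pooled

# Toy per-study MPV differences in fl (hypothetical, not the review's data)
studies = [(0.30, 0.15), (0.10, 0.20), (0.45, 0.25)]
md, lo, hi = pool_fixed_effect(studies)
print(f"pooled MD {md:.2f} fl (95% CI {lo:.2f} to {hi:.2f})")
```

Precise studies (small standard errors) dominate the pooled estimate, which is why excluding small studies in the sensitivity analysis can shift both the point estimate and its significance.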


2020
Vol 9 (4)
pp. 963
Author(s): Mirjam Bachler, Tobias Hell, Johannes Bösch, Benedikt Treml, Bettina Schenk, ...

The current study aimed to evaluate whether prophylactic anticoagulation using argatroban, or an increased dose of unfractionated heparin (UFH), is effective in achieving the targeted activated partial thromboplastin time (aPTT) of more than 45 s in critically ill heparin-resistant (HR) patients. Patients were randomized either to continue receiving an increased dose of UFH or to be treated with argatroban. The endpoints were defined as achieving the aPTT target of more than 45 s at 7 h and at 24 h. This clinical trial was registered on clinicaltrials.gov (NCT01734252) and on EudraCT (2012-000487-23). A total of 42 patients were included: 20 in the heparin group and 22 in the argatroban group. Of the patients with continued heparin treatment, 55% achieved the target aPTT at 7 h, while only 40% of this group maintained the target aPTT at 24 h. In the argatroban group, 59% reached the target aPTT at 7 h and 86% maintained it at 24 h. Treatment success at 7 h did not differ between the groups (p = 0.1000), whereas at 24 h argatroban showed significantly greater efficacy than heparin (p = 0.0021), and it was also better at maintaining adequate anticoagulation over the further course of the study. There was no significant difference in the occurrence of bleeding or thromboembolic complications between the treatment groups. In heparin-resistant critically ill patients, argatroban was more effective than an increased dose of heparin in achieving adequate anticoagulation at 24 h and in maintaining the targeted aPTT throughout the treatment phase.
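A between-group comparison of success proportions like the one above is typically done with Fisher's exact test. As a rough illustration, the sketch below implements a two-sided Fisher exact test from scratch and applies it to success counts reconstructed from the reported percentages (11/20 vs. 13/22 at 7 h; 8/20 vs. 19/22 at 24 h); these counts are assumptions from rounding, and the abstract does not state which test the authors used, so the p-values are illustrative rather than a reproduction of the paper's analysis:

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher exact p-value for the 2x2 table [[a, b], [c, d]]:
    sum the hypergeometric probabilities of all tables with the same
    margins that are no more likely than the observed one."""
    r1, r2, c1, n = a + b, c + d, a + c, a + b + c + d
    def prob(k):  # probability that cell (1,1) equals k
        return comb(r1, k) * comb(r2, c1 - k) / comb(n, c1)
    p_obs = prob(a)
    return sum(prob(k)
               for k in range(max(0, c1 - r2), min(r1, c1) + 1)
               if prob(k) <= p_obs * (1 + 1e-9))

# Reconstructed success/failure counts (heparin vs. argatroban)
p_7h = fisher_exact_two_sided(11, 9, 13, 9)    # 55% vs 59%: similar
p_24h = fisher_exact_two_sided(8, 12, 19, 3)   # 40% vs 86%: clearly different
print(f"p(7 h) = {p_7h:.3f}, p(24 h) = {p_24h:.4f}")
```

With these assumed counts, the 7 h comparison is far from significance while the 24 h comparison is clearly significant, matching the qualitative pattern the abstract reports.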


1994
Vol 22 (1)
pp. A191
Author(s): William R. Auger, David B. Hoyt, F. Wayne Johnson, Diane Lewis, Joan Garcia, ...
