P-OGC07 The Role of Carbohydrate Loading on Lactate and Glucose Levels in Upper GI Cancer Surgery

2021 ◽  
Vol 108 (Supplement_9) ◽  
Author(s):  
Charlotte Turnbull ◽  
George Hallward ◽  
Andrew Davies ◽  
Mark Kelly ◽  
James Gossage ◽  
...  

Abstract Background Surgical stress is a significant driver of metabolic dysregulation in the perioperative setting. Its impact on insulin resistance is regarded as one of its most detrimental effects, contributing to post-operative complications and poor outcomes. Clinical markers of this response include glucose and lactate levels, with hyperglycaemia and hyperlactataemia being the expected responses. One way of minimising the impact of surgical stress is pre-operative carbohydrate loading, which in theory provides more substrate for metabolism. Our aim was to investigate whether carbohydrate loading had any impact on lactate and glucose levels in patients undergoing upper gastrointestinal cancer resections. Methods A retrospective observational feasibility study was performed on 42 patients who had undergone either an oesophagectomy or a gastrectomy. Patients were divided into two groups according to whether they received pre-operative carbohydrate loading. Intra-operative and post-operative lactate and glucose levels were collected. Mean differences between the two groups at 4 hours intra-operatively, 2 hours post-operatively and 12 hours post-operatively were compared using unpaired t-tests, with significance set at P < 0.05. Variance between the two groups was also analysed. Secondary outcomes included analysis by type of operation, anastomotic leaks, and post-operative intravenous fluid use in the first 24 hours. Results There was no statistically significant difference in lactate levels between the two groups at any time point: mean difference at 4 hours intra-operatively 0.0408 mmol/L (± 0.2537, P = 0.8731); at 2 hours post-operatively 0.2697 mmol/L (± 0.3008, P = 0.3754); at 12 hours post-operatively 0.2327 mmol/L (± 0.2368, P = 0.3318). Glucose levels at the same time points were also not significantly different: 4 hours intra-operatively 0.068 mmol/L (± 0.5322, P = 0.5746); 2 hours post-operatively -0.2649 mmol/L (± 0.4679, P = 0.5746); 12 hours post-operatively 0.3773 mmol/L (± 0.3629, P = 0.305). Secondary outcomes showed no statistically significant differences between the analysed groups. Conclusions Pre-operative carbohydrate loading did not appear to influence lactate or glucose levels in these patients either intra-operatively or post-operatively. The lack of significant differences between the two cohorts may be due to an underpowered sample, as this was a small feasibility study. Carbohydrate loading is assumed to reduce insulin resistance and therefore lactate and glucose levels, yet it may have less effect on patient metabolism than is generally thought. A larger prospective study is recommended to investigate its impact on clinical biochemistry and patient outcomes.
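As an illustration of the comparison described above, the following is a minimal sketch of an unpaired t-test between the two groups at a single time point; the lactate values, group sizes, and variable names are invented for the example and are not the study's data.

```python
# Illustrative sketch only: mirrors the reported analysis (unpaired t-tests
# comparing lactate or glucose between carbohydrate-loaded and non-loaded
# patients at one time point, significance at p < 0.05). Values are placeholders.
import numpy as np
from scipy import stats

# Hypothetical lactate values (mmol/L) at the post-operative 2-hour time point
carb_loaded = np.array([1.1, 1.4, 0.9, 1.6, 1.2, 1.3, 1.0, 1.5])
no_carb_load = np.array([1.3, 1.7, 1.2, 1.8, 1.1, 1.6, 1.4, 1.9])

t_stat, p_value = stats.ttest_ind(carb_loaded, no_carb_load)  # unpaired (two-sample) t-test
mean_diff = carb_loaded.mean() - no_carb_load.mean()

print(f"mean difference = {mean_diff:.3f} mmol/L, t = {t_stat:.2f}, p = {p_value:.4f}")
print("significant at p < 0.05" if p_value < 0.05 else "not significant at p < 0.05")
```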

2021 ◽  
Vol 5 (1) ◽  
Author(s):  
Åsa Kettis ◽  
Hanna Fagerlind ◽  
Jan-Erik Frödin ◽  
Bengt Glimelius ◽  
Lena Ring

Abstract Background Effective patient-physician communication can improve patient understanding, agreement on treatment and adherence. This may, in turn, impact on clinical outcomes and patient quality of life (QoL). One way to improve communication is by using patient-reported outcome measures (PROMs). Heretofore, studies of the impact of using PROMs in clinical practice have mostly evaluated the use of standardized PROMs. However, there is reason to believe that individualized instruments may be more appropriate for this purpose. The aim of this study is to compare the effectiveness of the standardized QoL-instrument, the European Organization for Research and Treatment of Cancer Quality of Life C-30 (EORTC-QOL-C30) and the individualized QoL instrument, the Schedule for the Evaluation of Individual Quality of Life-Direct Weighting (SEIQoL-DW), in clinical practice. Methods In a prospective, open-label, controlled intervention study at two hospital out-patient clinics, 390 patients with gastrointestinal cancer were randomly assigned either to complete the EORTC-QOL-C30 or the SEIQoL-DW immediately before the consultation, with their responses being shared with their physician. This was repeated in 3–5 consultations over a period of 4–6 months. The primary outcome measure was patients’ health-related QoL, as measured by FACIT-G. Patients’ satisfaction with the consultation and survival were secondary outcomes. Results There was no significant difference between the groups with regard to study outcomes. Neither intervention instrument resulted in any significant changes in health-related QoL, or in any of the secondary outcomes, over time. This may reflect either a genuine lack of effect or sub-optimization of the intervention. Since there was no comparison to standard care an effect in terms of lack of deterioration over time cannot be excluded. Conclusions Future studies should focus on the implementation process, including the training of physicians to use the instruments and their motivation for doing so. The effects of situational use of standardized or individualized instruments should also be explored. The effectiveness of the different approaches may depend on contextual factors including physician and patient preferences.


Hypertension ◽  
2016 ◽  
Vol 68 (suppl_1) ◽  
Author(s):  
Matthew A Sparks ◽  
Stacy Johnson ◽  
Rishav Adhikari ◽  
Edward Diaz ◽  
Aaron Kupin ◽  
...  

Blockade of the renin-angiotensin system (RAS) reduces albuminuria, attenuates hyperfiltration, and slows the progression of diabetic nephropathy (DN) by preventing vasoconstriction and subsequent increases in glomerular hydrostatic pressure. Since RAS blockade disrupts Ang II signaling in all tissues, the specific contribution of vascular actions of AT1 receptors in DN has been difficult to delineate. Therefore, we generated 129/SvEv mice with cell-specific loss of AT1A from VSMCs (SMKOs) using Cre-loxP. To eliminate AT1R from VSMCs, we crossed the SMKO mice with AT1B-/- mice, lacking the minor AT1B isoform. To study the impact of vascular AT1R in DN, we crossed the AT1B-null SMKOs with mice carrying the Ins2 C96Y AKITA mutation, which develop DM1 early. To enhance kidney injury, mice underwent uninephrectomy (UNX) at 11 weeks. Blood glucose levels were elevated (~500 mg/dL) and similar between the two groups at 10, 16, and 24 weeks. Prior to UNX, albuminuria was similar between Control AKITA and AT1B-null SMKO AKITA mice (62±10 versus 107±27 μg/24 hrs; P = NS). Albuminuria increased with age in both Control AKITA and AT1B-null SMKO AKITA mice but without significant differences between the groups at 16 weeks (307±106 vs 313±117 μg/24 hrs; P = NS) or 24 weeks (494±236 versus 730±217 μg/24 hrs; P = NS), despite a trend toward higher albuminuria in AT1B-null SMKO AKITAs. There was no significant difference in GFR (measured using FITC-inulin) between non-diabetic Control and AT1B-null SMKO mice (15.6±1.2 vs 14.8±0.8 μl/min/g BW), and hyperfiltration was observed in both Control AKITA (23.7±2.4 μl/min/g BW; P = 0.003) and AT1B-null SMKO AKITA mice (20.7±1.7 μl/min/g BW; P = 0.01) relative to their non-diabetic comparators. However, there was no significant difference in GFR between Control AKITA and AT1B-null SMKO AKITA mice (P = NS). Finally, we measured mRNA levels of putative kidney injury markers by RT-qPCR and found no differences in Col1A1, NGAL, or TGFB1 mRNA between Control AKITA and AT1B-null SMKO AKITA kidneys. Our studies indicate that the absence of vascular AT1R responses is not sufficient to reduce albuminuria and prevent hyperfiltration in a mouse model of DN. This suggests that blockade of AT1R in other cell lineages may contribute to the beneficial actions of ARBs in DN.
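The abstract reports relative mRNA levels of injury markers by RT-qPCR without naming the quantification method; the sketch below assumes the widely used 2^(-ΔΔCt) relative-expression approach, with all gene names and Ct values as hypothetical placeholders.

```python
# Hedged illustration: assumes 2^(-delta-delta Ct) relative quantification with a
# housekeeping reference gene; the abstract does not state the method used,
# and all Ct values here are invented.
def relative_expression(ct_target, ct_reference, ct_target_ctrl, ct_reference_ctrl):
    """Return fold change of a target gene vs. a control group using 2^(-ddCt)."""
    delta_ct_sample = ct_target - ct_reference            # normalize to reference gene
    delta_ct_control = ct_target_ctrl - ct_reference_ctrl
    delta_delta_ct = delta_ct_sample - delta_ct_control   # normalize to control group
    return 2.0 ** (-delta_delta_ct)

# Hypothetical Ct values for Col1A1 in SMKO AKITA vs. Control AKITA kidneys
fold_change = relative_expression(ct_target=24.1, ct_reference=18.0,
                                  ct_target_ctrl=24.3, ct_reference_ctrl=18.1)
print(f"Col1A1 fold change vs. control: {fold_change:.2f}")
```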


2020 ◽  
Vol 102-B (7) ◽  
pp. 912-917 ◽  
Author(s):  
Muhammad Tahir ◽  
Ejaz A. Chaudhry ◽  
Faridullah K. Zimri ◽  
Nadeem Ahmed ◽  
Saeed A. Shaikh ◽  
...  

Aims It has been generally accepted that open fractures require early skeletal stabilization and soft-tissue reconstruction. Traditionally, a standard gauze dressing was applied to open wounds. There has been a recent shift in this paradigm towards negative pressure wound therapy (NPWT). The aim of this study was to compare the clinical outcomes of patients with open tibial fractures receiving standard dressings versus NPWT. Methods This multicentre randomized controlled trial was approved by the ethical review board of a public sector tertiary care institute. Wounds were graded using the Gustilo-Anderson (GA) classification, and patients with GA-II to III-C fractures were included in the study. To be eligible, patients had to present within 72 hours of the injury. The primary outcome of the study was the patient-reported Disability Rating Index (DRI) at 12 months. Secondary outcomes included quality of life assessed using the 12-Item Short-Form Health Survey questionnaire (SF-12), wound infection rates at six weeks, and nonunion rates at 12 months. Logistic regression analysis and independent-samples t-tests were applied for secondary outcomes. Analyses of primary and secondary outcomes were performed using SPSS v. 22.0.1, and p-values < 0.05 were considered significant. Results A total of 486 patients were randomized between January 2016 and December 2018. Overall, 206 (49.04%) patients underwent NPWT, while 214 (50.95%) patients were allocated to the standard dressing group. There was no statistically significant difference in DRI at 12 months between the NPWT and standard dressing groups (mean difference 0.5; 95% confidence interval (CI) -0.08 to 1.1; p = 0.581). For SF-12 scores there was no significant difference at any point from injury until 12 months (mean difference 1.4; 95% CI 0.7 to 1.9; p = 0.781). The 30-day deep infection rate was slightly higher in the standard gauze dressing group. The odds of nonunion were also comparable (odds ratio (OR) 0.90, 95% CI 0.56 to 1.45; p = 0.685). Conclusion Our study concludes that NPWT does not confer a benefit over standard dressings for open fractures. The DRI, SF-12 scores, wound infection rates, and nonunion rates were analogous in both study groups. We suggest surgeons continue to use cheaper and more readily available standard dressings. Cite this article: Bone Joint J 2020;102-B(7):912–917.
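For readers unfamiliar with how a nonunion odds ratio and its 95% confidence interval are obtained from a 2x2 table, here is a minimal sketch; the counts are invented and are not the trial's data, so the output does not reproduce the published interval.

```python
# Minimal sketch: odds ratio and Wald 95% CI from a 2x2 table of
# treatment group vs. nonunion status. Counts are hypothetical.
import numpy as np

# Rows: NPWT, standard dressing; columns: nonunion, union (hypothetical counts)
a, b = 30, 176   # NPWT: nonunion, union
c, d = 34, 180   # standard dressing: nonunion, union

odds_ratio = (a * d) / (b * c)
se_log_or = np.sqrt(1/a + 1/b + 1/c + 1/d)   # standard error of log(OR)
log_or = np.log(odds_ratio)
ci_low = np.exp(log_or - 1.96 * se_log_or)
ci_high = np.exp(log_or + 1.96 * se_log_or)

print(f"OR = {odds_ratio:.2f}, 95% CI {ci_low:.2f} to {ci_high:.2f}")
```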


2018 ◽  
Vol 36 (03) ◽  
pp. 277-284 ◽  
Author(s):  
M. Pallister ◽  
J. Ballas ◽  
J. Kohn ◽  
C. S. Eppes ◽  
M. Belfort ◽  
...  

Objective To evaluate the impact of a standardized surgical technique for primary cesarean deliveries (CDs) on operative time and surgical morbidity. Materials and Methods Two-year retrospective chart review of primary CD performed around the implementation of a standardized CD surgical technique. The primary outcome was total operative time (TOT). Secondary outcomes included incision-to-delivery time (ITDT), surgical site infection, blood loss, and maternal and fetal injuries. Results When comparing pre- versus postimplementation surgical times, there was no significant difference in TOT (76.5 vs. 75.9 minutes, respectively; p = 0.42) or ITDT (9.8 vs. 8.8 minutes, respectively; p = 0.06) when the entire cohort was analyzed. Subgroup analysis of CD performed early versus late in an academic year among the pre- and postimplementation groups showed no significant difference in TOT (79.3 early vs. 73.8 minutes late; p = 0.10) or ITDT (10.8 early vs. 8.8 minutes late; p = 0.06) within the preimplementation group. In the postimplementation group, however, there was significant decrease in TOT (80.5 early vs. 71.3 minutes late; p = 0.02) and ITDT (10.6 early vs. 6.8 minutes late; p < 0.01). Secondary outcomes were similar for both groups. Conclusion A standardized surgical technique combined with surgical experience can decrease TOT and ITDT in primary CD without increasing maternal morbidity.


2018 ◽  
Vol 46 (5) ◽  
pp. 453-462 ◽  
Author(s):  
N. L. Pillinger ◽  
J. L. Robson ◽  
P. C. A. Kam

In this narrative review, we describe the physiological basis for nutritional prehabilitation and evaluate the clinical evidence for its current roles in the perioperative period. Surgical stress and fasting induce insulin resistance as a result of altered mitochondrial function. Insulin resistance in the perioperative period leads to increased morbidity in a dose-dependent fashion, while preoperative carbohydrate loading attenuates insulin resistance, minimises protein loss and improves postoperative muscle function. Carbohydrate loading is an established practice in many countries and a key component of enhanced recovery after surgery (ERAS) programs, yet its independent effects on clinical outcomes remain unclear. Amino acid supplements may confer additional positive effects on a number of markers of clinical outcomes in the perioperative period, but their current role is also poorly defined. Clinical studies evaluating nutritional interventions have been marred by conflicting data, which may be due to small sample sizes, as well as heterogeneity of patients and surgical procedures. At present, it is known that carbohydrate loading is safe and improves patients’ wellbeing, but does not appear to influence length of hospital stay or rate of postoperative complications. This should be appreciated before its routine inclusion in ERAS programs.


2019 ◽  
Vol 20 (3) ◽  
pp. 605 ◽  
Author(s):  
Kenji Imai ◽  
Koji Takai ◽  
Tatsunori Hanai ◽  
Atsushi Suetsugu ◽  
Makoto Shiraki ◽  
...  

Diabetes mellitus (DM) is a risk factor for hepatocellular carcinoma (HCC). The purpose of this study was to investigate the impact of the disorder of glucose metabolism on the recurrence of HCC after curative treatment. Two hundred and eleven patients with HCC who received curative treatment in our hospital from 2006 to 2017 were enrolled in this study. Recurrence-free survival was estimated using the Kaplan–Meier method, and the differences between the groups partitioned by the presence or absence of DM and the values of hemoglobin A1c (HbA1c), fasting plasma glucose (FPG), fasting immunoreactive insulin (FIRI), and homeostasis model assessment-insulin resistance (HOMA-IR) were evaluated using the log-rank test. There were no significant differences in the recurrence-free survival rate between the patients with and without DM (p = 0.144), higher and lower levels of HbA1c (≥6.5 and <6.5%, respectively; p = 0.509), FPG (≥126 and <126 mg/dL, respectively; p = 0.143), and FIRI (≥10 and <10 μU/mL, respectively; p = 0.248). However, the higher HOMA-IR group (≥2.3) had HCC recurrence significantly earlier than the lower HOMA-IR group (<2.3, p = 0.013). Moreover, there was a significant difference between the higher and lower HOMA-IR groups without DM (p = 0.009), and there was no significant difference between those groups with DM (p = 0.759). A higher HOMA-IR level, particularly in non-diabetic patients, was a significant predictor for HCC recurrence after curative treatment.
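The HOMA-IR index used to partition patients follows the standard convention of fasting plasma glucose (mg/dL) multiplied by fasting insulin (µU/mL) divided by 405; the short sketch below applies it with the study's 2.3 cut-off to hypothetical patient values.

```python
# Sketch of the HOMA-IR calculation and the dichotomization at the study's
# cut-off of 2.3; the patient values below are placeholders, not study data.
def homa_ir(fpg_mg_dl: float, firi_uu_ml: float) -> float:
    """Homeostasis model assessment of insulin resistance (glucose in mg/dL)."""
    return fpg_mg_dl * firi_uu_ml / 405.0

patients = [
    {"id": 1, "fpg": 95,  "firi": 6.0},
    {"id": 2, "fpg": 130, "firi": 12.5},
]
for p in patients:
    score = homa_ir(p["fpg"], p["firi"])
    group = "higher (>= 2.3)" if score >= 2.3 else "lower (< 2.3)"
    print(f"patient {p['id']}: HOMA-IR = {score:.2f} -> {group} group")
```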


2014 ◽  
Vol 121 (6) ◽  
pp. 1354-1358 ◽  
Author(s):  
Ralph Rahme ◽  
Sharon D. Yeatts ◽  
Todd A. Abruzzo ◽  
Lincoln Jimenez ◽  
Liqiong Fan ◽  
...  

Object The role of endovascular therapy in patients with acute ischemic stroke and a solitary M2 occlusion remains unclear. Through a pooled analysis of 3 interventional stroke trials, the authors sought to analyze the impact of successful early reperfusion of M2 occlusions on patient outcome. Methods Patients with a solitary M2 occlusion were identified from the Prolyse in Acute Cerebral Thromboembolism (PROACT) II, Interventional Management of Stroke (IMS), and IMS II trial databases and were divided into 2 groups: successful reperfusion (thrombolysis in cerebral infarction [TICI] 2–3) at 2 hours and failed reperfusion (TICI 0–1) at 2 hours. Baseline characteristics and clinical outcomes were compared. Results Sixty-three patients, 40 from PROACT II and 23 from IMS and IMS II, were identified. Successful early angiographic reperfusion (TICI 2–3) was observed in 31 patients (49.2%). No statistically significant difference in the rates of intracerebral hemorrhage (60.9% vs 47.6%, p = 0.55) or mortality (19.4% vs 15.6%, p = 0.75) was observed. However, there was a trend toward higher incidence of symptomatic hemorrhage in the TICI 2–3 group (17.4% vs 0%, p = 0.11). There was also a trend toward higher baseline glucose levels in this group (151.5 mg/dl vs 129.6 mg/dl, p = 0.09). Despite these differences, the rate of functional independence (modified Rankin Scale score 0–2) at 3 months was similar (TICI 2–3, 58.1% vs TICI 0–1, 53.1%; p = 0.80). Conclusions A positive correlation between successful early reperfusion and clinical outcome could not be demonstrated for patients with M2 occlusion. Irrespective of reperfusion status, such patients have better outcomes than those with more proximal occlusions, with more than 50% achieving functional independence at 3 months.
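The abstract does not name the statistical test behind the rate comparisons; a common choice for small 2x2 tables like these is Fisher's exact test, sketched below with hypothetical counts that only loosely echo the reported percentages.

```python
# Hedged sketch: Fisher's exact test on a 2x2 table of reperfusion status vs.
# symptomatic haemorrhage. The counts are hypothetical and do not reproduce
# the published p-values.
from scipy.stats import fisher_exact

# Rows: TICI 2-3, TICI 0-1; columns: symptomatic haemorrhage yes, no
table = [[5, 26],
         [0, 32]]
odds_ratio, p_value = fisher_exact(table)
print(f"Fisher's exact test: p = {p_value:.3f}")
```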


2021 ◽  
Vol 14 ◽  
pp. 73-76
Author(s):  
Blake Buzard ◽  
Patrick Evans ◽  
Todd Schroeder

Introduction: Blood cultures are the gold standard for identifying bloodstream infections. The Clinical and Laboratory Standards Institute recommends a blood culture contamination rate of <3%. Contamination can lead to misdiagnosis, increased length of stay and hospital costs, unnecessary testing, and unnecessary antibiotic use. These concerns led to the development of initial specimen diversion devices (ISDD). The purpose of this study was to evaluate the impact of an initial specimen diversion device on rates of blood culture contamination in the emergency department. Methods: This was a retrospective, multi-site study including patients who had blood cultures drawn in an emergency department. The period from February 2018 to April 2018, when an ISDD was not used, was compared with June 2019 to August 2019, when an ISDD was in use. The primary outcome was total blood culture contamination. Secondary outcomes were total hospital cost, hospital and intensive care unit length of stay, vancomycin days of use, vancomycin serum concentrations obtained, and repeat blood cultures obtained. Results: A statistically significant difference was found in blood culture contamination rates between the pre-ISDD and ISDD groups (7.47% vs 2.59%, p<0.001). None of the secondary endpoints showed a statistically significant difference. Conclusions: Implementation of an ISDD reduced blood culture contamination in a statistically significant manner. However, we were unable to detect any statistically significant differences in the secondary outcomes.
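As a sketch of how the pre-ISDD versus ISDD contamination rates could be compared, the following uses a chi-squared test on a 2x2 table; the culture counts are hypothetical (chosen only so the proportions roughly match the reported 7.47% and 2.59%), since the abstract does not give denominators.

```python
# Minimal sketch (not the study's code): chi-squared test comparing blood
# culture contamination rates before and after ISDD implementation.
# The counts below are invented for illustration.
from scipy.stats import chi2_contingency

# Rows: pre-ISDD, ISDD; columns: contaminated, not contaminated (hypothetical)
table = [[56, 694],    # ~7.5% contamination pre-ISDD
         [19, 714]]    # ~2.6% contamination with ISDD
chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi-squared = {chi2:.2f}, p = {p_value:.4f}")
```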


2021 ◽  
Author(s):  
Emilia Scheidecker ◽  
Benjamin Pereira-Zimmermann ◽  
Arne Potreck ◽  
Dominik F. Vollherbst ◽  
Markus A. Möhlenbruch ◽  
...  

Abstract Purpose Diabetes is associated with vascular dysfunction that may impair collateral recruitment in acute ischemic stroke. This retrospective study aimed to analyze the impact of diabetes on collateralization assessed on dynamic CTA. Methods Collaterals were retrospectively assessed on CT perfusion-derived dynamic CTA according to the mCTA score by Menon in a cohort of patients with an acute occlusion of the M1 segment or carotid T. The extent of collateral circulation was related to the history of diabetes and to admission blood glucose and HbA1c levels. Results Two hundred thirty-nine patients were included. The mCTA collateral score was similar in patients with diabetes (median 3, interquartile range 3–4) and without diabetes (median 4, interquartile range 3–4) (P = 0.823). Diabetes was similarly frequent in patients with good (18.8%), intermediate (16.1%), and poor collaterals (16.0%) (P = 0.355). HbA1c was non-significantly higher in patients with poor collaterals (6.3 ± 1.5) than in patients with intermediate (6.0 ± 0.9) and good collaterals (5.8 ± 0.9) (P = 0.061). Blood glucose levels were significantly higher in patients with poor collaterals than in those with good collaterals (mean 141.6 vs. 121.8 mg/dl, P = 0.045). However, there was no significant difference between good and intermediate collaterals (mean 121.8 vs. 129.5 mg/dl, P = 0.161) or between intermediate and poor collaterals (129.5 vs. 141.6 mg/dl, P = 0.161). Conclusion There was no statistically significant difference among patients with good, intermediate, and poor collaterals regarding the presence of diabetes or HbA1c level on admission. However, stroke patients with poor collaterals tended to have higher blood glucose and HbA1c levels.


2020 ◽  
Vol 24 (4) ◽  
pp. 195-202
Author(s):  
Javad Mehrabani ◽  
Soodabeh Bagherzadeh ◽  
Abuzar Jorbonian ◽  
Eisa Khaleghi-Mamaghani ◽  
Maryam Taghdiri ◽  
...  

Background and Study Aim. The effects of music on exercise performance have been evaluated previously; however, it is not yet clear whether one type of music is superior, nor what effect music has during recovery. The aim of this study was therefore to determine the impact of music with a spicy versus a light beat on changes in lactate levels, blood pressure, heart rate, and appetite during the recovery period after endurance swimming. Material and Methods. Thirteen healthy young girls participated in three sessions (one control and two experimental). The participants performed an endurance swim and listened to music immediately afterwards; measurements were taken before swimming and at several time points afterwards. Results. Five minutes after swimming there was a significant difference between the no-music group and both music groups (p<0.05). Two and 5 minutes after swimming, there was a significant difference between the spicy and light music groups compared with the no-music group. There was a significant difference between the spicy and light music groups at 10, 15, and 25 minutes. At 25 minutes after the swim, the reduction in heart rate with light music was greater than with spicy music. Also, 10 minutes after swimming, the spicy music group could not cope with the increase in heart rate (p<0.05). There was a significant difference between the two music groups at 5, 10, and 15 minutes after swimming (p<0.05). Conclusions. Listening to light music during recovery from endurance swimming was associated with decreased lactate levels and heart rate, whereas listening to spicy music increased heart rate and the desire for food.

