Impact of the Beta-Glucan Test on Management of Intensive Care Unit Patients at Risk for Invasive Candidiasis

2020 ◽  
Vol 58 (6) ◽  
Author(s):  
Antonios Kritikos ◽  
Julien Poissy ◽  
Antony Croxatto ◽  
Pierre-Yves Bochud ◽  
Jean-Luc Pagani ◽  
...  

ABSTRACT The 1,3-beta-d-glucan (BDG) test is used for the diagnosis of invasive candidiasis (IC) in intensive care units (ICUs). However, its utility for patient management is unclear. This study assessed the impact of BDG test results on therapeutic decisions. This was a single-center observational study conducted in an ICU over two 6-month periods. All BDG test requests for the diagnosis of IC were analyzed. Before the second period, the ICU physicians received a pocket card instruction (algorithm) for targeted BDG testing in high-risk patients. The performance of the BDG test for IC diagnosis was assessed, as well as its impact on antifungal (AF) prescription. Overall, 72 patients had ≥1 BDG test, and 14 (19%) had an IC diagnosis. The BDG test results influenced therapeutic decisions in 41 (57%) cases. The impact was positive in 30 (73%) of these: AF abstention/interruption following a negative BDG result (n = 27) and AF initiation/continuation triggered by a positive BDG result with subsequently confirmed IC (n = 3). In 10 (24%) cases, a positive BDG result led to AF initiation/continuation with no further evidence of IC. A negative BDG result with AF abstention and a subsequent IC diagnosis was observed in one case. The positive predictive value (PPV) of BDG improved when testing was restricted to the algorithm's indications (80% versus 36%). However, adherence to the algorithm was low (26%), and no benefit of the intervention was observed. The BDG result influenced therapeutic decisions in more than half of the cases, mainly through safe AF interruption/abstention. Targeted BDG testing in high-risk patients improves the PPV but is difficult to achieve in the ICU.
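The PPV comparison above is the standard true-positives-over-all-positives calculation. A minimal Python sketch follows; the cell counts are hypothetical, chosen only so the formula reproduces the reported 80% and 36% figures, and are not the study's actual data:

```python
def ppv(true_positives: int, false_positives: int) -> float:
    """Positive predictive value: fraction of positive tests that are true positives."""
    return true_positives / (true_positives + false_positives)

# Hypothetical counts for illustration (not the study's actual 2x2 cells):
targeted = ppv(true_positives=4, false_positives=1)      # algorithm-guided testing
unrestricted = ppv(true_positives=4, false_positives=7)  # unrestricted testing
print(f"targeted PPV = {targeted:.0%}, unrestricted PPV = {unrestricted:.0%}")
```

Restricting testing to higher-prevalence patients shrinks the false-positive count relative to true positives, which is the mechanism behind the PPV gain the abstract reports.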

PLoS ONE ◽  
2021 ◽  
Vol 16 (11) ◽  
pp. e0257941
Author(s):  
Claudia de Souza Gutierrez ◽  
Katia Bottega ◽  
Stela Maris de Jezus Castro ◽  
Gabriela Leal Gravina ◽  
Eduardo Kohls Toralles ◽  
...  

Background Practical use of risk predictive tools and the assessment of their impact on outcome reduction remain a challenge. This pragmatic quality improvement (QI) study describes the preoperative adoption of a customised postoperative death probability model (the SAMPE model) and evaluates the impact of a Postoperative Anaesthetic Care Unit (PACU) pathway on the clinical deterioration of high-risk surgical patients. Methods A prospective cohort of 2,533 surgical patients was compared with 2,820 historical controls after the adoption of the QI intervention. We initiated expedited postoperative high-risk pathways in the PACU when the predicted probability of postoperative death exceeded 5%. Outcome measures were the number of rapid response team (RRT) calls within 7 and 30 postoperative days, in-hospital mortality, and non-planned Intensive Care Unit (ICU) admission. Results The QI not only succeeded in implementing a customised risk stratification model but also reduced postoperative deterioration, as measured by RRT calls among very-high-risk patients within 30 postoperative days (from 23% before to 14% after the intervention, p = 0.05). We observed no survival benefit and no reduction in non-planned ICU admissions. The small group of high-risk patients (13% of the total) accounted for the highest proportion of RRT calls and postoperative deaths. Conclusion Employing a risk predictive tool to guide immediate postoperative care may influence postoperative deterioration. These findings encourage the design of pragmatic trials focused on feasible, low-technology, long-term interventions that can be adapted to diverse health systems, especially those that demand more accurate decision making and full engagement in the control of postoperative morbidity and mortality.
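The 5% triage rule described above is a simple threshold on the model's predicted probability. A sketch follows; the function and parameter names are assumptions, and the SAMPE model itself is not reproduced here:

```python
def needs_high_risk_pathway(predicted_death_probability: float,
                            threshold: float = 0.05) -> bool:
    """Trigger the expedited PACU pathway when the model's predicted
    probability of postoperative death exceeds the 5% cut-off."""
    return predicted_death_probability > threshold

print(needs_high_risk_pathway(0.08))  # True: routed to the high-risk pathway
print(needs_high_risk_pathway(0.02))  # False: standard PACU care
```

The predictive model supplies the probability; the pathway decision itself is just this comparison, which is what makes the intervention feasible and low-technology.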


BMJ Open ◽  
2017 ◽  
Vol 7 (12) ◽  
pp. e018322
Author(s):  
Jez Fabes ◽  
William Seligman ◽  
Carolyn Barrett ◽  
Stuart McKechnie ◽  
John Griffiths

Objective To develop a clinical prediction model for poor outcome after intensive care unit (ICU) discharge in a large observational data set and couple this to an acute post-ICU ward-based review tool (PIRT) to identify high-risk patients at the time of ICU discharge and improve their acute ward-based review and outcome. Design Retrospective patient cohort of index ICU admissions between June 2006 and October 2011 receiving routine inpatient review. Prospective cohort between March 2012 and March 2013 underwent risk scoring (PIRT) which subsequently guided inpatient ward-based review. Setting Two UK adult ICUs. Participants 4212 eligible discharges from ICU in the retrospective development cohort and 1028 patients included in the prospective intervention cohort. Interventions Multivariate analysis was performed to determine factors associated with poor outcome in the retrospective cohort and used to generate a discharge risk score. A discharge and daily ward-based review tool incorporating an adjusted risk score was introduced. The prospective cohort underwent risk scoring at ICU discharge and inpatient review using the PIRT. Outcomes The primary outcome was the composite of death or readmission to ICU within 14 days of ICU discharge following the index ICU admission. Results PIRT review was achieved for 67.3% of all eligible discharges and improved the targeting of acute post-ICU review to high-risk patients. The presence of ward-based PIRT review in the prospective cohort did not correlate with a reduction in poor outcome overall (P=0.876) or overall readmission but did reduce early readmission (within the first 48 hours) from 4.5% to 3.6% (P=0.039), while increasing the rate of late readmission (48 hours to 14 days) from 2.7% to 5.8% (P=0.046). Conclusion PIRT facilitates the appropriate targeting of nurse-led inpatient review acutely after ICU discharge but does not reduce hospital mortality or overall readmission rates to ICU.
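The composite primary outcome (death or ICU readmission within 14 days of discharge) can be expressed as a small predicate. This is a sketch under assumed field names, not the authors' code:

```python
from datetime import date
from typing import Optional

def poor_outcome(icu_discharge: date,
                 death: Optional[date] = None,
                 icu_readmission: Optional[date] = None,
                 window_days: int = 14) -> bool:
    """Composite endpoint: death or ICU readmission within `window_days`
    of the index ICU discharge (parameter names are assumptions)."""
    return any(
        event is not None and 0 <= (event - icu_discharge).days <= window_days
        for event in (death, icu_readmission)
    )

print(poor_outcome(date(2012, 3, 1), icu_readmission=date(2012, 3, 10)))  # True
print(poor_outcome(date(2012, 3, 1)))                                     # False
```

Note that the study further splits readmissions into early (first 48 hours) and late (48 hours to 14 days) windows; the same date arithmetic applies with narrower bounds.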


2019 ◽  
Vol 19 (5) ◽  
pp. 363-369
Author(s):  
Ashley Albert ◽  
Sophy Mangana ◽  
Mary R. Nittala ◽  
Toms Vengaloor Thomas ◽  
Lacey Weatherall ◽  
...  

Blood ◽  
2008 ◽  
Vol 112 (11) ◽  
pp. 95-95 ◽  
Author(s):  
Prashant Kapoor ◽  
Shaji Kumar ◽  
Rafael Fonseca ◽  
Martha Q. Lacy ◽  
Thomas E Witzig ◽  
...  

Abstract Background: Multiple myeloma (MM) is a heterogeneous disease with very divergent outcomes that are dictated in large part by specific cytogenetic abnormalities, as well as other prognostic factors such as the proliferative rate of marrow plasma cells. Prognostic systems incorporating these factors have shown clinical utility in identifying high-risk patients, and are increasingly being utilized for treatment decision-making. However, the prognostic relevance of these factors may change with the application of novel therapies. The objective of this study was to determine the impact of risk-stratification (incorporating plasma cell metaphase cytogenetics, interphase fluorescent in-situ hybridization (FISH) and the slide-based plasma cell labeling index (PCLI)) in a cohort of patients with newly diagnosed MM treated initially with lenalidomide + dexamethasone (Rev-Dex). Methods: From March 2004 to November 2007, 100 consecutive patients treated with Rev (25 mg/day) on days 1 through 21 of a 4-week cycle in combination with dexamethasone as initial therapy for newly diagnosed myeloma were identified. High-risk MM was defined as the presence of any one or more of the following: hypodiploidy, monoallelic loss of chromosome 13 or its long arm (by metaphase cytogenetics only), deletion of p53 (locus 17p13), PCLI ≥ 3%, or immunoglobulin heavy chain (IgH) translocations t(4;14)(p16.3;q32) or t(14;16)(q32;q23) on FISH. PFS and OS estimates were generated using the Kaplan-Meier method and compared by log-rank tests. Results: The median estimated follow-up of the entire cohort (N=100) was 36 months. The median PFS was 31 months; the median OS has not been reached. The 2- and 3-year OS estimates were 93% and 83%, respectively. Sixteen percent of patients were deemed high-risk by at least one of the 3 tests (cytogenetics, FISH or PCLI). 
Response rates (PR or better) were 81% versus 89% in the high-risk and standard-risk groups, respectively, P=NS; corresponding values for CR plus VGPR rates were 38% and 45%, respectively. The median PFS was 18.5 months in high-risk patients compared to 37 months in the standard-risk patients (n=84), P<0.001 (Figure). Corresponding values for TTP were 18.5 months and 36.5 months, respectively, P<0.001. OS was not statistically significantly different between the two groups; 92% 2-year OS was noted in both groups. Overall, 95 patients had at least one of the 3 tests to determine risk, while 55 patients could be adequately stratified based on the availability of all 3 tests, or at least one test result that led to their inclusion in the high-risk category. The significant difference in PFS persisted even when the analysis was restricted to the 55 patients classified using this stringent criterion: 18.5 months vs. 36.5 months in the high-risk and standard-risk groups, respectively; P<0.001. In a separate analysis, patients who underwent SCT before disease progression were censored on the date of SCT to negate its effect, and PFS was still inferior in the high-risk group (p=0.002). Conclusion: The TTP and PFS of high-risk MM patients are inferior to those of standard-risk patients treated with Rev-Dex, indicating that the current genetic and proliferation-based risk-stratification model remains prognostic with novel therapy. However, the TTP, PFS, and OS obtained in high-risk patients treated with Rev-Dex in this study are comparable to the overall results in all myeloma patients reported in recent phase III trials. In addition, no significant impact of high-risk features on OS is apparent so far. Longer follow-up is needed to determine the impact of risk stratification on the OS of patients treated with Rev-Dex.
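The survival comparisons above use Kaplan-Meier product-limit estimates. The following dependency-free sketch illustrates the estimator on toy data; it is not the authors' analysis code:

```python
def kaplan_meier(times, events):
    """Product-limit survival estimate.
    times: follow-up durations (e.g. months); events: 1 = event observed
    (progression/death), 0 = censored. Returns [(time, S(t))] at event times."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    survival, curve = 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        tied = [e for tt, e in data if tt == t]   # subjects sharing this time
        deaths = sum(tied)
        if deaths:
            survival *= 1.0 - deaths / n_at_risk  # product-limit step
            curve.append((t, survival))
        n_at_risk -= len(tied)
        i += len(tied)
    return curve

# Toy data: events at months 1 and 3, one subject censored at month 2.
print(kaplan_meier([1, 2, 3], [1, 0, 1]))
```

Censored subjects leave the risk set without contributing an event, which is how the estimator handles patients censored at SCT in the abstract's sensitivity analysis.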


Blood ◽  
2015 ◽  
Vol 126 (23) ◽  
pp. 4530-4530
Author(s):  
Thomas Gregory Knight ◽  
Joshua F. Zeidner ◽  
Naim U Rashid ◽  
Matthew C Foster

Abstract BACKGROUND: At a large academic teaching hospital, there are a variety of physicians and midlevel providers at the point of initial contact, and the extent of supervision by specifically trained oncology personnel may vary based on time of admission. Patients with acute leukemia may present with high-risk disease processes that must be recognized and require prompt intervention to reduce both morbidity and short-term mortality. This is a retrospective review of the delivery of care at admission and key clinical outcomes for high-risk patients presenting with acute myeloid leukemia (AML) or acute lymphoblastic leukemia (ALL) based on time of admission. The hypothesis of this study was that high-risk patients with AML or ALL admitted overnight may have significant delays in management of the complications of acute leukemia, with subsequent increases in morbidity and short-term mortality. METHODS: An institutional electronic database was queried to identify patients with ICD9 codes specific for AML/ALL. Inclusion criteria consisted of adults >18 years admitted to a single institution from 2010-2013. Key clinical data were then abstracted from the electronic medical records, including lab values, time of admission (Daytime: 7am-8pm vs Nighttime: 8pm-7am), and specific clinically important outcomes (time to specific therapy, time to chemotherapy, length of stay, ICU length of stay, organ failure, and mortality). Patients were categorized as high risk if they met established criteria requiring specific intervention [hyperleukocytosis defined as WBC >50 × 10^9/L, hyperuricemia defined as uric acid >8 mg/dL, and clinical suspicion for acute promyelocytic leukemia (APL)]. Variables with binary outcomes were tested for association with overnight admission using Fisher's exact test. All other variables were tested using the Wilcoxon two-group test. RESULTS: Between 2010 and 2013, 161 patients with AML/ALL were included in our analysis. 
Of those, 66 were classified as high risk (Table 1). In the high-risk patients there were no significant differences in time to intervention based on time of admission, including patients presenting with hyperleukocytosis and time to hydroxyurea administration (p=.32), patients presenting with hyperuricemia and time to allopurinol administration (p=.71) or rasburicase administration (p=.22), and time to tretinoin (ATRA) administration in patients presenting with APL (p=.23). Time to definitive chemotherapy was significantly less for high-risk patients admitted overnight (overnight median=48 hours, day median=56 hours, p=.042). However, rates of mechanical ventilation (p=.09), vasopressor usage (p=.37), and renal failure (p=.43) appeared similar between the groups. Additionally, length of stay (p=.83) and ICU length of stay (p=.44) were not significantly different between the two groups. 30-day mortality did not statistically differ between the two groups (overnight=19.4%, daytime=20%, p=.57). CONCLUSIONS: To our knowledge, this is the first comprehensive analysis of the impact of the time of admission of acute leukemia patients at an academic tertiary cancer hospital. Interestingly, nighttime admissions did not appear to significantly impact time to key clinical interventions or clinical outcomes in high-risk patients admitted with acute leukemia. Although time to definitive chemotherapy was found to be significantly less in patients admitted overnight, confounding variables such as severity of illness at the time of admission may have impacted this analysis, and 30-day mortality rates were similar. Overall, these data support the triage of patients with newly diagnosed or suspected acute leukemia to tertiary care centers as soon as possible. Table 1. 
Table 1. Baseline Characteristics of High-Risk Patients

Age at Diagnosis (Number, %):
  <50: 31 (47.0)
  50-64: 24 (36.3)
  65+: 11 (16.7)
Sex:
  Male: 38 (57.6)
Diagnosis (Number, %):
  AML (excluding APL): 37 (56.1)
  APL: 18 (27.2)
  ALL: 11 (16.7)
High-Risk Features:
  Hyperleukocytosis: 42 (63.6)
  Hyperuricemia: 20 (30.3)
  APL: 18 (27.2)
  ≥1 High-Risk Feature: 66 (100.0)
Initial Point of Contact:
  Referring Hospital: 45 (68.2)
Admission Time (Number, %):
  Day Shift (7a-8p): 30 (45.5)
  Night Shift (8p-7a): 36 (54.5)
Admission Location (Number, %):
  Oncology Inpatient Service: 53 (80.3)
  Internal Medicine Inpatient Service: 2 (3.0)
  Medical ICU: 11 (16.7)

Disclosures: Foster: Celgene: Research Funding.
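The high-risk screen described in the Methods reduces to three criteria. A sketch follows; the function and parameter names are assumptions based on the abstract's definitions:

```python
def is_high_risk(wbc_10e9_per_l: float,
                 uric_acid_mg_per_dl: float,
                 suspected_apl: bool) -> bool:
    """High risk per the study's criteria: hyperleukocytosis (WBC > 50 x 10^9/L),
    hyperuricemia (uric acid > 8 mg/dL), or clinical suspicion of APL."""
    return wbc_10e9_per_l > 50 or uric_acid_mg_per_dl > 8 or suspected_apl

print(is_high_risk(120, 5.0, False))  # True: hyperleukocytosis
print(is_high_risk(12, 6.5, False))   # False: no criterion met
```

Each criterion maps to a specific urgent intervention in the abstract (hydroxyurea, allopurinol/rasburicase, and ATRA, respectively), which is why a simple disjunction suffices for triage.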


2014 ◽  
Vol 10 (2) ◽  
pp. 187-196 ◽  
Author(s):  
Florentina E. Sileanu ◽  
Raghavan Murugan ◽  
Nicole Lucko ◽  
Gilles Clermont ◽  
Sandra L. Kane-Gill ◽  
...  

2003 ◽  
Vol 47 (8) ◽  
pp. 2492-2498 ◽  
Author(s):  
Alexander A. Padiglione ◽  
Rory Wolfe ◽  
Elizabeth A. Grabsch ◽  
Di Olden ◽  
Stephen Pearson ◽  
...  

ABSTRACT Accurate assessment of the risk factors for colonization with vancomycin-resistant enterococci (VRE) among high-risk patients is often confounded by nosocomial VRE transmission. We undertook a 15-month prospective cohort study of adults admitted to high-risk units (hematology, renal, transplant, and intensive care) in three teaching hospitals that used identical strict infection control and isolation procedures for VRE to minimize nosocomial spread. Rectal swab specimens for culture were regularly obtained, and the results were compared with patient demographic factors and antibiotic exposure data. Compliance with screening was defined as “optimal” (100% compliance) or “acceptable” (minor protocol violations were allowed, but a negative rectal swab specimen culture was required within 1 week of becoming colonized with VRE). Colonization with VRE was detected in 1.56% (66 of 4,215) of admissions (0.45% at admission and 0.83% after admission; the acquisition time was uncertain for 0.28%), representing 1.91% of patients. No patients developed infection with VRE. The subsequent rate of new acquisition of VRE was 1.4/1,000 patient days. Renal units had the highest rate (3.23/1,000 patient days; 95% confidence interval [CI], 1.54 to 6.77/1,000 patient days). vanB Enterococcus faecium was the most common species (71%), but other species included vanB Enterococcus faecalis (21%), vanA E. faecium (6%), and vanA E. faecalis (2%). The majority of isolates were nonclonal by pulsed-field gel electrophoresis analysis. 
Multivariate analysis of risk factors in patients with acceptable screening suggested that being managed by a renal unit (hazard ratio [HR] compared to the results for patients managed in an intensive care unit, 4.6; 95% CI, 1.2 to 17.0 [P = 0.02]) and recent administration of either ticarcillin-clavulanic acid (HR, 3.6; 95% CI, 1.1 to 11.6 [P = 0.03]) or carbapenems (HR, 2.8; 95% CI, 1.0 to 8.0 [P = 0.05]), but not vancomycin or broad-spectrum cephalosporins, were associated with acquisition of VRE. The relatively low rates of colonization with VRE, the polyclonal nature of most isolates, and the possible association with the use of broad-spectrum antibiotics are consistent with either the endogenous emergence of VRE or the amplification of previously undetectable colonization with VRE among high-risk patients managed under conditions in which the risk of nosocomial acquisition was minimized.
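The acquisition rates quoted above follow the usual events-per-1,000-patient-days incidence-density convention. A sketch follows; the counts are hypothetical, chosen only to reproduce the quoted 1.4/1,000 overall rate, and are not the study's raw denominators:

```python
def rate_per_1000_patient_days(new_acquisitions: int, patient_days: int) -> float:
    """Incidence density: events per 1,000 patient-days of follow-up."""
    return 1000.0 * new_acquisitions / patient_days

# Hypothetical example (not the study's raw counts):
print(rate_per_1000_patient_days(7, 5000))  # 1.4, the overall rate quoted above
```

Expressing acquisition as incidence density rather than a simple proportion is what allows the unit-level comparison (e.g. the higher rate on renal units) to account for differing lengths of stay.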

