Noninterruptive Clinical Decision Support Decreases Ordering of Respiratory Viral Panels during Influenza Season

2020 · Vol 11 (02) · pp. 315-322
Author(s): Cameron Escovedo, Douglas Bell, Eric Cheng, Omai Garner, Alyssa Ziman, et al.

Abstract

Objective: A growing body of evidence suggests that testing for influenza virus alone is more appropriate than multiplex respiratory viral panel (RVP) testing for general populations of patients with respiratory tract infections. We aimed to decrease the proportion of RVPs out of total respiratory viral testing ordered during influenza season.

Methods: We implemented two consecutive interventions: reflex testing for RVPs only after a negative influenza test, and noninterruptive clinical decision support (CDS), including modifications of the computerized physician order entry search behavior and cost display. We conducted an interrupted time series analysis of RVPs and influenza polymerase chain reaction tests pre- and postintervention, and performed a mixed-effects logistic regression analysis with a primary outcome of the proportion of RVPs out of total respiratory viral tests. The primary predictor was the intervention period, and covariates included the provider, clinical setting, associated diagnoses, and influenza incidence.

Results: From March 2013 to April 2019, there were 24,294 RVPs and 26,012 influenza tests (n = 50,306). The odds of ordering an RVP decreased during the reflex testing period (odds ratio: 0.432, 95% confidence interval: 0.397–0.469) and decreased more dramatically during the noninterruptive CDS period (odds ratio: 0.291, 95% confidence interval: 0.259–0.327).

Discussion: The odds of ordering an RVP were 71% lower with the noninterruptive CDS intervention, which projected to 4,773 fewer RVPs compared with baseline. Assuming a cost equal to Medicare reimbursement rates for RVPs and influenza tests, this would generate an estimated averted cost of $1,259,474 per year.

Conclusion: Noninterruptive CDS interventions are effective in reducing unnecessary and expensive testing, and they avoid typical pitfalls such as alert fatigue.
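The reported "71% lower odds" follows directly from the odds ratios in the Results. A minimal sketch of that conversion (the odds ratios and confidence intervals below are taken from the abstract; the helper function name is ours):

```python
# Convert an odds ratio (OR) and its 95% CI into the approximate percent
# reduction in the odds of ordering an RVP, relative to baseline (OR = 1).
def percent_reduction(odds_ratio: float) -> float:
    """Percent decrease in odds implied by an odds ratio below 1."""
    return (1.0 - odds_ratio) * 100.0

for label, or_point, (ci_lo, ci_hi) in [
    ("reflex testing", 0.432, (0.397, 0.469)),
    ("noninterruptive CDS", 0.291, (0.259, 0.327)),
]:
    # The CI bounds flip: the upper OR bound gives the smaller reduction.
    print(f"{label}: {percent_reduction(or_point):.0f}% lower odds "
          f"(95% CI {percent_reduction(ci_hi):.0f}%-{percent_reduction(ci_lo):.0f}%)")
```

For the noninterruptive CDS period this yields (1 − 0.291) × 100 ≈ 71%, matching the Discussion.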

2018 · Vol 39 (6) · pp. 737-740
Author(s): Gregory R. Madden, Ian German Mesner, Heather L. Cox, Amy J. Mathers, Jason A. Lyman, et al.

We hypothesized that a computerized clinical decision support tool for Clostridium difficile testing would reduce unnecessary inpatient tests, resulting in fewer laboratory-identified events. Census-adjusted interrupted time-series analyses demonstrated significant reductions following this intervention: 41% fewer tests and 31% fewer hospital-onset C. difficile infection laboratory-identified events.

Infect Control Hosp Epidemiol 2018;39:737–740


2019
Author(s): Liyuan Tao, Chen Zhang, Lin Zeng, Shengrong Zhu, Nan Li, et al.

Background: Clinical decision support systems (CDSS) are an integral component of health information technologies and can assist with disease interpretation, diagnosis, treatment, and prognosis. However, the utility of CDSS in the clinic remains controversial.

Objective: The aim was to assess the effects of a CDSS integrated with British Medical Journal (BMJ) Best Practice–aided diagnosis in real-world practice.

Methods: This was a retrospective, longitudinal observational study using routinely collected clinical diagnosis data from electronic medical records. A total of 34,113 hospitalized patient records were successively selected from December 2016 to February 2019 in six clinical departments. The diagnostic accuracy of the CDSS was verified before its implementation. A self-controlled comparison was then applied to detect the effects of CDSS implementation. Multivariable logistic regression and single-group interrupted time series analysis were used to explore the effects of the CDSS. A sensitivity analysis was conducted using the subgroup data from January 2018 to February 2019.

Results: Before CDSS implementation, the overall accuracy rates of the diagnoses recommended by the CDSS were 75.46% for the first-rank diagnosis, 83.94% for the top-2 diagnoses, and 87.53% for the top-3 diagnoses. After CDSS implementation, higher consistency was observed between admission and discharge diagnoses, along with shorter times to confirmed diagnosis and shorter hospitalizations (all P < .001). Multivariable logistic regression analysis showed that the consistency rates after CDSS implementation (OR 1.078, 95% CI 1.015-1.144) and the proportion of hospitalizations of 7 days or less (OR 1.688, 95% CI 1.592-1.789) both increased. The interrupted time series analysis showed that the consistency rates significantly increased by 6.722% (95% CI 2.433%-11.012%, P = .002) after CDSS implementation. The proportion of hospitalizations of 7 days or less significantly increased by 7.837% (95% CI 1.798%-13.876%, P = .01). Similar results were obtained in the subgroup analysis.

Conclusions: The CDSS integrated with BMJ Best Practice improved the accuracy of clinicians' diagnoses. Shorter times to confirmed diagnosis and shorter hospitalizations were also associated with CDSS implementation in this retrospective real-world study. These findings highlight the utility of artificial intelligence-based CDSS for improving diagnostic efficiency, but the results require confirmation in future randomized controlled trials.
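Several of these abstracts estimate an immediate "level change" via single-group interrupted time series analysis. A minimal segmented-regression sketch of that design, on entirely synthetic monthly data (the true level jump of 7 points is invented for illustration):

```python
import numpy as np

# Segmented regression for a single-group interrupted time series:
#   y_t = b0 + b1*t + b2*post_t + b3*(t - t0)*post_t + e_t
# where b2 estimates the immediate level change at the implementation
# point t0, and b3 the change in slope. Data here are synthetic.
rng = np.random.default_rng(0)
t = np.arange(24)                  # 24 monthly observations
t0 = 12                            # intervention (e.g., CDSS go-live)
post = (t >= t0).astype(float)
y = 60 + 0.2 * t + 7 * post + rng.normal(0, 1, t.size)  # ~7-point jump

# Ordinary least squares via the normal equations (np.linalg.lstsq).
X = np.column_stack([np.ones(t.size), t, post, (t - t0) * post])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(f"estimated level change at t0: {beta[2]:.1f} percentage points")
```

In practice the published analyses would add confidence intervals and autocorrelation handling (e.g., Newey-West or ARIMA errors), which this sketch omits.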


2020 · pp. 001857872093145
Author(s): Alyssa Teehan, Christopher Burke, Quentin Minson

Purpose: Procalcitonin (PCT) may be an effective biomarker in the management of lower respiratory tract infections (LRTI) when combined with antimicrobial stewardship support. We assessed the impact of a PCT protocol with clinical pharmacy support for LRTI using a clinical decision support system (CDSS) for monitoring.

Methods: This was a single-center retrospective cohort study conducted at a large, nonteaching hospital in Nashville, TN. All patients who met eligibility requirements and were initiated on the PCT protocol for a suspected LRTI between February and March 2018 were included and matched 1:1 to historical control patients from 2016 to 2017 based on antibiotics, indication, and time of year.

Results: During this 2-month period, a total of 126 patients met eligibility requirements for inclusion in the PCT group and were matched to historical control patients. Patients in the PCT group received fewer median antibiotic days of therapy (DOT) than controls (11 vs 14, P = .004). There was no difference in median length of stay (LOS) between groups. The acceptance rate for patient-specific antibiotic de-escalation recommendations from the clinical pharmacist was 62.5%.

Conclusion: PCT protocols that utilize clinical pharmacist interpretation and a CDSS may be an effective antimicrobial stewardship program (ASP) intervention for decreasing antibiotic DOT for LRTI.


2020 · Vol 7 (10)
Author(s): Catherine Liu, Kristine Lan, Elizabeth M Krantz, H Nina Kim, Jacqlynn Zier, et al.

Abstract

Background: Inappropriate testing for Clostridioides difficile leads to overdiagnosis of C difficile infection (CDI). We determined the effect of a computerized clinical decision support (CCDS) order set on C difficile polymerase chain reaction (PCR) test utilization and clinical outcomes.

Methods: This study is an interrupted time series analysis comparing C difficile PCR test utilization, hospital-onset CDI (HO-CDI) rates, and clinical outcomes before and after implementation of a CCDS order set at 2 academic medical centers: University of Washington Medical Center (UWMC) and Harborview Medical Center (HMC).

Results: Compared with the 20-month preintervention period, during the 12 months after implementation of the CCDS order set there was an immediate and sustained reduction in C difficile PCR test utilization rates at both hospitals (HMC, −28.2% [95% confidence interval {CI}, −43.0% to −9.4%], P = .005; UWMC, −27.4% [95% CI, −37.5% to −15.6%], P < .001). There was a significant reduction in rates of C difficile tests ordered in the setting of laxatives (HMC, −60.8% [95% CI, −74.3% to −40.1%], P < .001; UWMC, −37.3% [95% CI, −58.2% to −5.9%], P = .02). The intervention was associated with an increase in the C difficile test positivity rate at HMC (P = .01). There were no significant differences in HO-CDI rates or in the proportion of patients with HO-CDI who developed severe CDI or CDI-associated complications, including intensive care unit transfer, extended length of stay, 30-day mortality, and toxic megacolon.

Conclusions: Computerized clinical decision support tools can improve C difficile diagnostic test stewardship without causing harm. Additional studies are needed to identify key elements of CCDS tools to further optimize C difficile testing and assess their effect on adverse clinical outcomes.


2020
Author(s): Lars Müller, Aditya Srinivasan, Shira R Abeles, Amutha Rajagopal, Francesca J Torriani, et al.

Background: There is a pressing need for digital tools that can leverage big data to help clinicians select effective antibiotic treatments in the absence of timely susceptibility data. Clinical presentation and local epidemiology can inform therapy selection to balance the risk of antimicrobial resistance against patient risk. However, data and clinical expertise must be appropriately integrated into clinical workflows.

Objective: The aim of this study was to leverage available data in electronic health records to develop a data-driven, user-centered clinical decision support system that balances patient safety and population health.

Methods: We analyzed 5 years of susceptibility testing (1,078,510 isolates) and patient data (30,761 patients) across a large academic medical center. After curating the data according to Clinical and Laboratory Standards Institute guidelines, we analyzed and visualized the impact of risk factors on clinical outcomes. On the basis of this data-driven understanding, we developed a probabilistic algorithm that maps these data to individual cases and implemented iBiogram, a prototype digital empiric antimicrobial clinical decision support system, which we evaluated against actual prescribing outcomes.

Results: We determined patient-specific factors across syndromes and contexts and identified relevant local patterns of antimicrobial resistance by clinical syndrome. Mortality and length of stay differed significantly depending on these factors and could be used to generate heuristic targets for an acceptable risk of underprescription. Combined with the developed "remaining risk" algorithm, these factors can be used to inform clinicians' reasoning. A retrospective comparison of iBiogram-suggested therapies versus physicians' actual prescriptions showed similar performance for low-risk diseases such as urinary tract infections, whereas iBiogram recognized risk and recommended more appropriate coverage for high-mortality conditions such as sepsis.

Conclusions: The application of such data-driven, patient-centered tools may guide empirical prescribing and help clinicians balance morbidity and mortality with antimicrobial stewardship.


Pharmacy · 2021 · Vol 9 (3) · pp. 136
Author(s): Andrew B. Watkins, Trevor C. Van Schooneveld, Craig G. Reha, Jayme Anderson, Kelley McGinnis, et al.

In 2018, a clinical decision support (CDS) tool was implemented as part of a "daily checklist" for frontline pharmacists to review patients on antibiotics with procalcitonin (PCT) <0.25 mcg/L. This study used a retrospective cohort design to assess the change in antibiotic use resulting from pharmacist interventions after this PCT alert in patients on antibiotics for lower respiratory tract infections (LRTI). The secondary outcome was antibiotic days of therapy (DOT), with a subgroup analysis examining antibiotic use and length of stay (LOS) in patients with a pharmacist intervention. From 1/2019 to 11/2019, there were 165 alerts in 116 unique patients on antibiotics for LRTI. Pharmacists attempted interventions after 34 (20.6%) of these alerts, with narrowing the spectrum or converting to oral therapy being the most common interventions. Pharmacist interventions prevented 125 DOT in the hospital. Vancomycin was the most commonly discontinued antibiotic, discontinued in 85.3% of patients with an intervention versus 27.4% of patients without a documented intervention (p = 0.0156). LOS was similar in both groups (median 6.4 vs 7 days, p = 0.81). In conclusion, interventions driven by a CDS tool for pharmacist-driven antimicrobial stewardship in patients with a normal PCT resulted in fewer DOT and significantly higher rates of vancomycin discontinuation.


2020
Author(s): Emmanuel Chazard, Augustin Boudry, Patrick Emmanuel Beeler, Olivia Dalleur, Hervé Hubert, et al.

Background: Drug-drug interactions (DDIs) involving vitamin K antagonists (VKAs) are an important cause of in-hospital morbidity and mortality. However, the list of potential DDIs is long; implementing all of these interactions in a clinical decision support system (CDSS) results in over-alerting and alert fatigue, limiting the benefits provided by the CDSS.

Objective: To estimate the probability of an INR change for each DDI rule by reusing electronic health records (EHRs).

Methods: An 8-year, exhaustive, population-based, historical cohort study including a French community hospital, a group of Danish community hospitals, and a Bulgarian hospital. The study database included 156,893 stays. After filtering against two criteria (at least one VKA administration and at least one INR laboratory result), the final analysis covered 4,047 stays. Exposure to any of the 145 drugs known to interact with VKAs was tracked and analyzed if at least 3 patients were concerned. The main outcomes were VKA potentiation (defined as an INR ≥ 5) and VKA inhibition (defined as an INR ≤ 1.5). Groups were compared using Fisher's exact test and logistic regression, and the results were expressed as odds ratios [95% confidence intervals].

Results: The drugs known to interact with VKAs either had no statistically significant association with the outcome (47 drug administrations and 14 discontinuations) or were associated with a significant reduction in the risk of its occurrence (odds ratio < 1 for 18 administrations and 21 discontinuations).

Conclusions: The probabilities of outcomes obtained were not those expected on the basis of our current body of pharmacological knowledge. The results do not cast doubt on our current pharmacological knowledge per se, but they do challenge the commonly accepted idea that this knowledge alone should define when a DDI alert is displayed. Real-life probabilities should also be considered when CDSSs filter DDI alerts, as proposed in SPC-CDSS (statistically prioritized and contextualized CDSS). However, these probabilities may differ from one hospital to another and should therefore probably be calculated locally.
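The group comparison described above hinges on Fisher's exact test for 2x2 tables of outcome counts in exposed versus unexposed stays. A stdlib-only sketch of that test, using invented counts (this is not the study's data, just an illustration of the method):

```python
from math import comb

def fisher_exact_two_sided(a: int, b: int, c: int, d: int) -> float:
    """Two-sided Fisher's exact p-value for the 2x2 table [[a, b], [c, d]]."""
    n, row1, col1 = a + b + c + d, a + b, a + c

    def pmf(x: int) -> float:
        # Hypergeometric probability of x successes in the top-left cell.
        return comb(col1, x) * comb(n - col1, row1 - x) / comb(n, row1)

    p_obs = pmf(a)
    lo, hi = max(0, row1 + col1 - n), min(row1, col1)
    # Sum the probabilities of all tables at least as extreme as observed.
    return sum(pmf(x) for x in range(lo, hi + 1) if pmf(x) <= p_obs * (1 + 1e-9))

# Hypothetical counts: [INR >= 5, INR < 5] for exposed vs. unexposed stays.
a, b, c, d = 12, 88, 30, 270
odds_ratio = (a * d) / (b * c)        # sample odds ratio
print(f"OR = {odds_ratio:.2f}, p = {fisher_exact_two_sided(a, b, c, d):.3f}")
```

The study's actual analysis also used logistic regression to adjust for covariates, which this unadjusted sketch does not attempt.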


2014 · Vol 35 (9) · pp. 1147-1155
Author(s): Charles A. Baillie, Mika Epps, Asaf Hanish, Neil O. Fishman, Benjamin French, et al.

Objective: To evaluate the usability and effectiveness of a computerized clinical decision support (CDS) intervention aimed at reducing the duration of urinary tract catheterization.

Design: Retrospective cohort study.

Setting: Academic healthcare system.

Patients: All adult patients admitted from March 2009 through May 2012.

Intervention: A CDS intervention was integrated into a commercial electronic health record. Providers were prompted at order entry to specify the indication for urinary catheter insertion. On the basis of the indication chosen, providers were alerted to reassess the need for the urinary catheter if it was not removed within the recommended time. Three time periods were examined: baseline, after implementation of the first intervention (stock reminder), and after a second iteration (homegrown reminder). The primary endpoint was the usability of the intervention, measured as the proportion of reminders through which providers submitted a remove-urinary-catheter order. Secondary endpoints were the urinary catheter utilization ratio and the rate of hospital-acquired catheter-associated urinary tract infections (CAUTIs).

Results: The first intervention displayed limited usability, with 2% of reminders resulting in a remove order. Usability improved to 15% with the revised reminder. The catheter utilization ratio declined over the 3 time periods (0.22, 0.20, and 0.19, respectively; P < .001), as did CAUTIs per 1,000 patient-days (0.84, 0.70, and 0.51, respectively; P < .001).

Conclusions: A urinary catheter removal reminder system was successfully integrated within a healthcare system's electronic health record. The usability of the reminder was highly dependent on its user interface, with a homegrown version of the reminder having a higher impact than a stock reminder.

Infect Control Hosp Epidemiol 2014;35(9):1147-1155

