Improved empiric antibiotic prescribing for common infectious disease diagnoses using order sets with built-in clinical decision support in the emergency department

Author(s):  
Roslyn M. Seitz ◽  
Zanthia Wiley ◽  
Christele F. Francois ◽  
Tim P. Moran ◽  
Jonathan D. Rupp ◽  
...  
2019 ◽  
Vol 6 (Supplement_2) ◽  
pp. S409-S409

Author(s):  
Julia K Yarahuan ◽  
Brandon Hunter ◽  
Devin Nadar ◽  
Nitin Gujral ◽  
Andrew M Fine ◽  
...  

Abstract Background Institutional antibiograms play a key role in antimicrobial stewardship and may provide a venue for clinical decision support. Our institution recently transitioned our paper antibiogram to an enhanced digital antibiogram with antibiotic recommendations for common pediatric infections. The objectives of this study were (1) to improve the accessibility of our institutional antibiogram through a digital platform and (2) to improve trainee confidence when selecting empiric antibiotics by integrating clinical decision support. Methods The digital antibiogram was developed and evaluated at a tertiary children’s hospital. The tool was developed iteratively over one year by our innovation and digital health accelerator, with recommendations for empiric antibiotic selection provided by experts in pediatric infectious diseases (see Figure 1 for example). Usability pilot testing was performed with a group of ordering providers, and the tool was released internally in October 2018. A paired pre- and post-implementation survey evaluated residents’ perceptions of the accessibility of the paper vs. digital antibiogram and their confidence when selecting empiric antibiotics. Data were analyzed by Fisher exact test. Results During the 3 months after release, the digital antibiogram was accessed 1014 times, with similar proportions of views for susceptibility data, dosing, and empiric antibiotic recommendations. Of the 31 pediatric residents who responded to both pre- and post-implementation surveys, only 59% had access to a copy of the paper antibiogram. Following release of the digital antibiogram, residents referred to antibiotic susceptibilities more frequently (P < 0.05, Figure 2) and more frequently reported confidence in selecting the correct antibiotic dose (P < 0.01, Figure 3). See Figure 4 for a dosing recommendation example. Conclusion Providing antibiotic susceptibility and dosing recommendations digitally improved accessibility and resident confidence during antibiotic prescribing. Our digital tool provides a successful platform for displaying the antibiotic data and recommendations that enable appropriate antibiotic use. Disclosures All authors: No reported disclosures.
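The pre/post comparison described above can be reproduced in miniature with Fisher's exact test on a 2×2 table of confidence responses. The sketch below is purely illustrative; the counts are hypothetical placeholders, not the study's survey data.

```python
# Minimal sketch of a Fisher exact test comparing the proportion of residents
# reporting confidence before vs. after release of the digital antibiogram.
# The counts below are hypothetical, not the study's data.
from scipy.stats import fisher_exact

#              [confident, not confident]
pre_counts  = [10, 21]   # hypothetical pre-implementation responses (n = 31)
post_counts = [22, 9]    # hypothetical post-implementation responses (n = 31)

odds_ratio, p_value = fisher_exact([pre_counts, post_counts])
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.4f}")
```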


Author(s):  
Emily S. Patterson ◽  
Giavanna N. DiLoreto ◽  
Rohith Vanam ◽  
Erinn Hade ◽  
Courtney Hebert

Human factors engineering can enhance software usefulness and usability. We describe a multi-method approach to improving clinical decision support (CDS) for antibiotic stewardship. We employed a heuristic review to generate recommendations for improving the usability of a prototype CDS to support empiric antibiotic prescribing in the hospital setting. We then engaged in a design improvement cycle in collaboration with software programmers, which resulted in additional enhancements to our prototype. Finally, we used the revised prototype during three walkthrough demonstration interviews with physician and pharmacist subject matter experts. These walkthrough interviews generated recommendations for improving the interface, the functionality, and the tailoring of the tool for different groups of users. We discuss how common elements of these recommendations apply to models for using clinical decision support in general.


2020 ◽  
Vol 1 (3) ◽  
pp. 214-221
Author(s):  
Foster R. Goss ◽  
Kelly Bookman ◽  
Michelle Barron ◽  
Daniel Bickley ◽  
Brady Landgren ◽  
...  

2020 ◽  
Vol 41 (S1) ◽  
pp. s368-s368
Author(s):  
Mary Acree ◽  
Kamaljit Singh ◽  
Urmila Ravichandran ◽  
Jennifer Grant ◽  
Gary Fleming ◽  
...  

Background: Empiric antibiotic selection is challenging and requires knowledge of the local antibiogram, national guidelines, and patient-specific factors such as drug allergy and recent antibiotic exposure. Clinical decision support for empiric antibiotic selection has the potential to improve adherence to guidelines and improve patient outcomes. Methods: At NorthShore University HealthSystem, a 4-hospital, 789-bed system, an automated point-of-care decision support tool referred to as the Antimicrobial Stewardship Assistance Program (ASAP) was created for empiric antibiotic selection for 4 infectious syndromes: pneumonia, skin and soft-tissue infections, urinary tract infection, and intra-abdominal infection. The tool draws input data from the electronic health record, which can be modified by any user. Using an algorithm built from electronic health record data, antibiogram data, and national guidelines, the tool produces an antibiotic recommendation that can be ordered via a link to order entry. If the tool identifies a patient with a high likelihood of a multidrug-resistant infection, a consultation with an infectious diseases specialist is recommended. Utilization of the tool and associated outcomes were evaluated from July 2018 to May 2019. Results: The ASAP tool was executed 790 times by 140 unique non-infectious-diseases providers. The tool was used most often for pneumonia (194 uses), followed by urinary tract infection (166 uses). The most common provider type to use the tool was an internal medicine hospitalist. The tool increased adherence to the recommended antibiotic regimen for each condition. Antibiotic appropriateness was assessed by an infectious diseases physician: antibiotics were considered appropriate when they were similar to the regimen recommended by ASAP, and inappropriate antibiotics were classified as broad or narrow. When antibiotic coverage was appropriate, hospital length of stay was statistically significantly shorter (4.8 days vs 6.8 days with broad and 7.4 days with narrow antibiotics; P < .01). No significant differences were identified in mortality or readmission. Conclusions: A clinical decision support tool in the electronic health record can improve adherence to recommended empiric antibiotic therapy, and use of appropriate antibiotics recommended by such a tool can reduce hospital length of stay. Funding: None. Disclosures: None
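The abstract describes rule-based logic: map a syndrome to a recommended regimen, adjust for allergy, and escalate to an infectious diseases consult when multidrug-resistance risk is high. The sketch below is a simplified, hypothetical illustration of that pattern; the drug choices, risk rules, and data fields are placeholders, not ASAP's actual algorithm or institutional recommendations.

```python
# Hypothetical sketch of rule-based empiric antibiotic decision support.
# Regimens and risk rules are illustrative placeholders only.
from dataclasses import dataclass

@dataclass
class Patient:
    syndrome: str                      # "pneumonia", "uti", "ssti", "intra_abdominal"
    penicillin_allergy: bool = False
    recent_antibiotics: bool = False   # antibiotic exposure in prior 90 days
    prior_mdr_isolate: bool = False    # history of a multidrug-resistant organism

# Placeholder (first-line, allergy-alternative) regimens per syndrome
EMPIRIC_REGIMENS = {
    "pneumonia":       ("ceftriaxone + azithromycin",  "levofloxacin"),
    "uti":             ("ceftriaxone",                 "aztreonam"),
    "ssti":            ("cefazolin",                   "vancomycin"),
    "intra_abdominal": ("ceftriaxone + metronidazole", "levofloxacin + metronidazole"),
}

def recommend(patient: Patient) -> str:
    """Return an empiric regimen, or advise infectious diseases consultation."""
    # High likelihood of multidrug resistance -> recommend ID consultation
    if patient.prior_mdr_isolate or patient.recent_antibiotics:
        return "High MDR risk: recommend infectious diseases consultation"
    first_line, allergy_alt = EMPIRIC_REGIMENS[patient.syndrome]
    return allergy_alt if patient.penicillin_allergy else first_line

print(recommend(Patient(syndrome="pneumonia", penicillin_allergy=True)))
```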


JAMIA Open ◽  
2021 ◽  
Vol 4 (2) ◽  
Author(s):  
Ellen Kerns ◽  
Russell McCulloh ◽  
Sarah Fouquet ◽  
Corrie McDaniel ◽  
Lynda Ken ◽  
...  

Abstract Objective To determine utilization and impacts of a mobile electronic clinical decision support (mECDS) on pediatric asthma care quality in emergency department and inpatient settings. Methods We conducted an observational study of a mECDS tool that was deployed as part of a multi-dimensional, national quality improvement (QI) project focused on pediatric asthma. We quantified mECDS utilization using cumulative screen views over the study period in the city in which each participating site was located. We determined associations between mECDS utilization and pediatric asthma quality metrics using mixed-effect logistic regression models (adjusted for time, site characteristics, site-level QI project engagement, and patient characteristics). Results The tool was offered to clinicians at 75 sites and used on 286 devices; cumulative screen views were 4191. Children’s hospitals and sites with greater QI project engagement had higher cumulative mECDS utilization. Cumulative mECDS utilization was associated with significantly reduced odds of hospital admission (OR: 0.95, 95% CI: 0.92–0.98) and higher odds of caregiver referral to smoking cessation resources (OR: 1.08, 95% CI: 1.01–1.16). Discussion We linked mECDS utilization to clinical outcomes using a national sample and controlling for important confounders (secular trends, patient case mix, and concomitant QI efforts). We found mECDS utilization was associated with improvements in multiple measures of pediatric asthma care quality. Conclusion mECDS has the potential to overcome barriers to dissemination and improve care on a broad scale. Important areas of future work include improving mECDS uptake/utilization, linking clinicians’ mECDS usage to clinical practice, and studying mECDS’s impacts on other common pediatric conditions.
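The adjusted odds ratios reported above come from mixed-effects logistic regression. As a rough, simplified stand-in, the sketch below fits an ordinary logistic regression with site-clustered standard errors and converts the coefficient to an odds ratio; the column names and simulated data are hypothetical and the model is not the authors' full specification.

```python
# Illustrative sketch: association between cumulative mECDS screen views and
# odds of hospital admission, using a logistic model with site-clustered
# standard errors (a simplification of the mixed-effects models described).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 2000
df = pd.DataFrame({
    "admitted": rng.integers(0, 2, n),            # 1 = hospital admission (hypothetical)
    "screen_views": rng.poisson(50, n),           # cumulative mECDS screen views
    "childrens_hospital": rng.integers(0, 2, n),  # site characteristic
    "site_id": rng.integers(0, 75, n),            # 75 participating sites
})

model = smf.logit("admitted ~ screen_views + childrens_hospital", data=df)
result = model.fit(cov_type="cluster", cov_kwds={"groups": df["site_id"]}, disp=False)

# Odds ratio per unit of utilization = exp(coefficient), with 95% CI
print(np.exp(result.params["screen_views"]))
print(np.exp(result.conf_int().loc["screen_views"]))
```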


2020 ◽  
Author(s):  
Nicolas Delvaux ◽  
Veerle Piessens ◽  
Tine De Burghgraeve ◽  
Pavlos Mamouris ◽  
Bert Vaes ◽  
...  

Abstract Background Inappropriate laboratory test ordering poses an important burden for healthcare. Clinical decision support systems (CDSS) have been cited as promising tools to improve laboratory test ordering behavior. The objectives of this study were to evaluate the effects of an intervention that integrated a clinical decision support service into a computerized physician order entry (CPOE) system on the appropriateness and volume of laboratory test ordering, and on diagnostic error, in primary care. Methods This study was a pragmatic, cluster-randomized, open-label, controlled clinical trial. Setting: 280 general practitioners (GPs) from 72 primary care practices in Belgium. Patients: patients aged ≥18 years with a laboratory test order for at least one of 17 indications: cardiovascular disease management, hypertension, check-up, chronic kidney disease (CKD), thyroid disease, type 2 diabetes mellitus, fatigue, anemia, liver disease, gout, suspicion of acute coronary syndrome (ACS), suspicion of pulmonary embolism, rheumatoid arthritis, sexually transmitted infections (STI), acute diarrhea, chronic diarrhea, and follow-up of medication. Interventions: the CDSS was integrated into the CPOE in the form of evidence-based order sets that suggested appropriate tests based on the indication provided by the general practitioner. Measurements: the primary outcome of the ELMO study was the proportion of appropriate tests over the total number of ordered tests and inappropriately not-requested tests. Secondary outcomes included diagnostic error, test volume, and cascade activities. Results The CDSS increased the proportion of appropriate tests by 0.21 (95% CI 0.16-0.26, p < .0001) for all tests included in the study. GPs in the CDSS arm ordered on average 7.15 fewer tests per order panel (95% CI 3.37-10.93, p = .0002). The CDSS did not increase diagnostic error: the absolute difference in proportions was a decrease of 0.66% in possible diagnostic error (95% CI: 1.4% decrease to 0.05% increase). Conclusions A CDSS in the form of order sets, integrated within the CPOE, improved the appropriateness and decreased the volume of laboratory test ordering without increasing diagnostic error. Trial Registration Clinicaltrials.gov Identifier: NCT02950142, registered on October 25, 2016. Funding source This study was funded through the Belgian Health Care Knowledge Centre (KCE) Trials Programme, agreement KCE16011.
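The order-set idea and the primary outcome can be illustrated with a small sketch: an indication maps to a suggested panel, and appropriateness is the share of appropriate ordered tests over ordered plus inappropriately omitted tests. The indications come from the abstract, but the test panels below are hypothetical examples, not the ELMO trial's actual order sets.

```python
# Conceptual sketch of order-set style decision support and of the primary
# outcome (proportion of appropriate tests). Panels are hypothetical examples.
ORDER_SETS = {
    "chronic kidney disease": ["creatinine/eGFR", "potassium", "urine albumin-creatinine ratio"],
    "type 2 diabetes mellitus": ["HbA1c", "creatinine/eGFR", "lipid panel"],
    "thyroid disease": ["TSH"],
    "fatigue": ["hemoglobin", "TSH", "glucose"],
}

def suggest_tests(indication: str) -> list[str]:
    """Return the suggested panel for an indication (empty if not covered)."""
    return ORDER_SETS.get(indication.lower(), [])

def appropriateness(ordered: set[str], recommended: set[str]) -> float:
    """Appropriate ordered tests over ordered tests plus inappropriately
    not-requested tests, mirroring the primary outcome definition above."""
    appropriate = ordered & recommended
    missed = recommended - ordered          # inappropriately not requested
    return len(appropriate) / (len(ordered) + len(missed))

panel = suggest_tests("thyroid disease")
print(panel)
print(appropriateness(ordered={"TSH", "vitamin B12"}, recommended=set(panel)))
```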

