PIVC Best Practices: A Path to Performance Improvement

Author(s):  
Erin Davidson ◽  
Prachi Arora

Abstract Background: Insertion of peripheral vascular access devices (PIVC) is fundamental to patient care and may affect patient outcomes. Baseline data on PIVC insertions at a large medical center revealed that catheters required multiple insertion attempts; catheter hubs were manipulated to place extension sets, increasing the risk of complications; dwell times did not meet current standards; nurses experienced blood-exposure risk; and overall compliance with the hospital documentation policy was suboptimal. A 3-phase quality improvement project was conducted to address these concerns. Methods: In Phase 1, an assessment of the current state of PIVC insertion and care was conducted using a mixed-methods approach consisting of an observational audit of insertion and maintenance practices and retrospective chart reviews. In Phase 2, PIVC policies and practices were updated to reflect current standards, a new advanced-design PIVC device was adopted, and education was provided to all staff. In Phase 3, the impact of these changes on key PIVC measures was assessed 1 year later. Results: The analysis found several improvements following implementation of an integrated IV catheter system: the first-stick success rate increased from 73% to 84%, staff blood exposure was reduced from 46.67% to 0% (P = .01), improper securement of PIVC catheters was reduced from 11% to 0% (P = .002), and the documentation compliance rate increased from 68% to 80%. The median PIVC dwell time doubled (from 2 days to 4 days). Conclusion: Changes to policy, practices, and products, plus education, can improve PIVC first-stick success, dwell time, documentation, and staff safety.
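The pre/post comparisons above are reported as proportions with P values, but the abstract does not name the statistical test used. Below is a minimal sketch of how such a comparison could be checked, assuming a two-sided Fisher's exact test on a two-by-two table; the counts are illustrative and are not the study's denominators.

```python
# Hypothetical sketch: testing a pre/post change in an event proportion
# (e.g., staff blood exposure) with Fisher's exact test. The abstract does
# not state which test was used; the counts below are illustrative only.
from scipy.stats import fisher_exact

pre_events, pre_total = 7, 15    # illustrative pre-intervention counts
post_events, post_total = 0, 20  # illustrative post-intervention counts

table = [
    [pre_events, pre_total - pre_events],
    [post_events, post_total - post_events],
]
odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"p = {p_value:.3f}")
```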

2021 ◽  
pp. 089719002110222
Author(s):  
Bailey E Eason ◽  
Tyler A Vest ◽  
Katherine D. Mieure ◽  
Danielle Neal ◽  
Jennifer Tryon

Purpose: Controlled substances management is highly regulated and requires institutions to have processes in place to maintain a closed loop. This study was conducted to comprehensively evaluate the current state of controlled substances management, propose optimization opportunities, and implement steps to align the medication use process (MUP) with a defined desired state. Methods: This evaluation was conducted in 2 phases. In phase 1, the current state of controlled substances management was assessed in order to develop a gap analysis tool and a failure mode and effects analysis (FMEA). In phase 2, a work group was assembled to address opportunities within the FMEA. The work group prioritized opportunities using the risk priority number (RPN) and formulated action steps to align processes with the defined desired state. Results: Through the literature evaluation, a desired state consisting of 86 segments was defined and compared against the gap analysis tool. Direct observation of the MUP allowed for the development of 13 process maps depicting the current state. Of the 86 segments, the study institution had a compliance rate of 62%. The remaining 38% corresponded to 55 actionable process opportunities that were included in the FMEA. To date, 31 of the 55 (56%) opportunities have been successfully addressed by the work group. Conclusion: Use of direct observation to formulate a gap analysis tool and FMEA is an effective modality for evaluating controlled substances processes. These tools allow pharmacy departments to identify and prioritize opportunities to optimize controlled substances management within an academic medical center.
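The abstract does not spell out how the RPN was computed; in conventional FMEA it is the product of severity, occurrence, and detectability ratings. A minimal sketch of scoring and ranking failure modes on that assumption follows; the failure modes and ratings are illustrative, not taken from the study.

```python
# Hypothetical FMEA scoring sketch: rank failure modes by risk priority number
# (RPN = severity x occurrence x detectability, each rated 1-10). The failure
# modes and ratings below are illustrative, not taken from the study.
from dataclasses import dataclass

@dataclass
class FailureMode:
    description: str
    severity: int       # 1 (minor) to 10 (catastrophic)
    occurrence: int     # 1 (rare) to 10 (frequent)
    detectability: int  # 1 (certain detection) to 10 (undetectable)

    @property
    def rpn(self) -> int:
        return self.severity * self.occurrence * self.detectability

modes = [
    FailureMode("Waste not witnessed at bedside", 7, 5, 6),
    FailureMode("Discrepancy not resolved within 24 h", 6, 4, 5),
    FailureMode("Override pull without an order", 8, 3, 4),
]

# Highest-RPN opportunities would be addressed first by the work group.
for m in sorted(modes, key=lambda m: m.rpn, reverse=True):
    print(f"RPN {m.rpn:4d}  {m.description}")
```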


Author(s):  
Ghamar Bitar ◽  
Anthony Sciscione

Objective Despite the lack of evidence to support its efficacy, activity restriction is one of the most commonly prescribed interventions for the prevention of preterm birth. We have a departmental policy against the use of activity restriction, but many practitioners still prescribe it in an effort to prevent preterm birth. We sought to evaluate the rate of prescription of, and compliance with, activity restriction during pregnancy to prevent preterm birth. Study Design This was a single-site retrospective questionnaire study at a tertiary care, academically affiliated medical center. Women with a history of preterm delivery or short cervix were included. Once patients were identified, each patient was contacted and administered a questionnaire. We assessed the rates of activity restriction prescription and compliance. Secondary outcomes included details regarding activity restriction and treatment in pregnancy. Continuous variables were compared with the t-test and categorical variables with the chi-square test. A value of p < 0.05 was considered significant. Results Among the 52 women who responded to the questionnaire, 18 reported being placed on activity restriction by a physician and 1 self-prescribed activity restriction, giving a rate for our primary outcome of 19 of 52 (36.5%). All women reported compliance with prescribed activity restriction (100%). Gestational age at delivery was not different in women placed on activity restriction. Conclusion This questionnaire suggests that approximately one in three high-risk women were placed on activity restriction during their pregnancy despite a departmental policy against its use. The 100% compliance rate in patients placed on activity restriction is a strong reminder of the impact physicians' prescribing patterns can have on patients.


2021 ◽  
Vol 8 (Supplement_1) ◽  
pp. S182-S182
Author(s):  
Xue Fen Valerie Seah ◽  
Yue Ling Rina Ong ◽  
Wei Ming Cedric Poh ◽  
Shahul Hameed Mohamed Siraj ◽  
Kai-Qian Kam ◽  
...  

Abstract Background Antimicrobial stewardship programs (ASP) aim to improve appropriate antimicrobial use. Post-operative antibiotics are generally not necessary, especially in women without risk factors for surgical site infection (SSI) such as obesity. Few studies have described the impact of ASP interventions on patient outcomes, especially in unique populations such as obstetrics. This study aimed to evaluate the impact of ASP interventions on post-elective caesarean section (eLSCS) oral antibiotic prophylaxis use and patient outcomes, including SSI rates. Methods This pre-post quasi-experimental study was conducted over 9 months (2 months pre- and 7 months post-intervention) in all women admitted for eLSCS in our institution. Interventions included dissemination of an eLSCS surgical prophylaxis guideline, which recommended a single antibiotic dose within 60 minutes before skin incision; post-eLSCS oral antibiotics were actively discouraged in those without SSI risk factors. This was followed by ASP intervention notes (phase 1) for 3 months, an additional phone call to the ward team for the next 7 months (phase 2), and speaking to the operating consultant during phase 3 (the next 6 months). The primary outcome was the post-operative oral antibiotic prescription rate. Secondary outcomes included rates of 30-day post-operative SSI. Results A total of 1751 women were reviewed. Appropriateness of pre-operative antibiotic prophylaxis was 98% in our institution. There were 244 women pre-intervention, 274 in post-intervention phase 1, 658 in phase 2 and 575 in phase 3. The pre-intervention post-eLSCS antibiotic prescribing rate was 82% (200), which reduced significantly post-intervention to 54% (148) in phase 1, 50% (331) in phase 2 and 39% (226) in phase 3 (p < 0.001). There was no significant difference in post-operative SSI pre- versus post-intervention (0.8%, 2 of 242 vs. 1.9%, 28 of 1479, p = 0.420) or between those who received post-operative oral antibiotics and those who did not (1.9%, 17 of 905 vs. 1.5%, 13 of 846, p = 0.582). Conclusion ASP interventions can reduce post-eLSCS antibiotic prophylaxis rates without adversely impacting patient safety. Disclosures All Authors: No reported disclosures


2021 ◽  
Author(s):  

In 2014, the Australian Council for Educational Research (ACER) and the Australian Government's Department of Foreign Affairs and Trade (DFAT) established a partnership under the Global Education Monitoring (GEM) Centre. Since then, there have been two funding periods: Phase 1 from 2014–2017 and Phase 2 from 2017–2020. Phase 3 will cover 2020–2023. This report documents the completion of Phase 2 funding and describes the shared priorities of DFAT and ACER through the GEM Centre, followed by the objectives and key outcomes of the work program during this period. The outcomes and lessons learned, together with findings from the GEM Centre mid-term review (MTR) in 2019, are considered in relation to the impact and sustainability of the ACER–DFAT partnership. The MTR validated the overall success of the GEM Centre and identified areas for further development, specifically to improve the effectiveness of the partnership. This report concludes with a brief outlook on how these developments will be addressed under Phase 3 of the GEM Centre.


2021 ◽  
Vol 39 (15_suppl) ◽  
pp. e18780-e18780
Author(s):  
Mike Gart ◽  
Hinco J. Gierman ◽  
Daniel P. Petro ◽  
Rushir J. Choksi ◽  
Prateesh Varughese ◽  
...  

Background: In August 2019, Integra Connect (IC) partnered with the University of Pittsburgh Medical Center (UPMC) on a quality improvement (QI) initiative to improve outcomes in patients with stage 3 and 4 NSCLC. This report details the findings and interventions in the unresectable stage 3 cohort of the QI. In the PACIFIC trial (Antonia et al. NEJM 2017), the addition of durvalumab (D) after completion of CRT in stage 3 patients who had not progressed showed significant progression-free survival and overall survival (OS) benefit, leading to Food and Drug Administration approval on 2/16/2018 in this setting. An update (Gray et al. Thoracic Oncol 2020), reported on 10/14/2019, noted superior OS in patients randomized to D within 1-14 days post-CRT vs. those randomized at 15-42 days (HR 0.43 vs. 0.79). Data suggest that CRT renders tumors more responsive to immunotherapy (McCall et al. Clin Can Res 2018). As part of the QI, we explored whether the time from CRT to D (TTT) could be shortened. Methods: From the UPMC and IC real-world data (RWD) databases, we identified 182 patients with stage 3 unresectable NSCLC treated with CRT between 2/16/18 (D approval) and 11/16/20 for manual chart abstraction. We calculated the TTT from the latest day of radiation or chemotherapy to the first D dose. Time-to-scan (TTS) used a similar methodology; if post-CRT scan data were not found, those patients were excluded from the TTS analysis. We captured caregiver perception with surveys and used RWD to determine the proportion of eligible patients treated with D, categorizing the data into 3 successive time periods: Phase 1 (240 days): 2/16/18 approval of D to the Gray update on 10/14/19; Phase 2 (321 days): 10/15/19 to the physician leadership intervention on 8/31/20; Phase 3 (76 days): 9/1/20 to 11/16/20. Patients who started CRT after 11/16/20 were excluded from Phase 3 to allow up to 2 months to start D. Our plan included baseline and ongoing monitoring of metrics, complemented by physician leadership intervention to address identified gaps in care. Results: The median age of the 182 patients was 68 years (range 46-87), and 60% were male. Of eligible patients, 121 (66.5%) received at least 1 dose of D. Median TTS improved by 16 days from Phase 1 to Phase 3, while TTT concomitantly improved by 17 days (see Table). Conclusions: This QI resulted in simultaneous shortening of TTS and TTT following physician intervention, establishing TTS as a key potential driver of TTT, which ultimately may result in improved OS. Doing so required replacing the traditional paradigm of imaging 4-6 weeks post-CRT to capture maximal response with early imaging aimed at confirming that no progression had occurred. This, as well as the proportion treated with D, treatment duration, and any subsequent treatments that might indicate relapse, continues to be monitored in a real-time dashboard. [Table: see text]
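TTT is defined in the Methods as the interval from the latest day of radiation or chemotherapy to the first durvalumab dose. A minimal sketch of that calculation on abstracted chart dates follows; the function, field names, and example dates are hypothetical.

```python
# Hypothetical sketch of the time-to-treatment (TTT) calculation described in
# the Methods: days from the end of CRT (the later of the last radiation and
# last chemotherapy dates) to the first durvalumab dose. The function name and
# example dates are illustrative.
from datetime import date
from typing import Optional

def time_to_treatment(last_radiation: date,
                      last_chemo: date,
                      first_durvalumab: Optional[date]) -> Optional[int]:
    """Return TTT in days, or None if durvalumab was never started."""
    if first_durvalumab is None:
        return None
    crt_end = max(last_radiation, last_chemo)
    return (first_durvalumab - crt_end).days

# Example with illustrative dates only
print(time_to_treatment(date(2020, 3, 2), date(2020, 2, 28), date(2020, 3, 20)))  # 18
```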


2016 ◽  
Vol 24 (0) ◽  
Author(s):  
Carolina Justus Buhrer Ferreira Neto ◽  
Caroline Koga Plodek ◽  
Franciny Kossemba Soares ◽  
Rayza Assis de Andrade ◽  
Fernanda Teleginski ◽  
...  

Abstract Objective: to analyze the impact of guidelines regarding errors in medications prescribed for administration through enteral tubes. Method: quantitative study, in three phases, undertaken in the internal medicine and neurology wards and an intensive care unit of a general teaching hospital. In Phase 1, the following were developed: a protocol for dilution, unit-dose repackaging, and administration via enteral tubes for 294 medications; a decision flowchart; and standard operating procedures for dilution and unit-dose repackaging of oral pharmaceutical forms and for administration of medications through enteral tubes. In Phase 2, errors in 872 medications prescribed through enteral tubes, in 293 prescriptions for patients receiving inpatient treatment between March and June, were investigated. This was followed by training of the teams in relation to the guidelines established. In Phase 3, errors and pharmaceutical interventions in 945 medications prescribed through enteral tubes, in 292 prescriptions of patients receiving inpatient treatment between August and September, were investigated prospectively. The data, collected with a structured questionnaire, were compiled in the Microsoft Office Excel(r) program, and frequencies were calculated. Results: 786 errors were observed, 63.9% (502) in Phase 2 and 36.1% (284) in Phase 3. In Phase 3, a reduction was observed in the frequency of prescription of medications delivered via enteral tubes, of contraindicated medications, and of medications for which no information was available. Conclusion: the guidelines and pharmaceutical interventions were decisive in preventing errors involving medications delivered through enteral tubes.


2012 ◽  
Vol 21 (1) ◽  
pp. e1-e11 ◽  
Author(s):  
Gail Gesin ◽  
Brittany B. Russell ◽  
Andrew P. Lin ◽  
H. James Norton ◽  
Susan L. Evans ◽  
...  

Background The impact of using a validated delirium screening tool and different levels of education on surgical-trauma intensive care unit (STICU) nurses' knowledge about delirium is unclear. Objectives To measure the impact of using the Intensive Care Delirium Screening Checklist (ICDSC), with or without a multifaceted education program, on STICU nurses' knowledge and perceptions of delirium and their ability to evaluate it correctly. Methods The knowledge and perceptions of subject nurses about delirium, and agreement between the independent assessments of delirium by the subject nurse and by a validated judge (who always used the ICDSC), were compared across 3 phases. Phase 1: No delirium screening tool and no education. Phase 2: ICDSC and minimal education (ie, ICDSC validation study only). Phase 3: ICDSC and multifaceted education (ie, pharmacist-led didactic lecture, Web-based module, and nurse-led bedside training). Results Nurses' knowledge (mean [SD] score out of 10 points) was similar (P = .08) in phase 1 (6.1 [1.4]) and phase 2 (6.5 [1.4]) but was greater (P = .001) in phase 3 (8.2 [1.4]). Agreement between nurses and the validated judge in the assessment of delirium increased from phase 1 (κ = 0.40) to phase 2 (κ = 0.62) to phase 3 (κ = 0.74). Nurses perceived use of the ICDSC as improving their ability to recognize delirium. Conclusions Use of a multifaceted education program improves both nurses' knowledge about delirium and their perceptions about its recognition. Implementation of the ICDSC improves the ability of STICU nurses to evaluate delirium correctly.
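Agreement between each subject nurse and the validated judge is summarized with κ. A minimal sketch of that calculation, assuming paired binary (delirium present/absent) assessments and using scikit-learn's cohen_kappa_score, is shown below; the ratings are illustrative.

```python
# Hypothetical sketch: inter-rater agreement (Cohen's kappa) between a subject
# nurse's delirium assessment and the validated judge's ICDSC-based assessment.
# 1 = delirium present, 0 = absent; the paired ratings below are illustrative.
from sklearn.metrics import cohen_kappa_score

nurse_ratings = [1, 0, 0, 1, 1, 0, 1, 0, 0, 1, 0, 1]
judge_ratings = [1, 0, 1, 1, 1, 0, 1, 0, 0, 1, 1, 1]

kappa = cohen_kappa_score(nurse_ratings, judge_ratings)
print(f"kappa = {kappa:.2f}")  # computed separately for each study phase
```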


2019 ◽  
Vol 6 (10) ◽  
Author(s):  
Tsubasa Akazawa ◽  
Yoshiki Kusama ◽  
Haruhisa Fukuda ◽  
Kayoko Hayakawa ◽  
Satoshi Kutsuna ◽  
...  

Abstract Objective We implemented a stepwise antimicrobial stewardship program (ASP). This study evaluated the effect of each intervention and the overall economic impact on carbapenem (CAR) use. Method Carbapenem days of therapy (CAR-DOT) were calculated to assess the effect of each intervention, and antipseudomonal DOT were calculated to assess changes in the use of broad-spectrum antibiotics. We carried out segmented regression analysis of interrupted time series data across 3 periods: Phase 1 (infectious disease [ID] consultation service only), Phase 2 (adding monitoring and e-mail feedback), and Phase 3 (adding postprescription review and feedback [PPRF] led by ID specialist doctors and pharmacists). We also estimated cost savings over the study period due to decreased CAR use. Results The median monthly CAR-DOT, per 100 patient-days, during Phase 1, Phase 2, and Phase 3 was 5.46, 3.69, and 2.78, respectively. CAR-DOT decreased significantly immediately after the start of Phase 2, but no major downward trend was observed during that period; conversely, although no immediate change was apparent after Phase 3 started, CAR-DOT decreased significantly over that period. Furthermore, the monthly DOT of 3 alternative antipseudomonal agents also decreased significantly over the study period, but the incidence of antimicrobial resistance did not decrease. Cost savings over the study period due to decreased CAR use were estimated at US $150 000. Conclusions Adding PPRF to the conventional ASP may accelerate antimicrobial stewardship. Our CAR stewardship program has had positive results, and implementation is ongoing.
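The segmented (interrupted time series) regression distinguishes an immediate level change from a change in slope at each phase boundary. Below is a minimal sketch for a single boundary (the start of Phase 2), fitting ordinary least squares with statsmodels on simulated monthly CAR-DOT values; the study's actual model covered all three phases, and the data here are illustrative only.

```python
# Hypothetical interrupted-time-series sketch for monthly CAR-DOT: estimate the
# immediate level change and the slope change at the start of Phase 2. The data
# below are simulated; the study fit all three phases jointly.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
months = np.arange(24)                        # 24 months of monitoring
phase2 = (months >= 12).astype(int)           # Phase 2 starts at month 12
car_dot = (5.5 - 0.02 * months                # pre-existing secular trend
           - 1.5 * phase2                     # simulated level drop at Phase 2
           - 0.05 * phase2 * (months - 12)    # simulated slope change
           + rng.normal(0, 0.3, months.size)) # noise, per 100 patient-days

df = pd.DataFrame({
    "time": months,
    "phase2": phase2,                              # immediate level change
    "time_since_phase2": phase2 * (months - 12),   # trend change after Phase 2
    "car_dot": car_dot,
})

model = smf.ols("car_dot ~ time + phase2 + time_since_phase2", data=df).fit()
print(model.params)  # phase2 = level change; time_since_phase2 = slope change
```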


2021 ◽  
Vol 11 (24) ◽  
pp. 12004
Author(s):  
Shuo-Chen Chien ◽  
Yen-Po Chin ◽  
Chang-Ho Yoon ◽  
Chun-You Chen ◽  
Chun-Kung Hsu ◽  
...  

Alert dwell time, defined as the time elapsed from the generation of an interruptive alert to its closure, has rarely been used to describe the time required by clinicians to respond to interruptive alerts. Our study aimed to develop a tool to retrieve alert dwell times from a homegrown computerized physician order entry (CPOE) system and to conduct exploratory analysis of the impact of various alert characteristics on alert dwell time. Additionally, we compared this impact across professional groups. With these aims, a dominant window detector was developed in the Go (Golang) programming language and implemented to collect all alert dwell times from the homegrown CPOE system of a 726-bed Taiwanese academic medical center from December 2019 to February 2021. Overall, 3,737,697 interruptive alerts were collected. Correlation analysis was performed for alerts corresponding to the 100 most frequent alert categories. Our results showed a negative correlation (ρ = −0.244, p = 0.015) between the number of alerts and alert dwell times. Alert dwell times were strongly correlated between professional groups (physician vs. nurse, ρ = 0.739, p < 0.001). A tool that retrieves alert dwell times can provide important insights to hospitals attempting to improve clinical workflows.
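The correlation coefficients are reported as ρ, but the abstract does not name the method, so Spearman's rank correlation is assumed here. A minimal sketch of the per-category analysis on simulated data:

```python
# Hypothetical sketch: rank correlation between the number of alerts in each
# alert category and that category's median dwell time (seconds). Spearman's
# rho is assumed (the abstract reports rho values), and the data are simulated.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(1)
n_categories = 100
alert_counts = rng.integers(100, 100_000, n_categories)
# Simulate a weak negative association: frequent alerts tend to be closed faster.
median_dwell_s = 6.0 - 0.4 * np.log10(alert_counts) + rng.normal(0, 1.0, n_categories)

rho, p_value = spearmanr(alert_counts, median_dwell_s)
print(f"rho = {rho:.3f}, p = {p_value:.3f}")
```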


PLoS ONE ◽  
2021 ◽  
Vol 16 (9) ◽  
pp. e0257639
Author(s):  
Kelly K. O’Brien ◽  
Aileen M. Davis ◽  
Soo Chan Carusone ◽  
Lisa Avery ◽  
Ada Tang ◽  
...  

Purpose Our aim was to examine the impact of a community-based exercise (CBE) intervention on cardiorespiratory fitness, cardiovascular health, strength, flexibility, and physical activity outcomes among adults living with HIV. Methods We conducted a longitudinal intervention study with community-dwelling adults living with HIV in Toronto, Canada. We measured cardiopulmonary fitness (V̇O2peak (primary outcome), heart rate, blood pressure), strength (grip strength, vertical jump, back extension, push-ups, curl-ups), flexibility (sit-and-reach test), and self-reported physical activity bimonthly across three phases. Phase 1 was baseline monitoring (8 months); Phase 2 was the CBE intervention (6 months), during which participants were asked to exercise (aerobic, strength, balance, and flexibility training) for 90 minutes, 3 times/week, with weekly supervised coaching at a community-based fitness centre; and Phase 3 was follow-up (8 months), during which participants were expected to continue thrice-weekly exercise independently. We used segmented regression (adjusted for baseline age and sex) to assess the change in trend (slope) among phases. Our main estimates of effect were the estimated changes in slope, relative to baseline values, over the 6-month CBE intervention. Results Of the 108 participants who initiated Phase 1, 80 (74%) started the intervention, 67/80 (84%) completed it, and 52/67 (77%) completed the study. Most participants were male (87%), with a median age of 51 years (interquartile range (IQR): 45, 59). Participants reported a median of 4 concurrent health conditions in addition to HIV (IQR: 2, 7) and attended a median of 18/25 (72%) weekly supervised sessions. The change in V̇O2peak attributed to the six-month Phase 2 CBE intervention was 0.56 ml/kg/min (95% confidence interval (CI): -1.27, 2.39). Significant effects of the intervention were observed for systolic blood pressure (-5.18 mmHg; 95% CI: -9.66, -0.71), push-ups (2.30 additional push-ups; 95% CI: 0.69, 3.91), curl-ups (2.89 additional curl-ups; 95% CI: 0.61, 5.17), and the sit-and-reach test (1.74 cm; 95% CI: 0.21, 3.28). More participants engaged in self-reported strength (p<0.001) and flexibility (p = 0.02) physical activity at the end of the intervention. During Phase 3 follow-up, there was a significant reduction in the trend of benefits observed during the intervention phase for systolic blood pressure (1.52 mmHg/month; 95% CI: 0.67, 2.37) and the sit-and-reach test (-0.42 cm/month; 95% CI: -0.68, -0.16). Conclusion Adults living with HIV who engaged in this six-month CBE intervention demonstrated inconclusive results in relation to V̇O2peak and potential improvements in other outcomes of cardiovascular health, strength, flexibility, and self-reported physical activity. Future research should consider features tailored to promote uptake and sustained engagement in independent exercise among adults living with HIV. ClinicalTrials.gov Identifier NCT02794415. https://clinicaltrials.gov/ct2/show/record/NCT02794415.
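The phase-wise analysis fits a segmented slope model to repeated bimonthly measurements, adjusted for baseline age and sex. One way such a model could be specified is sketched below, assuming a linear mixed model with phase-specific time slopes and a random intercept per participant; the abstract does not state the exact model form, and all data and variable names here are simulated for illustration.

```python
# Hypothetical sketch: piecewise (segmented) slope model for repeated V̇O2peak
# measurements across three phases, adjusted for baseline age and sex, with a
# random intercept per participant. All data below are simulated.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n, visits = 60, 11                              # participants, bimonthly visits
time = np.tile(np.arange(0, 22, 2), n)          # months 0..20 for each participant
pid = np.repeat(np.arange(n), visits)
# Months accrued within each phase (phase 1: 0-8, phase 2: 8-14, phase 3: 14-20)
t1 = np.clip(time, 0, 8)
t2 = np.clip(time - 8, 0, 6)
t3 = np.clip(time - 14, 0, 8)
age = np.repeat(rng.normal(51, 8, n), visits)   # baseline age, constant per person
sex = np.repeat(rng.integers(0, 2, n), visits)  # 0/1 indicator
vo2 = (22 + 0.0 * t1 + 0.1 * t2 - 0.05 * t3 - 0.05 * (age - 51)
       + np.repeat(rng.normal(0, 2, n), visits)  # between-person variation
       + rng.normal(0, 1, n * visits))           # within-person noise

df = pd.DataFrame({"id": pid, "vo2peak": vo2, "t_phase1": t1, "t_phase2": t2,
                   "t_phase3": t3, "age": age, "sex": sex})

model = smf.mixedlm("vo2peak ~ t_phase1 + t_phase2 + t_phase3 + age + sex",
                    data=df, groups=df["id"]).fit()
print(model.params[["t_phase1", "t_phase2", "t_phase3"]])  # phase-specific slopes
```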

