Education and devices to prevent blood and body fluid exposures

2019 ◽  
Vol 70 (1) ◽  
pp. 38-44 ◽  
Author(s):  
S Cheetham ◽  
H Ngo ◽  
J Liira ◽  
E Lee ◽  
C Pethrick ◽  
...  

Abstract Background Healthcare workers are at risk of blood and body fluid exposures (BBFE) while delivering care to patients. Despite recent technological advances such as safety-engineered devices (SEDs), these injuries continue to occur in healthcare facilities worldwide. Aims To assess the impact of an education and SEDs workplace programme on rates of reported exposures. Methods A retrospective cohort study utilizing interrupted time series analysis to examine reported exposures between 2005 and 2015 at a 600-bed hospital in Perth, Western Australia. The hospital wards were divided into four cohorts. Results A total of 2223 records were available for analysis. The intervention was most effective for the first cohort, with significant improvements both short-term (a reduction of 12 (95% CI 7–17) incidents per 1000 full-time equivalent (FTE) hospital staff) and long-term (a reduction of 2 (CI 0.6–4) incidents per 1000 FTE per year). Less significant and less consistent impacts were observed for the other three cohorts. Overall, the intervention decreased BBFE rates at the hospital level from 19 (CI 18–20) incidents per 1000 FTE pre-intervention to 11 (CI 10–12) incidents per 1000 FTE post-intervention, a 41% reduction. No exposures resulted in a blood-borne virus infection. Conclusions The intervention was most effective in reducing exposures at a time when incidence rates were increasing. The overall effect was short-term and did not further reduce an already stabilized trend, which was likely due to the improved safety awareness and practice induced by the first cohort intervention.
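The abstract does not spell out the regression specification, but the standard single-series segmented (interrupted time series) model behind results like these has an intercept, a baseline trend, an immediate level-change term, and a trend-change term. Below is a minimal sketch in Python using simulated monthly exposure rates per 1000 FTE rather than the hospital's data, and a single interruption point rather than the staggered four-cohort rollout.

```python
# Minimal single-series segmented regression sketch (simulated data, not the study's).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n, break_point = 132, 72                      # 11 years of months, interruption at month 72
t = np.arange(n)
post = (t >= break_point).astype(int)         # immediate level-change indicator
t_since = np.where(post == 1, t - break_point, 0)  # change-in-trend term

# Simulated outcome: rising baseline rate, drop in level and flattened slope post-intervention
rate = 15 + 0.05 * t - 6 * post - 0.04 * t_since + rng.normal(0, 1.5, n)
df = pd.DataFrame({"rate": rate, "t": t, "post": post, "t_since": t_since})

fit = smf.ols("rate ~ t + post + t_since", data=df).fit()
print(fit.summary())   # 'post' ~ immediate level change, 't_since' ~ change in trend
```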

JAMIA Open ◽  
2021 ◽  
Vol 4 (3) ◽  
Author(s):  
Amber Sieja ◽  
Melanie D Whittington ◽  
Vanessa Paul Patterson ◽  
Katie Markley ◽  
Heather Holmstrom ◽  
...  

Abstract Objective We report the influence of Sprint electronic health record (EHR) training and optimization on clinician time spent in the EHR. Materials and Methods We studied the Sprint process in one academic internal medicine practice with 26 providers. Program offerings included individualized training sessions, and the ability to clean up, fix, or build new EHR tools during the 2-week intervention. EHR usage log data were available for 24 clinicians, and the average clinical full-time equivalent was 0.44. We used a quasi-experimental study design with an interrupted time series specification, with 8 months of pre- and 12 months of post-intervention data to evaluate clinician time spent in the EHR. Results We discovered a greater than 6 h per day reduction in clinician time spent in the EHR at the clinic level. At the individual clinician level, we demonstrated a time savings of 20 min per clinician per day among those who attended at least 2 training sessions. Discussion We can promote EHR time savings for clinicians who engage in robust EHR training and optimization programs. To date, programs have shown a positive correlation between participation and subjective EHR satisfaction, efficiency, or time saved. The impact of EHR training and optimization on objective time savings remains elusive. By measuring time in the EHR, this study contributes to an ongoing conversation about the resources and programs needed to decrease clinician EHR time. Conclusions We have demonstrated that Sprint is associated with time savings for clinicians for up to 6 months. We suggest that an investment in EHR optimization and training can pay dividends in clinician time saved.
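As a rough illustration of how a level change and its confidence interval can be read off such a model on a short series, here is a hypothetical sketch using simulated EHR minutes per clinician per day over 8 pre- and 12 post-intervention months; the values and break point are invented, not the practice's usage-log data.

```python
# Hypothetical short-series ITS: level change with its 95% CI (simulated data).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
month = np.arange(20)                                # 8 pre- and 12 post-intervention months
post = (month >= 8).astype(int)
t_since = np.where(post == 1, month - 8, 0)
minutes = 115 + 0.3 * month - 20 * post + rng.normal(0, 4, len(month))
df = pd.DataFrame({"minutes": minutes, "month": month, "post": post, "t_since": t_since})

fit = smf.ols("minutes ~ month + post + t_since", data=df).fit()
low, high = fit.conf_int().loc["post"]               # CI for the immediate level change
print(f"level change: {fit.params['post']:.1f} min/day (95% CI {low:.1f} to {high:.1f})")
```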


2019 ◽  
Vol 82 (06) ◽  
pp. 559-567
Author(s):  
Christina Niedermeier ◽  
Andrea Barrera ◽  
Eva Esteban ◽  
Ivana Ivandic ◽  
Carla Sabariego

Abstract Background In Germany, a new reimbursement system for psychiatric clinics was proposed in 2009 based on the § 17d KHG Psych-Entgeltsystem. Clinics have been able to implement the system voluntarily since 2013, but therapists frequently fear it might affect treatment negatively. Objectives To evaluate whether the new system has a negative impact on treatment success by analysing routinely collected data from a Bavarian clinic. Material and methods Aggregated data from 1760 patients treated between 2007 and 2016 were analysed with segmented regression analysis of interrupted time series to assess the effects of the system on treatment success, operationalized with three outcome variables. A negative change in level after a lag period was hypothesized. The robustness of the results was tested with sensitivity analyses. Results The percentage of patients with treatment success tended to increase after introduction of the new system, but no significant change in level was observed. The sensitivity analyses corroborated the results for two outcomes; for the third outcome, the positive change in level became significant when the intervention point was shifted. Conclusions Our initial hypothesis is not supported. However, the sensitivity analyses disclosed uncertainties, and our study has limitations, such as a short observation time post-intervention. Results are not generalizable, as data from a single clinic were analysed. Nevertheless, we show the importance of collecting and analysing routine data to assess the impact of policy changes on patient outcomes.
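The sensitivity analysis described above, shifting the assumed interruption point, can be sketched as repeated fits of the level-change model at several candidate break points. The quarterly treatment-success percentages and candidate dates below are simulated, not the clinic's routine data.

```python
# Illustrative sensitivity check: re-estimate the level change at shifted break points.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
t = np.arange(40)                                  # 40 quarters, roughly 2007-2016
success = 55 + 0.2 * t + 3 * (t >= 26) + rng.normal(0, 2, len(t))   # % with treatment success
df = pd.DataFrame({"success": success, "t": t})

for bp in (24, 26, 28):                            # candidate interruption quarters (lag shifts)
    df["post"] = (df["t"] >= bp).astype(int)
    df["t_since"] = np.where(df["post"] == 1, df["t"] - bp, 0)
    fit = smf.ols("success ~ t + post + t_since", data=df).fit()
    print(f"break={bp}: level change={fit.params['post']:.2f}, p={fit.pvalues['post']:.3f}")
```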


2020 ◽  
Author(s):  
Mooketsi Molefi ◽  
John Tlhakanelo ◽  
Thabo Phologolo ◽  
Shimeles G. Hamda ◽  
Tiny Masupe ◽  
...  

Abstract Background Policy changes are often necessary to contain the detrimental impact of epidemics such as the coronavirus disease (COVID-19). China imposed strict restrictions on movement on January 23rd, 2020. Interrupted time series methods were used to study the impact of the lockdown on the incidence of COVID-19. Methods The number of COVID-19 cases reported daily from January 12th to March 30th, 2020 was extracted from the World Health Organization (WHO) COVID-19 dashboard ArcGIS® and matched to China’s projected population of 1 408 526 449 for 2020 in order to estimate daily incidences. Data were plotted to reflect daily incidences as data points in the series. A deferred interruption point of 6th February was used to allow a 14-day period of diffusion. The magnitude of change and linear trend analyses were evaluated using the itsa function with ordinary least-squares regression coefficients in Stata®, yielding Newey-West standard errors. Results Seventy-eight (78) daily incidence points were used for the analysis, with 11 (14.1%) before the intervention. There was a daily increase of 163 cases (β = 1.16 × 10⁻⁷, p = 0.00) in the pre-intervention period. Although there was no statistically significant drop in the number of cases reported daily in the immediate period following 6th February 2020 when compared to the counterfactual (p = 0.832), there was a decrease of 241 cases reported daily (β = −1.71 × 10⁻⁷, p = 0.00) when comparing the pre-intervention and post-intervention periods. This corresponds to a deceleration of 78 (47%) cases reported daily. Conclusion The lockdown policy significantly decreased the incidence of COVID-19 in China. Lockdown provides an effective means of curtailing the incidence of COVID-19.
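Stata's itsa with OLS and Newey-West standard errors corresponds, roughly, to an ordinary least-squares segmented regression fitted with a heteroskedasticity- and autocorrelation-consistent covariance. A hedged Python equivalent on simulated daily per-capita incidence (coefficients on the order of 10⁻⁷ are consistent with daily case counts divided by a population of roughly 1.4 billion):

```python
# Segmented OLS with Newey-West (HAC) standard errors; simulated daily incidence,
# not the WHO dashboard data used in the study.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
t = np.arange(78)                                   # 78 daily incidence points
post = (t >= 11).astype(int)                        # interruption after the 11th point
t_since = np.where(post == 1, t - 11, 0)
incidence = 2e-6 + 1.2e-7 * t - 1.7e-7 * t_since + rng.normal(0, 2e-7, len(t))
df = pd.DataFrame({"incidence": incidence, "t": t, "post": post, "t_since": t_since})

fit = smf.ols("incidence ~ t + post + t_since", data=df).fit(
    cov_type="HAC", cov_kwds={"maxlags": 7}         # Newey-West with a 7-day lag window
)
print(fit.summary())
```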


CJEM ◽  
2016 ◽  
Vol 19 (2) ◽  
pp. 96-105 ◽  
Author(s):  
Alexander K. Leung ◽  
Shawn D. Whatley ◽  
Dechang Gao ◽  
Marko Duic

Abstract Objective To study the operational impact of process improvements on emergency department (ED) patient flow. The changes did not require any increase in resources or expenditures. Methods This was a 36-month pre- and post-intervention study to evaluate the effect of implementing process improvements at a community ED from January 2010 to December 2012. The intervention comprised streamlining triage by having patients accepted into internal waiting areas immediately after triage. Within the ED, parallel processes unfolded, and there was no restriction on when registration occurred or which health care provider a patient saw first. Flexible nursing ratios allowed nursing staff to redeploy and move to areas of highest demand. Last, demand-based physician scheduling was implemented. The main outcome was length of stay (LOS). Secondary outcomes included time to physician initial assessment (PIA), left-without-being-seen (LWBS) rates, and left-against-medical-advice (LAMA) rates. Segmented regression of interrupted time series analysis was performed to quantify the impact of the intervention and whether it was sustained. Results A total of 251,899 patients attended the ED during the study period. Daily patient volumes increased 17.3% during the post-intervention period. Post-intervention, mean LOS decreased by 0.64 hours (p<0.005). LOS for non-admitted Canadian Triage and Acuity Scale 2 (-0.58 hours, p<0.005), 3 (-0.75 hours, p<0.005), and 4 (-0.32 hours, p<0.005) patients also decreased. There were reductions in PIA (43.81 minutes, p<0.005), LWBS (35.2%, p<0.005), and LAMA (61.9%, p<0.005). Conclusion A combination of process improvements in the ED was associated with clinically significant reductions in LOS, PIA, LWBS, and LAMA for non-resuscitative patients.
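The abstract does not state whether the segmented regression adjusted for the rising patient volumes; one plausible way to do so is to add mean daily volume as a covariate alongside the level- and trend-change terms. A hypothetical sketch with all values simulated:

```python
# Hypothetical segmented regression on monthly mean LOS, adjusting for daily volume
# (simulated data; the break point and effect sizes are invented).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
month = np.arange(36)                                # Jan 2010 - Dec 2012
post = (month >= 12).astype(int)                     # hypothetical go-live after year 1
t_since = np.where(post == 1, month - 12, 0)
volume = 180 + 1.2 * month + rng.normal(0, 8, len(month))      # mean daily ED visits
los = 4.8 + 0.004 * volume - 0.6 * post - 0.01 * t_since + rng.normal(0, 0.15, len(month))

df = pd.DataFrame({"los": los, "month": month, "post": post,
                   "t_since": t_since, "volume": volume})
fit = smf.ols("los ~ month + post + t_since + volume", data=df).fit()
print(fit.params[["post", "t_since", "volume"]])
```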


2017 ◽  
Vol 77 (5) ◽  
pp. 684-689 ◽  
Author(s):  
René Lindholm Cordtz ◽  
Samuel Hawley ◽  
Daniel Prieto-Alhambra ◽  
Pil Højgaard ◽  
Kristian Zobbe ◽  
...  

Objectives To study the impact of the introduction of biological disease-modifying anti-rheumatic drugs (bDMARDs) and associated rheumatoid arthritis (RA) management guidelines on the incidence of total hip (THR) and knee replacements (TKR) in Denmark. Methods Nationwide register-based cohort and interrupted time-series analysis. Patients with incident RA between 1996 and 2011 were identified in the Danish National Patient Register. Patients with RA were matched on age, sex and municipality with up to 10 general population comparators (GPCs). Standardised 5-year incidence rates of THR and TKR per 1000 person-years were calculated for patients with RA and GPCs in 6-month periods. Levels and trends in the pre-bDMARD era (1996–2001) were compared with the bDMARD era (2003–2016) using segmented linear regression interrupted by a 1-year lag period (2002). Results We identified 30 404 patients with incident RA and 297 916 GPCs. In 1996, the incidence rates of THR and TKR were 8.72 and 5.87, respectively, among patients with RA, and 2.89 and 0.42 in GPCs. From 1996 to 2016, the incidence rate of THR decreased among patients with RA but increased among GPCs. Among patients with RA, the incidence rate of TKR increased from 1996 to 2001, but started to decrease from 2003 and throughout the bDMARD era. The incidence of TKR increased among GPCs from 1996 to 2016. Conclusion In 1996, the incidence rates of THR and TKR were 3-fold and 14-fold higher, respectively, among patients with RA than among GPCs. In patients with RA, the introduction of bDMARDs was associated with a decreasing incidence rate of TKR, whereas the incidence of THR had started to decrease before bDMARD introduction.
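One way to implement a segmented comparison "interrupted by a 1-year lag period" is to drop the lag observations so they inform neither segment. A sketch under that assumption, with simulated half-yearly TKR incidence rates standing in for the register data:

```python
# Segmented linear regression with a one-year washout (2002) excluded before fitting.
# All incidence values are simulated.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
half_years = np.arange(1996.0, 2017.0, 0.5)          # 6-month periods, 1996-2016
df = pd.DataFrame({"year": half_years})
df["t"] = np.arange(len(df))
df["post"] = (df["year"] >= 2003).astype(int)        # bDMARD era
df["t_since"] = np.where(df["post"] == 1, df["t"] - df.loc[df["post"] == 1, "t"].min(), 0)
df["tkr_rate"] = 5.5 + 0.08 * df["t"] - 0.12 * df["t_since"] + rng.normal(0, 0.3, len(df))

# Drop the 2002 lag period so it belongs to neither segment, then fit the segmented model
fit = smf.ols("tkr_rate ~ t + post + t_since",
              data=df[(df["year"] < 2002) | (df["year"] >= 2003)]).fit()
print(fit.params)
```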


2007 ◽  
Vol 28 (10) ◽  
pp. 1196-1201 ◽  
Author(s):  
A. G. Venier ◽  
A. Vincent ◽  
F. L'Hériteau ◽  
N. Floret ◽  
H. Sénéchal ◽  
...  

Objective. To estimate the incidence rate of reported occupational blood and body fluid exposures among French healthcare workers (HCWs). Design. Prospective national follow-up of HCWs from January 1 to December 31, 2004. Setting. University hospitals, hospitals, clinics, local medical centers, and specialized psychiatric centers were included in the study on a voluntary basis. Participants. At participating medical centers, every reported blood and body fluid exposure was documented by the occupational practitioner in charge of the exposed HCW by use of an anonymous, standardized questionnaire. Results. A total of 375 medical centers (15% of French medical centers, accounting for 29% of hospital beds) reported 13,041 blood and body fluid exposures; of these, 9,396 (72.0%) were needlestick injuries. Blood and body fluid exposures were avoidable in 39.1% of cases (5,091 of 13,020), and 52.2% of percutaneous injuries (4,986 of 9,552) were avoidable (5.9% due to needle recapping). Of 10,656 percutaneous injuries, 22.6% occurred during an injection, 17.9% during blood sampling, and 16.6% during surgery. Of 2,065 splashes, 22.6% occurred during nursing activities, 19.1% during surgery, 14.1% during placement or removal of an intravenous line, and 12.0% during manipulation of a tracheotomy tube. The incidence rates of exposures were 8.9 per 100 hospital beds (95% confidence interval [CI], 8.7-9.0 exposures), 2.2 per 100 full-time-equivalent physicians (95% CI, 2.4-2.6 exposures), and 7.0 per 100 full-time-equivalent nurses (95% CI, 6.8-7.2 exposures). Human immunodeficiency virus serological status was unknown for 2,789 (21.4%) of 13,041 patients who were the source of the blood and body fluid exposures. Conclusion. National surveillance networks for blood and body fluid exposures help to better document their characteristics and risk factors and can enhance prevention at participating medical centers.
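For rates of this kind, exact (Garwood) Poisson limits are one common way to attach a 95% CI to an incidence per 100 beds. The sketch below back-calculates an approximate bed denominator from the reported 8.9 per 100 beds, so it illustrates the arithmetic rather than reproducing the network's own calculation.

```python
# Exact Poisson (Garwood) confidence limits for an incidence rate per 100 beds.
# The bed denominator is back-calculated from the reported rate and is approximate.
from scipy.stats import chi2

events = 13041          # reported exposures (from the abstract)
beds = 146500           # approximate denominator implied by 8.9 per 100 beds

rate = 100 * events / beds
lower = chi2.ppf(0.025, 2 * events) / 2            # exact lower limit for the count
upper = chi2.ppf(0.975, 2 * (events + 1)) / 2      # exact upper limit for the count
print(f"{rate:.1f} per 100 beds (95% CI {100 * lower / beds:.1f}-{100 * upper / beds:.1f})")
```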


2020 ◽  
Vol 7 (Supplement_1) ◽  
pp. S851-S851
Author(s):  
Vagesh Hemmige ◽  
Becky Winterer ◽  
Todd Lasco ◽  
Bradley Lembcke

Abstract Background SARS-COV2 transmission to healthcare personnel (HCP) and hospitalized patients is a significant challenge. Our hospital is a quaternary healthcare system with more than 500 beds and 8,000 HCP. Between April 1 and April 17, 2020, we instituted several infection prevention strategies to limit transmission of SARS-COV2, including universal masking of HCP and patients, surveillance testing every two weeks for high-risk HCP and every week for cluster units, and surveillance testing of all patients on admission and prior to invasive procedures. On July 6, 2020, we implemented universal face shields for all healthcare personnel upon entry to the facility. The aim of this study is to assess the impact of the face shield policy on SARS-COV2 infection among HCP and hospitalized patients. (Figure 1: interrupted time series.) Methods The preintervention period (April 17, 2020-July 5, 2020) included implementation of universal face masks and surveillance testing of HCP and patients. The intervention period (July 6, 2020-July 26, 2020) added face shields for all HCP (for patient encounters and staff-to-staff encounters). We used interrupted time series analysis with segmented regression to examine the effect of our intervention on the proportion of HCP positive for SARS-COV2 (using logistic regression) and on healthcare-associated infections (HAI; using Poisson regression). We defined significance as p values < 0.05. Results Of 4731 HCP tested, 192 tested positive for SARS-COV2 (4.1%). In the preintervention period, the weekly positivity rate among HCP increased from 0% to 12.9%. During the intervention period, the weekly positivity rate among HCP decreased to 2.3%, with segmented regression showing a change in the predicted proportion positive in week 13 (18.0% to 3.7%, p < 0.001) and a change in the post-intervention slope on the log-odds scale (p < 0.001). A total of 14 HAI cases were identified. In the preintervention period, HAI cases increased from 0 to 5. During the intervention period, HAI cases decreased to 0. The change between the pre-intervention and post-intervention slopes on the log scale was significant (p < 0.01). Conclusion Our study showed that the universal use of face shields was associated with a significant reduction in SARS-COV2 infection among HCP and hospitalized patients. Disclosures All Authors: No reported disclosures
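A segmented logistic regression on weekly HCP testing data can be set up as a binomial GLM with counts of positives and negatives as the response and level- and slope-change terms as predictors. The weekly counts below are simulated, and the break point is only loosely aligned with the dates in the abstract.

```python
# Illustrative segmented logistic (binomial GLM) ITS on simulated weekly HCP testing data.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(6)
week = np.arange(16)                           # ~April-July 2020
post = (week >= 12).astype(int)                # hypothetical face-shield policy break point
t_since = np.where(post == 1, week - 12, 0)
tested = rng.integers(200, 400, len(week))
logit_p = -4.5 + 0.25 * week - 1.5 * post - 0.4 * t_since
positive = rng.binomial(tested, 1.0 / (1.0 + np.exp(-logit_p)))

df = pd.DataFrame({"week": week, "post": post, "t_since": t_since,
                   "positive": positive, "negative": tested - positive})
X = sm.add_constant(df[["week", "post", "t_since"]])
fit = sm.GLM(df[["positive", "negative"]], X, family=sm.families.Binomial()).fit()
print(fit.summary())   # 'post' and 't_since' coefficients are on the log-odds scale
```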


2021 ◽  
Author(s):  
Harry L. Hébert ◽  
Daniel R. Morales ◽  
Nicola Torrance ◽  
Blair H. Smith ◽  
Lesley A. Colvin

Abstract Background Opioids are used to treat patients with chronic pain, but their long-term use is associated with harms. In December 2013, SIGN 136 was published, providing a comprehensive evidence-based guideline for the assessment and management of chronic pain in Scotland. Aims This study aimed to examine the impact of SIGN 136 on opioid prescribing trends and costs across the whole of Scotland. Methods Opioid prescribing data and average cost per item were obtained from Public Health Scotland. An interrupted time series analysis examined the effects of SIGN 136 publication on the number of items prescribed per 1,000 population per quarter for 29 opioids (or opioid-containing combinations) from 2005 to 2019 inclusive. Exploratory analysis was conducted in NHS Tayside and NHS Fife combined and then up-scaled to all 14 NHS Scotland health boards. A similar approach was also used to assess the effect of SIGN 136 on estimated gross ingredient costs per quarter. Results At six years post-intervention there was a relative reduction in opioid prescribing of 18.8% (95% CI: 16.0-21.7) across Scotland. There was also a relative reduction of 22.8% (95% CI: 14.9-30.1) in gross ingredient cost nationally. Opioid prescribing increased significantly pre-intervention across all 14 NHS Scotland health boards (2.19 items per 1,000 population per quarter), followed by a non-significant change in level and a significant negative change in trend post-intervention (−2.69 items per 1,000 population per quarter). Similar findings were observed locally in NHS Tayside and NHS Fife. Conclusions The publication of SIGN 136 coincided with a statistically significant reduction in opioid prescribing rates in Scotland, suggesting that changes in clinical policy are having a positive effect on prescribing practices in primary care. These prescribing trends appear to be in contrast to those in the UK as a whole.
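Relative reductions "at six years post-intervention" are typically read off the fitted model by comparing its prediction with the counterfactual that extrapolates the pre-intervention level and trend. A sketch with simulated quarterly items per 1,000 population (the break point and magnitudes are invented):

```python
# Relative reduction versus the pre-intervention counterfactual from a segmented fit.
# Quarterly prescribing values are simulated, not the Public Health Scotland data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
t = np.arange(60)                                    # quarters, 2005-2019
post = (t >= 36).astype(int)                         # hypothetical break at end of 2013
t_since = np.where(post == 1, t - 36, 0)
items = 230 + 2.2 * t - 2.7 * t_since + rng.normal(0, 5, len(t))   # items per 1,000 population
df = pd.DataFrame({"items": items, "t": t, "post": post, "t_since": t_since})

fit = smf.ols("items ~ t + post + t_since", data=df).fit()
# Row 0: full model at the last quarter; row 1: counterfactual (pre-trend extrapolated)
pred = fit.predict(pd.DataFrame({"t": [59, 59], "post": [1, 0], "t_since": [23, 0]}))
observed, counterfactual = pred.iloc[0], pred.iloc[1]
print(f"relative reduction: {100 * (counterfactual - observed) / counterfactual:.1f}%")
```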


2019 ◽  
Vol 19 (1) ◽  
Author(s):  
Kentaro Iwata ◽  
Michihiko Goto

Abstract Background Enterohemorrhagic Escherichia coli (EHEC) is an important pathogen that causes diarrhea, hemorrhagic colitis, and hemolytic uremic syndrome (HUS). After an EHEC outbreak involving uncooked beef, serving raw beef liver dishes at restaurants was completely banned in Japan starting on July 1, 2012. However, the long-term association of this ban with the incidence rates of EHEC infections has never been assessed by formal interrupted time-series analysis (ITSA). Methods A retrospective cohort study was conducted to assess the impact of banning raw beef liver provision at restaurants. The weekly incidences of asymptomatic and symptomatic EHEC infections, the incidence of HUS, and deaths were extracted from the national reportable diseases database from January 2008 to December 2017. ITSA was conducted to evaluate the impact of the ban from July 2012. To account for a potential simultaneous external effect, the additional regulation on raw beef red meat handling (implemented in May 2011) and seasonality were also incorporated into the model. Results There were 32,179 asymptomatic and 21,250 symptomatic EHEC infections (including 717 HUS cases and 26 deaths) reported during the study period. During the pre-intervention period (before week 27, 2012), there were 0.45 asymptomatic EHEC infections per million persons per week; the mean post-intervention rate was 0.51 per million persons per week. ITSA revealed no significant baseline trend, change in intercept, or change in trend (0.002 infections per million persons per week, 95% CI −0.03 to 0.04, p = 0.93; 1.22, CI −1.96 to 4.39, p = 0.45; and −0.006, CI −0.003 to 0.02, p = 0.68, respectively). For symptomatic EHEC infections, there were 0.30 cases per million per week during the pre-intervention period and 0.33 cases per million per week after the intervention. Time series modeling again showed no significant baseline trend or changes in intercept and trend (0.0005, CI −0.02 to 0.02, p = 0.96; 0.69, CI −1.75 to 3.12, p = 0.58; and −0.003, CI −0.02 to 0.01, p = 0.76, respectively). Conclusion We did not find a statistically significant reduction in the overall incidence rates of either asymptomatic or symptomatic EHEC infections in Japan after implementing these measures, including the ban on serving raw beef liver dishes in the restaurant industry.
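Seasonality and the earlier red-meat handling regulation can be folded into the segmented model as harmonic terms and an extra level-change indicator. A hypothetical sketch on simulated weekly incidence, not the surveillance data:

```python
# ITS sketch with annual harmonic (Fourier) terms and a co-intervention indicator.
# The simulated series deliberately contains no ban effect, mirroring the null finding.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(8)
week = np.arange(520)                                   # ten years of weeks (2008-2017)
ban = (week >= 234).astype(int)                         # raw liver ban (~mid-2012)
ban_since = np.where(ban == 1, week - 234, 0)
handling = (week >= 175).astype(int)                    # earlier red-meat handling rule (~May 2011)
rate = (0.45 + 0.15 * np.sin(2 * np.pi * week / 52.0) + 0.02 * handling
        + rng.normal(0, 0.05, len(week)))               # infections per million per week

df = pd.DataFrame({"rate": rate, "week": week, "ban": ban, "ban_since": ban_since,
                   "handling": handling,
                   "sin52": np.sin(2 * np.pi * week / 52.0),
                   "cos52": np.cos(2 * np.pi * week / 52.0)})
fit = smf.ols("rate ~ week + ban + ban_since + handling + sin52 + cos52", data=df).fit(
    cov_type="HAC", cov_kwds={"maxlags": 4})
print(fit.params[["ban", "ban_since"]])
```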

