Perioperative hair removal: A review of best practice and a practice improvement opportunity

2018 ◽  
Vol 28 (6) ◽  
pp. 159-166
Author(s):  
Maureen Spencer ◽  
Marsha Barnden ◽  
Helen Boehm Johnson ◽  
Loretta Litz Fauerbach ◽  
Denise Graham ◽  
...  

The current practice of perioperative hair removal reflects research-driven changes designed to minimize the risk of surgical wound infection. An aspect of the practice that has received less scrutiny is the clean-up of the clipped hair. This step is critical: the loose fibers represent a potential infection risk because of the micro-organisms they can carry, yet their clean-up can pose a logistical problem because of the time required to remove them. Research has demonstrated that the most commonly employed means of clean-up, the use of adhesive tape or sticky mitts, can be both ineffective and time-consuming, in addition to posing an infection risk from cross-contamination. Recently published research evaluating surgical clippers fitted with a vacuum-assisted hair collection device highlights the potential for significant practice improvement in the perioperative hair removal clean-up process. These improvements include not only further mitigation of potential infection risk but also substantial operating room (OR) time and cost savings.

2010 ◽  
Vol 5 (1) ◽  
pp. 20 ◽  
Author(s):  
Tim A Fischell ◽  

Coronary artery stenting has evolved substantially since the first use of coronary stenting as an adjunct to balloon angioplasty in the early 1990s. The performance (and particularly the deliverability) of coronary stents has improved such that coronary stenting is now the primary mode of revascularisation for percutaneous coronary interventions (PCIs) in more than 95% of cases. The new Svelte™ stent-on-a-wire (SOAW) delivery system represents one of the first substantive innovations in stent delivery systems (SDS) in more than a decade. This SDS uses a shapeable ‘fixed wire’ as an integral part of the SDS. This allows a significant reduction in SDS profile (~0.029 inches) compared with conventional monorail or over-the-wire SDS. This SOAW SDS is intended to facilitate direct stenting. It has the potential to provide substantial procedural cost savings by eliminating the need for a coronary guidewire and balloon pre-dilatation and/or post-dilatation, and by reducing contrast use and the time required to complete the procedure. The SOAW system is compatible with 5Fr guiding catheters, and may reduce the need for closure devices, facilitate stenting via the radial approach and (potentially) reduce bleeding risks. In conclusion, the Svelte SOAW SDS represents a new very-low-profile balloon-expandable SDS that should promote direct stenting in PCIs. The efficiency and small profile of this SDS may allow procedural cost savings, a reduction in procedure time and a reduced risk of bleeding complications. These theoretical advantages will need to be demonstrated in clinical trials.


Author(s):  
Asad E Patanwala ◽  
Sujita W Narayan ◽  
Curtis E Haas ◽  
Ivo Abraham ◽  
Arthur Sanders ◽  
...  

Abstract Disclaimer In an effort to expedite the publication of articles related to the COVID-19 pandemic, AJHP is posting these manuscripts online as soon as possible after acceptance. Accepted manuscripts have been peer-reviewed and copyedited, but are posted online before technical formatting and author proofing. These manuscripts are not the final version of record and will be replaced with the final article (formatted per AJHP style and proofed by the authors) at a later time. Purpose Cost-avoidance studies of pharmacist interventions are common and often the first type of study conducted by investigators to quantify the economic impact of clinical pharmacy services. The purpose of this primer is to provide guidance for conducting cost-avoidance studies pertaining to clinical pharmacy practice. Summary Cost-avoidance studies represent a paradigm conceptually different from traditional pharmacoeconomic analysis. A cost-avoidance study reports on cost savings from a given intervention, where the savings is estimated based on a counterfactual scenario. Investigators need to determine what specifically would have happened to the patient if the intervention did not occur. This assessment can be fundamentally flawed, depending on underlying assumptions regarding the pharmacists’ action and the patient trajectory. It requires careful identification of the potential consequence of nonaction, as well as probability and cost assessment. Given the uncertainty of assumptions, sensitivity analyses should be performed. A step-by-step methodology, formula for calculations, and best practice guidance are provided. Conclusions Cost-avoidance studies focused on pharmacist interventions should be considered low-level evidence. These studies are acceptable for providing pilot data for the planning of future clinical trials. The guidance provided in this article should be followed to improve the quality and validity of such investigations.
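The counterfactual logic described in this primer lends itself to a simple worked calculation. The Python sketch below is an illustration of the general approach rather than the authors' published formula: the probability of harm without intervention, the cost of that harm, and the intervention cost are hypothetical inputs, and a one-way sensitivity analysis varies the probability assumption.

```python
# Illustrative cost-avoidance calculation for a single pharmacist intervention.
# All inputs are hypothetical; the primer should be consulted for recommended
# probabilities, costs, and sensitivity ranges.

def cost_avoided(p_harm: float, cost_harm: float, cost_intervention: float) -> float:
    """Expected cost avoided = probability of the adverse outcome in the
    counterfactual (no-intervention) scenario x cost of that outcome,
    minus the cost of delivering the intervention."""
    return p_harm * cost_harm - cost_intervention

# Base-case assumptions (hypothetical)
base_p = 0.10          # probability the error would have harmed the patient
harm_cost = 5_000.00   # cost of the avoided consequence, e.g. an extra hospital day
interv_cost = 25.00    # cost of the pharmacist's time for the intervention

print(f"Base case: ${cost_avoided(base_p, harm_cost, interv_cost):,.2f} avoided")

# One-way sensitivity analysis on the probability assumption
for p in (0.01, 0.05, 0.10, 0.25, 0.50):
    print(f"p_harm = {p:.2f}: ${cost_avoided(p, harm_cost, interv_cost):,.2f} avoided")
```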


1997 ◽  
Vol 35 (11-12) ◽  
pp. 249-252 ◽  
Author(s):  
G. J. Medema ◽  
M. Bahar ◽  
F. M. Schets

Oocysts of Cryptosporidium parvum can survive for several months in surface water, one of the main factors determining their success in environmental transmission and thus their health hazard via water. Several factors in the environment, e.g. temperature and the presence of predators and exo-enzymes, will probably influence oocyst survival. The high persistence of oocysts may also limit the value of traditional faecal indicator bacteria. The aim of this study was to determine the rate at which C. parvum oocysts, E. coli, faecal enterococci and C. perfringens spores die in surface water, and the influence of temperature and the presence of autochthonous (micro)organisms on the die-off rate. Microcosms with autoclaved river water were inoculated with the organisms. Microcosms with untreated river water were inoculated with concentrated primary effluent containing the bacteria and with C. parvum oocysts. Microcosms were incubated at 5°C or 15°C at 100 rpm. Viability of oocysts was monitored by in vitro excystation and dye exclusion; viability of the bacteria was determined on appropriate selective media. When pseudo first-order die-off kinetics were assumed, the die-off rate of oocysts was 0.010 log10/d at 5°C and 0.006–0.024 log10/d at 15°C. These rates underestimate die-off, since oocyst disintegration was not accounted for. Whether the water was autoclaved or untreated influenced the die-off rate of oocysts at 15°C but not at 5°C. The die-off rate of E. coli and enterococci was faster in the non-sterile river water than in autoclaved water at both temperatures. At 15°C, E. coli (and possibly E. faecium) even multiplied in autoclaved water. In untreated river water, the die-off of E. coli and enterococci was approximately 10 times faster than that of oocysts, but die-off rates of C. perfringens were lower than those of oocysts. As for oocysts, die-off of the bacteria and spores was faster at 15°C than at 5°C. Oocysts are very persistent in river water: the time required for a 10-fold reduction in viability was 40–160 d at 15°C and 100 d at 5°C. Biological/biochemical activity influenced oocyst survival at 15°C and survival of both vegetative bacteria at 5°C and 15°C. The rapid die-off of E. coli and enterococci makes them less suitable as indicators of oocyst presence in water. As C. perfringens survived longer in untreated river water than oocysts, it may prove useful as an indicator of the presence of C. parvum.
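Because pseudo first-order kinetics are assumed, the reported die-off rates convert directly into times for a given log10 reduction. The short Python sketch below illustrates that conversion using the rates quoted in the abstract; it is not the authors' analysis code.

```python
# Pseudo first-order die-off: log10(N_t / N_0) = -k * t, with k in log10 units per day,
# so the time for a 1-log10 (10-fold) reduction is t_1log = 1 / k.

rates_log10_per_day = {
    "oocysts, 5 degC": 0.010,
    "oocysts, 15 degC (low)": 0.006,
    "oocysts, 15 degC (high)": 0.024,
}

for label, k in rates_log10_per_day.items():
    t_1log = 1.0 / k  # days for a 10-fold reduction in viability
    print(f"{label}: k = {k} log10/d -> ~{t_1log:.0f} d per log10 reduction")

# Fraction of oocysts still viable after, e.g., 100 days at 5 degC:
k, t = 0.010, 100.0
print(f"Surviving fraction after {t:.0f} d at 5 degC: {10 ** (-k * t):.2f}")
```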


2019 ◽  
Vol 40 (6) ◽  
pp. 668-673 ◽  
Author(s):  
Jasmine R. Marcelin ◽  
Charlotte Brewer ◽  
Micah Beachy ◽  
Elizabeth Lyden ◽  
Tammy Winterboer ◽  
...  

Abstract Objective: To evaluate the impact of a hard stop in the electronic health record (EHR) on inappropriate gastrointestinal pathogen panel (GIPP) testing. Design: We used a quasi-experimental study to evaluate testing before and after the implementation of an EHR alert to stop inappropriate GIPP ordering. Setting: Midwest academic medical center. Participants: Hospitalized patients with diarrhea for whom GIPP testing was ordered, between January 2016 and March 2017 (period 1) and between April 2017 and June 2018 (period 2). Intervention: A hard stop in the EHR prevented clinicians from ordering a GIPP more than once per admission or in patients hospitalized for >72 hours. Results: During period 1, 1,587 GIPP tests were ordered over 212,212 patient days, at a rate of 7.48 per 1,000 patient days. In period 2, 1,165 GIPP tests were ordered over 222,343 patient days, at a rate of 5.24 per 1,000 patient days. The Poisson model estimated a 30% reduction in total GIPP ordering rates between the 2 periods (relative risk, 0.70; 95% confidence interval [CI], 0.63–0.78; P < .001). The rate of inappropriate tests ordered decreased from 21.5% to 4.9% between the 2 periods (P < .001). The total savings calculated factoring only GIPP orders that triggered the hard stop was ∼$67,000, with potential savings of $168,000 when factoring in silent best-practice alert data. Conclusions: A simple hard stop alert in the EHR resulted in a significant reduction of inappropriate GIPP testing, which was associated with significant cost savings. Clinicians can practice diagnostic stewardship by avoiding ordering this test more than once per admission or in patients hospitalized >72 hours.
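The ordering rates and the rate ratio reported above follow directly from the counts and patient days; the sketch below reproduces that arithmetic in Python and adds an approximate large-sample 95% CI for the rate ratio. It is an illustration of the calculation, not the authors' Poisson regression model, so the CI differs slightly from the published one.

```python
import math

# GIPP orders and patient days reported in the abstract
orders_1, days_1 = 1587, 212_212   # period 1 (pre-alert)
orders_2, days_2 = 1165, 222_343   # period 2 (post-alert)

rate_1 = orders_1 / days_1 * 1000  # orders per 1,000 patient days
rate_2 = orders_2 / days_2 * 1000
rr = rate_2 / rate_1               # rate ratio, period 2 vs period 1

# Approximate 95% CI on the log rate ratio (large-sample Poisson assumption)
se_log_rr = math.sqrt(1 / orders_1 + 1 / orders_2)
lo = math.exp(math.log(rr) - 1.96 * se_log_rr)
hi = math.exp(math.log(rr) + 1.96 * se_log_rr)

print(f"Period 1: {rate_1:.2f} per 1,000 patient days")   # ~7.48
print(f"Period 2: {rate_2:.2f} per 1,000 patient days")   # ~5.24
print(f"Rate ratio: {rr:.2f} (approx. 95% CI {lo:.2f}-{hi:.2f})")
```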


2019 ◽  
Vol 19 (1) ◽  
Author(s):  
Kelly Rushton ◽  
Claire Fraser ◽  
Judith Gellatly ◽  
Helen Brooks ◽  
Peter Bower ◽  
...  

Abstract Background Psychological treatment delivered by telephone is recommended by the National Institute for Health and Care Excellence (NICE) for mild to moderate depression and anxiety, and forms a key part of the Improving Access to Psychological Therapies (IAPT) programme in the UK. Despite evidence of clinical effectiveness, patient engagement is often not maintained, and psychological wellbeing practitioners (PWPs) report lacking the confidence and training to deliver treatment by telephone. This study aimed to explore the perspectives of professional decision makers (both local and national) on the barriers and facilitators to the implementation of telephone treatment in IAPT. Methods Sixteen semi-structured qualitative telephone interviews and one focus group were carried out with decision makers (n = 21) who were involved locally and nationally in policy, practice and research. The interviews and focus group were coded thematically and then mapped onto the four core constructs of Normalisation Process Theory (NPT). Results The use of the telephone for psychological treatment was universally recognised amongst participants as beneficial for improving patient choice and access to treatment. However, at service level, motives for the implementation of telephone treatments are often misaligned with national objectives. Pressure to meet performance targets has become a key driver for the use of telephone treatment, with promises of increased efficiency and cost savings. These service-focussed objectives challenge the integration of telephone treatments and PWP acceptance of telephone treatments as non-inferior to face-to-face treatment. Ambivalence among a workforce often lacking the confidence to deliver telephone treatments leads to reluctance among PWPs to ‘sell’ treatments to a patient population who are not generally expecting treatment in this form. Conclusions Perceptions of a need to ‘sell’ telephone treatment in IAPT persist from top-level decision makers down to frontline practitioners, despite their conflicting motives for the use of the telephone. The need for advocacy to highlight the clinical benefit of telephone treatment, along with adequate workforce support and guidance on best practice for implementation, is critical to the ongoing success and sustainability of telephone treatment in primary care mental health programmes.


2019 ◽  
Vol 6 (Supplement_2) ◽  
pp. S806-S807
Author(s):  
Cindy L Hoegg ◽  
Katie L Williams ◽  
Eric Shelov ◽  
Talene A Metjian ◽  
Ana Maria Cardenas ◽  
...  

Abstract Background Clinical decision support for Clostridioides difficile infection (CDI) diagnostics reduces inappropriate testing, leading to decreased need for isolation and antibiotic use. Our institution utilized manual discontinuation by laboratory staff of CDI testing for inappropriate specimens, including formed stool and age < 1 year. We aimed to assess the financial impact of instituting a CDI best practice alert at a quaternary care children’s hospital. Methods A multidisciplinary team mapped inappropriate testing criteria identified from a literature review to discrete fields in our electronic health record (EHR, EpicCare) to design an alert. The exclusion criteria identified included: (1) age < 1 year; (2) positive C. difficile test within the past 14 days; (3) less than or equal to 3 unformed stools in the past 24 hours; (4) current receipt of CDI-directed therapy; or (5) laxative use or barium exposure in the prior 48 hours. Six months of data prior to implementation were reviewed to estimate the impact of the alert. At implementation, any exclusion criterion detected in the EHR at the time of order entry triggered an alert to deter CDI testing. Cost estimates for averted tests (Quick Check Complete Assay/Illumigene) included the cost of the test ($50), the cost of isolation/personal protective equipment ($159/day), and the cost of treatment with oral vancomycin in false-positives ($2,250/treatment course). Results In the 6-month pre-implementation period, 586 tests for CDI were ordered, of which 23% were identified by our criteria as inappropriate. During the first 3 months of alert implementation, 256 tests were ordered, of which 105 (41%) caused the alert to fire. Of those, 56 tests were not ordered, for a 22% reduction in testing. Laboratory staff continued to manually stop tests not meeting criteria, such as patient age < 1 year, when possible. Based on avoidance of testing, use of PPE, and 10-day antibiotic treatment for false-positives (assumed to be 25% by literature review), this translated to cost savings of $69,916 and an annual cost savings of $279,664. Conclusion Implementation of an alert for select patients using a bioinformatics algorithm reduced inappropriate CDI testing. Clinical decision support for CDI can lead to substantial cost savings for both antibiotic use and isolation precautions. Disclosures All authors: No reported disclosures.
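The alert described above is essentially an any-of-the-exclusion-criteria check at the time of order entry. The Python sketch below expresses that logic with hypothetical field names; the actual EpicCare build and the discrete data elements it evaluates are not detailed in the abstract.

```python
from dataclasses import dataclass

@dataclass
class CdiOrderContext:
    """Hypothetical snapshot of the chart when a C. difficile test is ordered."""
    age_years: float
    positive_test_within_14_days: bool
    unformed_stools_past_24h: int
    on_cdi_directed_therapy: bool
    laxative_or_barium_past_48h: bool

def cdi_alert_should_fire(ctx: CdiOrderContext) -> bool:
    """Return True if any exclusion criterion is met, i.e. testing is likely inappropriate."""
    return (
        ctx.age_years < 1
        or ctx.positive_test_within_14_days
        or ctx.unformed_stools_past_24h <= 3
        or ctx.on_cdi_directed_therapy
        or ctx.laxative_or_barium_past_48h
    )

# Example: a toddler with frequent unformed stools and no recent laxatives or CDI therapy
example = CdiOrderContext(
    age_years=2.5,
    positive_test_within_14_days=False,
    unformed_stools_past_24h=5,
    on_cdi_directed_therapy=False,
    laxative_or_barium_past_48h=False,
)
print("Alert fires:", cdi_alert_should_fire(example))  # False -> order proceeds
```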


1995 ◽  
Vol 10 (4) ◽  
pp. 232-237 ◽  
Author(s):  
Thomas Manix ◽  
Michael R. Gunderson ◽  
Geoffrey C. Garth

Abstract Introduction: Previous evaluations of prehospital devices intended for spinal immobilization have focused only on the device's ability to restrict motion. This study defines six relevant criteria for evaluation of cervical immobilization device (CID) performance. Objectives: To suggest relevant criteria for evaluation and to use available technology to improve measurements for performance testing of prehospital-care devices. Methods: Six parameters (motion restriction, access, ease of application, environmental performance, radiolucency, and storage size) were used to evaluate three types of CIDs: Device A, a single-use corrugated board; Device B, a reusable foam-block CID; and Device C, hospital towels and adhesive tape. To test motion restriction, the most frequently compared parameter for immobilization devices, 20 volunteers were asked to move their heads and necks through a series of motions (flexion, extension, lateral bending, and rotation). Their movements were videotaped, still images of each movement were generated, and the degrees of deflection were recorded from these still images. To ensure a consistent level of force, electromyography (EMG) of the sternocleidomastoid and extensor muscles was employed. Results: Data were produced for each parameter and presented for comparison. The use of video to determine deflection proved to be a useful and highly accurate (±1°) method for measurement. The use of EMG technology enabled force to be controlled indirectly when the subjects used moderate levels of exertion. Overall, Devices A and C restricted motion better than Device B. Although Device C required the shortest time for application, it took the longest to prepare for application. The total time required for preparation and application was essentially equivalent for A and B, with A requiring no preparation time but taking the longest for application, and B having an intermediate interval for application. Device A allowed for the best examination of the head and neck. No differences were detected in performance in extreme environmental conditions or in radiolucency for cervical spine X-ray examinations. Device A consumed the smallest storage volume, B the greatest storage volume, and C an intermediate volume substantially greater than that required for A. Conclusion: Device evaluation should include examination of all relevant performance parameters using the most accurate and meaningful methods possible.


2020 ◽  
Vol 49 (Supplement_1) ◽  
pp. i9-i10
Author(s):  
U Okoli ◽  
S Chimhau ◽  
B Nagyova ◽  
A Sahni ◽  
S Amin ◽  
...  

Abstract Introduction Care home residents often have multiple chronic conditions and are receiving complex treatment regimes. Polypharmacy and medication errors are common. The frequency and quality of medication reviews are variable, with limited general practice (GP) capacity to carry out comprehensive reviews. The initiative used a care home pharmacist, technician, geriatrician and GPs to tackle these issues at an individual and care home level. The objective was to ensure the safe and effective use of medicines for all care home residents. NICE guideline [NG56] recommends reducing the pharmacological treatment burden for adults with multimorbidity at risk of adverse drug events such as unplanned hospital admissions. A study by Dilles et al1 found adverse drug reactions in 60% of residents. Methods A new interdisciplinary model of care was delivered in a 120-bed Buckinghamshire care home. A Clinical Commissioning Group pharmacist, general practitioners and a pharmacy technician reviewed medication for all residents. The most complex individuals were reviewed by the geriatrician and, if needed, by other specialist members of the multidisciplinary team. Results Overall, 115 medications were stopped for 109 residents, with 31 interventions to reduce falls risk and 19 interventions on medication at high risk2 of causing admission. Total cost savings on medicines optimisation, medicines waste and non-elective admissions prevented were £35,211. Residents’ care plans were updated to reflect best practice standards. Conclusions The future direction of this project focuses on system-wide improvements to promote the work of interdisciplinary healthcare professionals in care homes. The success of this integrated model of care has enabled recurrent funding of the pharmacist by the local county council and an additional 42 geriatrician sessions in Buckinghamshire care homes. References 1. Dilles T, Vander Stichele R, Van Bortel L, Elseviers M. Journal of the American Medical Directors Association 2013; 14: 371–6. 2. Pirmohamed M, et al. Br Med J 2004; 329: 15–9.


2014 ◽  
Vol 60 (No. 4) ◽  
pp. 159-173 ◽  
Author(s):  
K. Janda ◽  
P. Zetek

Agricultural output in developing countries still represents a substantial part of GDP. This ratio has actually increased in some areas, such as Latin America. As such, microfinance institutions (MFIs) focusing on activities associated with agriculture and encouraging entrepreneurship in agriculture and in rural communities in general are of increasing importance. The contribution of microfinance institutions consists mainly of providing special-purpose loans, usually without collateral. However, questions exist as to the magnitude and the adequate level of risk of providing micro-credit loans in relation to the interest rates being charged. We review two main approaches to setting interest rates in MFIs. One approach takes the view that interest rates should be set at a high level because of the excessive risk that these institutions undertake. The second approach seeks to convince the public that these rates can be reduced through cost savings, increased efficiency, the sharing of best practice, and so on. Subsequently, we econometrically analyse the impact of macroeconomic factors on microfinance interest rates in Latin America and the Caribbean. We show that the results depend on the chosen indicator of interest rate.
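As an illustration of the kind of specification the econometric analysis implies, and of why the choice of interest-rate indicator matters, the sketch below regresses two alternative synthetic indicators on the same macroeconomic covariates. It is not the authors' model or data; all variables and values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200  # hypothetical MFI-year observations

# Hypothetical macroeconomic covariates (percent)
gdp_growth = rng.normal(3.0, 2.0, n)
inflation = rng.normal(5.0, 3.0, n)

# Two alternative interest-rate indicators from a synthetic data-generating process
nominal_yield = 30 - 0.8 * gdp_growth + 0.9 * inflation + rng.normal(0, 2, n)
real_yield = nominal_yield - inflation  # crude inflation-adjusted indicator

def ols(y, *covariates):
    """Ordinary least squares via numpy; returns [intercept, slopes...]."""
    X = np.column_stack([np.ones_like(y)] + list(covariates))
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

print("nominal yield:", np.round(ols(nominal_yield, gdp_growth, inflation), 2))
print("real yield:   ", np.round(ols(real_yield, gdp_growth, inflation), 2))
# The inflation coefficient changes sign between the two indicators, showing how
# conclusions can depend on which interest-rate measure is modelled.
```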


2020 ◽  
pp. bmjmilitary-2020-001402 ◽  
Author(s):  
Danny Epstein ◽  
R Strashewsky ◽  
A Furer ◽  
A M Tsur ◽  
J Chen ◽  
...  

Introduction Endotracheal intubation is required in many emergency, trauma and prehospital scenarios. Endotracheal tube (ETT) fixation must be stable and quick to apply to enable rapid evacuation and patient transport. This study compares performance times of three common ETT securement techniques that are practical for out-of-hospital and combat scenarios. Methods We compared the time required by military medics to complete ETT fixation with three techniques: fixation with a wide gauze roll wrapped twice around the head and tied twice around the ETT (GR), using a Thomas Tube Holder (TH), and using a pre-tied non-adhesive tape (PT). In total, 300 military medics were randomised to apply one technique each on a manikin, and time to completion was recorded. Results A total of 300 ETTs were successfully fixated by 300 military medics. Median times to complete ETT fixation with the PT and TH techniques were 24 s (IQR 19 to 31 and IQR 20 to 33, respectively). Both were significantly quicker to apply than the GR technique, which had a median time of 57 s (IQR 47 to 81), p<0.001. Conclusions In time-critical situations such as combat, severe trauma, mass casualties and whenever rapid evacuation might improve the clinical outcome, using a faster fixation technique such as a Thomas Tube Holder or a pre-tied non-adhesive tape might enable faster evacuation than the use of traditional endotracheal tube fixation techniques.
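The abstract does not state which statistical test produced the p<0.001 comparison; a nonparametric test across the three groups is a natural choice for skewed, IQR-summarised times. The Python sketch below illustrates that kind of analysis on simulated per-medic times chosen only to roughly resemble the reported medians; these are not the study data.

```python
import numpy as np
from scipy.stats import kruskal

rng = np.random.default_rng(1)

# Simulated fixation times (seconds) for 100 medics per technique; NOT the study data.
pt = rng.gamma(shape=9, scale=2.8, size=100)   # pre-tied tape, median roughly 24 s
th = rng.gamma(shape=9, scale=3.0, size=100)   # Thomas Tube Holder, median roughly 25 s
gr = rng.gamma(shape=9, scale=7.0, size=100)   # gauze roll, median roughly 58 s

for name, times in [("PT", pt), ("TH", th), ("GR", gr)]:
    q1, med, q3 = np.percentile(times, [25, 50, 75])
    print(f"{name}: median {med:.0f} s (IQR {q1:.0f} to {q3:.0f})")

stat, p = kruskal(pt, th, gr)  # omnibus comparison across the three techniques
print(f"Kruskal-Wallis H = {stat:.1f}, p = {p:.2g}")
```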

