Evaluation of Factors Influencing Thigh Circumference Measurement in Dogs

2016 ◽  
Vol 1 (2) ◽  
Author(s):  
Felix Michael Duerr ◽  
Ana Luisa Bascuñán ◽  
Nina Kieves ◽  
Clara Goh ◽  
Juliette Hart ◽  
...  

<p class="AbstractSummary"><strong>Objective: </strong>To evaluate inter- and intra-observer variability, influence of hair clipping and laser guidance on canine thigh circumference (TC) measurements amongst observers.<strong></strong></p><p class="AbstractSummary"><strong>Background:</strong> It was our goal to further study the reliability of canine TC measurements as currently performed. For this purpose we designed a cadaveric model that allows for controlled inflation of the thigh resembling increase of muscle mass. We also investigated the impact of novel technologies (laser guidance) and hair clipping on TC measurements in this model. </p><p class="AbstractSummary"><strong>Evidentiary value:</strong> Phase 1 cadaveric study - five long-haired, large breed canine cadavers; Phase 2 clinical study - eight clinically healthy Golden Retrievers. This study should impact clinical research and practice.</p><p class="AbstractSummary"><strong>Methods: </strong>Phase 1 - Canine cadaveric thigh girth was manually expanded to three different levels using a custom, submuscular inflation system before and after hair clipping; Phase 2 - TC of Golden Retrievers was measured with and without laser guidance. TC measurements for both phases were performed by four observers in triplicate resulting in a total of 552 measurements. </p><p class="AbstractSummary"><strong>Results:</strong> Phase 1 - TC measurements before and after hair clipping were significantly different (3.44cm difference, p&lt;0.001). Overall inter-observer and intra-observer variability were 2.26±1.18cm and 0.90±0.61cm, respectively. Phase 2 - Laser guidance nominally improved inter-observer variability (3.34 ±1.09cm versus 4.78 ±2.60cm) but did not affect intra-observer variability (1.14 ±0.66cm versus 1.13 ±0.77cm).</p><p class="AbstractSummary"><strong>Conclusion: </strong>TC measurement is a low fidelity outcome measure with a large inter- and intra-observer variability even under controlled conditions in a cadaveric setting. Current methods of canine TC measurement may not produce a valid outcome measurement. If utilised, hair coat clipping status should be considered and an intra-observer variability of at least 1cm should be assumed when comparing repeated TC measurements. Laser guidance may be helpful to nominally reduce inter-observer variability in settings with multiple observers. Further investigation of alternative methods for canine TC measurement should be pursued.<strong></strong></p><p class="AbstractSummary"><strong>Application:</strong> This information should be considered by everyone utilizing TC measurements as an outcome assessment for clinical or research purposes. </p><br /> <img src="https://www.veterinaryevidence.org/rcvskmod/icons/oa-icon.jpg" alt="Open Access" /> <img src="https://www.veterinaryevidence.org/rcvskmod/icons/pr-icon.jpg" alt="Peer Reviewed" />

2021 ◽  
Vol 33 (1) ◽  
Author(s):  
Joao Gabriel Rosa Ramos ◽  
Sandra Cristina Hernandes ◽  
Talita Teles Teixeira Pereira ◽  
Shana Oliveira ◽  
Denis de Melo Soares ◽  
...  

Abstract Background: Clinical pharmacists have an important role in the intensive care unit (ICU) team but are a scarce resource. Our aim was to evaluate the impact of on-site pharmacists on medical prescriptions in the ICU. Methods: This was a retrospective, quasi-experimental, controlled before-after study in two ICUs. Interventions by pharmacists were evaluated in phase 1 (February to November 2016) and phase 2 (February to May 2017) in ICU A (intervention) and ICU B (control). In phase 1, both ICUs had a telepharmacy service in which medical prescriptions were evaluated and interventions were made remotely. In phase 2, an on-site pharmacist was introduced in ICU A, but not in ICU B. We compared the number of interventions that were accepted in phase 1 versus phase 2. Results: During the study period, 8797/9603 (91.6%) prescriptions were evaluated, and 935 (10.6%) needed intervention. In phase 2, there was an increase in the proportion of interventions that were accepted by the physician in comparison to phase 1 (93.9% versus 76.8%, P < 0.001) in ICU A, but there was no change in ICU B (75.2% versus 73.9%, P = 0.845). Conclusion: An on-site pharmacist in the ICU was associated with an increase in the proportion of interventions that were accepted by physicians.
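The phase comparison above is a difference between two proportions. A minimal sketch of how such a comparison can be tested, assuming a two-proportion z-test (the abstract does not state which test the authors used) and illustrative counts chosen only to match the reported ICU A percentages:

```python
from statsmodels.stats.proportion import proportions_ztest

# Accepted interventions / total interventions in ICU A.
# Counts are illustrative: the abstract reports only the
# percentages (76.8% in phase 1, 93.9% in phase 2).
accepted = [430, 310]  # phase 1, phase 2
total = [560, 330]     # phase 1, phase 2

stat, pvalue = proportions_ztest(count=accepted, nobs=total)
print(f"z = {stat:.2f}, p = {pvalue:.4f}")
```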


2014 ◽  
Vol 2 (1) ◽  
pp. 1-124 ◽  
Author(s):  
Caroline L Watkins ◽  
Stephanie P Jones ◽  
Michael J Leathley ◽  
Gary A Ford ◽  
Tom Quinn ◽  
...  

Background: Rapid access to emergency stroke care can reduce death and disability by enabling immediate provision of interventions such as thrombolysis, physiological monitoring and stabilisation. One of the ways that access to services can be facilitated is through emergency medical service (EMS) dispatchers. The sensitivity of EMS dispatchers for identifying stroke is < 50%. Studies have shown that activation of the EMS is the single most important factor in the rapid triage and treatment of acute stroke patients.
Objectives: To facilitate recognition of stroke by emergency medical dispatchers (EMDs).
Design: An eight-phase mixed-methods study. Phase 1: a retrospective cohort study exploring stroke diagnosis. Phase 2: semi-structured interviews exploring public and EMS interactions. Phases 3 and 4: a content analysis of 999 calls exploring the interaction between the public and EMDs. Phases 5–7: development and implementation of stroke-specific online training (based on phases 1–4). Phase 8: an interrupted time series exploring the impact of the online training.
Setting: One ambulance service and four hospitals.
Participants: Patients arriving at hospital by ambulance with stroke suspected somewhere on the stroke pathway (phases 1 and 8). Patients arriving at hospital by ambulance with a final diagnosis of stroke (phase 2). Calls to the EMS relating to phase 1 patients (phases 3 and 4). EMDs (phase 7).
Interventions: A stroke-specific online training package, designed to improve recognition of stroke by EMDs.
Main outcome measures: Phase 1: symptoms indicative of a final and dispatch diagnosis of stroke. Phase 2: factors involved in the decision to call the EMS when stroke is suspected. Phases 3 and 4: keywords used by the public when describing stroke and non-stroke symptoms to EMDs. Phase 8: proportion of patients with a final diagnosis of stroke correctly dispatched as stroke by EMDs.
Results: Phase 1: for patients with a final diagnosis of stroke, facial weakness and speech problems were significantly associated with an EMD code of stroke. Phase 2: four factors were identified: perceived seriousness; seeking and receiving lay or professional advice; the caller's description of symptoms; and emotional response to symptoms. Phases 3 and 4: mention of 'stroke' or one or more Face Arm Speech Test (FAST) items was much more common in stroke than in non-stroke calls. Consciousness level was often difficult for callers to determine and/or communicate. Phase 8: there was a significant difference (p = 0.003) in the proportions correctly dispatched as stroke: before the training was implemented, 58 out of 92 (63%); during implementation of the training, 42 out of 48 (88%); and after the training was implemented, 47 out of 59 (80%).
Conclusions: EMDs should be aware that callers are likely to describe loss of function (e.g. unable to grip) rather than symptoms (e.g. weakness) and that callers using the word 'stroke' or describing facial weakness, limb weakness or speech problems are likely to be calling about a stroke. Ambiguities and contradictions in dialogue about consciousness level arise during ambulance calls for suspected and confirmed stroke. The online training package improved recognition of stroke by EMDs. Recommendations for future research include testing the effectiveness of the Emergency Stroke Calls: Obtaining Rapid Telephone Triage (ESCORTT) training package on the recognition of stroke across other EMSs in England, and exploring the impact of early identification of stroke by call handlers on patient and process outcomes.
Funding: The National Institute for Health Research Programme Grants for Applied Research programme.
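The phase 8 counts reported above (58/92, 42/48 and 47/59 correctly dispatched) form a 3 x 2 contingency table. The abstract does not name the test behind p = 0.003; a chi-squared test of independence is one plausible way to check such a difference, sketched below:

```python
import numpy as np
from scipy.stats import chi2_contingency

# Correctly dispatched as stroke vs. not, by training period
# (counts taken from the phase 8 results above).
table = np.array([
    [58, 92 - 58],  # before training implemented
    [42, 48 - 42],  # during implementation
    [47, 59 - 47],  # after training implemented
])

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")
```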


2019 ◽  
Vol 97 (Supplement_3) ◽  
pp. 81-81
Author(s):  
Alini Veira ◽  
Luan S Santos ◽  
Alicia Fraga ◽  
Paulo Campos ◽  
Raphael Caetano ◽  
...  

Abstract Recent studies have shown that feed intake, nutrient metabolism and nutrient utilization may vary over the 24-h circadian period. In this regard, this study aimed at evaluating the impact on performance of switching from conventional to sequential feeding programs, using diets that differ in amino acid content over the day, for growing–finishing pigs. Sixty-eight barrows (25 ± 2.04 kg BW) were assigned to 4 feeding programs (17 animals per treatment): 1) conventional feeding (CONV), in which pigs received 100% of standardized ileal digestible (SID) AA recommendations for the entire day; 2) sequential feeding (SEQ80-120), providing 80% of SID AA recommendations from 2400 to 1159 h and 120% from 1200 to 2359 h; 3) sequential feeding (SEQ70-130), providing 70% of SID AA recommendations from 2400 to 1159 h and 130% from 1200 to 2359 h; and 4) sequential feeding (SEQ60-140), providing 60% of SID AA recommendations from 2400 to 1159 h and 140% from 1200 to 2359 h. The experimental period lasted 82 d and was subdivided into 3 phases: phase 1 (0 to 28 d), phase 2 (29 to 54 d) and phase 3 (55 to 82 d). The data were analyzed using the MIXED procedure in SAS (SAS Inst. Inc., Cary, NC). SEQ80-120 and SEQ60-140 did not improve performance compared to CONV (P > 0.05). However, ADFI, ADG and BW were higher for SEQ70-130 than for CONV during phase 1 (1.49 vs 1.30 kg/d; 0.74 vs 0.65 kg/d; 46.55 vs 43.40 kg, respectively; P < 0.05). During phase 2, BW tended to be higher for SEQ70-130 than for CONV (69.20 vs 63.60 kg; P = 0.08). Over the entire experimental period, ADFI tended to be higher for SEQ70-130 than for CONV (2.08 vs 1.89 kg/d; P = 0.10). According to our results, the SEQ70-130 sequential feeding program improves performance of growing–finishing pigs at the beginning of the growing period.
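The abstract cites the SAS MIXED procedure but does not give the model specification. As a rough, hypothetical Python analogue, a linear mixed model with fixed effects for treatment and phase and a random intercept per pig (to account for repeated measures) could look like the sketch below; the file and column names are assumptions:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical per-pig, per-phase data with columns:
# pig, treatment (CONV, SEQ80-120, ...), phase (1-3), adg (kg/d).
df = pd.read_csv("pig_performance.csv")

# Fixed effects for treatment, phase and their interaction;
# random intercept per pig for the repeated measurements.
model = smf.mixedlm("adg ~ C(treatment) * C(phase)",
                    data=df, groups=df["pig"])
result = model.fit()
print(result.summary())
```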


2016 ◽  
Vol 69 (3) ◽  
Author(s):  
Heather Neville ◽  
Larry Broadfield ◽  
Claudia Harding ◽  
Shelley Heukshorst ◽  
Jennifer Sweetapple ◽  
...  

Background: Pharmacy technicians are expanding their scope of practice, often in partnership with pharmacists. In oncology, such a shift in responsibilities may lead to workflow efficiencies, but may also cause concerns about patient risk and medication errors.
Objectives: The primary objective was to compare the time spent on order entry and order-entry checking before and after training of a clinical support pharmacy technician (CSPT) to perform chemotherapy order entry. The secondary objectives were to document workflow interruptions and to assess medication errors.
Methods: This before-and-after observational study investigated chemotherapy order entry for ambulatory oncology patients. Order entry was performed by pharmacists before the process change (phase 1) and by 1 CSPT after the change (phase 2); order-entry checking was performed by a pharmacist during both phases. The tasks were timed by an independent observer using a personal digital assistant. A convenience sample of 125 orders was targeted for each phase. Data were exported to Microsoft Excel software, and timing differences for each task were tested with an unpaired t test.
Results: Totals of 143 and 128 individual orders were timed for order entry during phase 1 (pharmacist) and phase 2 (CSPT), respectively. The mean total time to perform order entry was greater during phase 1 (1:37 min versus 1:20 min; p = 0.044). Totals of 144 and 122 individual orders were timed for order-entry checking (by a pharmacist) in phases 1 and 2, respectively, and there was no difference in mean total time for order-entry checking (1:21 min versus 1:20 min; p = 0.69). There were 33 interruptions not related to order entry (totalling 39:38 min) during phase 1 and 25 interruptions (totalling 30:08 min) during phase 2. Three errors were observed during order entry in phase 1 and one error during order-entry checking in phase 2; the errors were rated as having no effect on patient care.
Conclusions: Chemotherapy order entry by a trained CSPT appeared to be just as safe and efficient as order entry by a pharmacist. Changes in pharmacy technicians' scope of practice could increase the amount of time available for pharmacists to provide direct patient care in the oncology setting.
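The timing comparison described in the Methods is an unpaired t test on per-order times. A minimal sketch with scipy, using hypothetical timings in seconds (the study's means were 97 s for pharmacist entry and 80 s for CSPT entry):

```python
from scipy.stats import ttest_ind

# Per-order entry times in seconds (hypothetical values; the
# study timed 143 pharmacist and 128 CSPT orders).
phase1_times = [97, 104, 88, 112, 95, 101]  # pharmacist (phase 1)
phase2_times = [80, 85, 78, 90, 74, 83]     # CSPT (phase 2)

# Unpaired (two-sample) t test, as described in the Methods.
stat, pvalue = ttest_ind(phase1_times, phase2_times)
print(f"t = {stat:.2f}, p = {pvalue:.3f}")
```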


2013 ◽  
Vol 1 (15) ◽  
pp. 1-208 ◽  
Author(s):  
S Mason ◽  
C O’Keeffe ◽  
A Carter ◽  
R O’Hara ◽  
C Stride

Background: A major reform of junior doctor training was undertaken in 2004–5, with the introduction of foundation training (FT) to address perceived problems with work structure, conditions and training opportunities for postgraduate doctors. The well-being and motivation of junior doctors within the context of this change to training (and other changes, such as restrictions in junior doctors' working hours and increasing demand for health care), and the consequent impact upon the quality of care provided, are not well understood.
Objectives: This study aimed to evaluate the well-being of foundation year 2 (F2) doctors in training. Phase 1 describes the aims of delivering foundation training, with a focus on the role of training in supporting the well-being of F2 doctors, and assesses how FT is implemented on a regional basis, particularly in emergency medicine (EM). Phase 2 identifies how F2 doctor well-being and motivation are influenced over F2, specifically in relation to EM placements and the quality of care provided to patients.
Methods: Phase 1 used semistructured interviews and focus groups with postgraduate deanery leads, training leads (TLs) and F2 doctors to explore the strategic aims and implementation of FT, focusing on the specialty of EM. Phase 2 was a 12-month online longitudinal study of F2 doctors measuring levels of, and changes in, well-being and motivation. In a range of specialties, one of which was EM, data from measures of well-being, motivation, intention to quit, confidence and competence, and job-related characteristics (e.g. work demands, task feedback, role clarity) were collected at four time points. In addition, we examined F2 doctor well-being in relation to quality of care by reviewing clinical records (criterion-based and holistic reviews) relating to head injury and chronic obstructive pulmonary disease (COPD) during the emergency department (ED) placement.
Results: Phase 1 of the study found that variation exists in how successfully FT is implemented locally; F2 lacks a clearly defined end point; there is a minimal focus on the well-being of F2 doctors (only on the few already shown to be 'in difficulty'); the ED presented a challenging but worthwhile learning environment requiring a significant amount of support from senior ED staff; and disagreement existed about the performance and confidence levels of F2 doctors. A total of 30 EDs in nine postgraduate medical deaneries participated in phase 2, with 217 foundation doctors completing the longitudinal study. F2 doctors reported significantly increased confidence in managing common acute conditions and undertaking practical procedures over their second foundation year, with the biggest increase in confidence and competence associated with their ED placement. F2 doctors had levels of job satisfaction and anxiety/depression that were comparable to or better than those of other NHS workers, and adequate quality and safety of care were being provided for head injury and COPD.
Conclusions: There are ongoing challenges in delivering high-quality FT at the local level, especially in time-pressured specialties such as EM. There are also challenges in how FT detects and manages doctors who are struggling with their work. The survey was the first to document the well-being of foundation doctors over the course of their second year, and average scores compared well with those of other doctors and health-care workers. F2 doctors are benefiting from the training provided, as we found improvements in perceived confidence and competence over the year, with the ED placement being of most value to F2 doctors in this respect. Although adequate quality of care was demonstrated, we found no significant relationships between the well-being of foundation doctors and the quality of care they provided to patients, suggesting the need for further work in this area.
Funding: The National Institute for Health Research Health Services and Delivery Research programme.


2019 ◽  
Author(s):  
Mandi L Klamerus ◽  
Laura J Damschroder ◽  
Jordan B Sparks ◽  
Sarah E Skurla ◽  
Eve A Kerr ◽  
...  

Background: Overtreatment and overtesting expose patients to unnecessary, wasteful, and potentially harmful care. Reducing overtreatment or overtesting that has become ingrained in current clinical practice and is delivered on a routine basis will require solutions that incorporate a deep understanding of multiple perspectives, particularly those on the front lines of clinical care: patients and their clinicians. Design approaches are a promising and innovative way to incorporate stakeholder needs, desires, and challenges to develop solutions to complex problems.
Objective: This study aimed (1) to engage patients in a design process to develop high-level deintensification strategies for primary care (ie, strategies for scaling back or stopping routine medical services that more recent evidence reveals are not beneficial) and (2) to engage both patients and primary care providers in further co-design to develop and refine the broad deintensification strategies identified in phase 1.
Methods: We engaged stakeholders in design charrettes—intensive workshops in which key stakeholders are brought together to develop creative solutions to a specific problem—focused on deintensification of routine overuse in primary care. We conducted the study in 2 phases: a 6.5-hour design charrette with 2 different groups of patients (phase 1) and a subsequent 4-hour charrette with clinicians and a subgroup of phase 1 patients (phase 2). Both phases included surveys and educational presentations related to deintensification. Phase 1 involved several design activities (mind mapping, business origami, and empathy mapping) to help patients gain a deeper understanding of the individuals involved in deintensification. Following that, we asked participants to review hypothetical scenarios in which patients, clinicians, or the broader health system context posed a barrier to deintensification and then to brainstorm solutions. The deintensification themes identified in phase 1 were used to guide phase 2. This second phase primarily involved 1 design activity (WhoDo). In this activity, patients and clinicians worked together to develop concrete actions that specific stakeholders could take to support deintensification efforts. This activity included identifying barriers to the actions and approaches to overcoming those barriers.
Results: A total of 35 patients participated in phase 1, and 9 patients and 7 clinicians participated in phase 2. The analysis of the deintensification strategies and survey data is currently underway. The results are expected to be submitted for publication in early 2020.
Conclusions: Health care interventions are frequently developed without input from the people who are most affected. The exclusion of these stakeholders in the design process often influences and limits the impact of the intervention. This study employed design charrettes, guided by a flexible user-centered design model, to bring clinicians and patients with differing backgrounds and different expectations together to cocreate real-world solutions to the complex issue of deintensifying medical services.


2021 ◽  
Vol 8 (Supplement_1) ◽  
pp. S150-S150
Author(s):  
Carlos M Nunez ◽  
Arun Mattappallil ◽  
Katie A McCrink ◽  
Debbie Rybak ◽  
Basil Taha ◽  
...  

Abstract Background: Fluoroquinolone (FQ) antibiotics are frequently used in hospitalized patients to treat a wide range of infections but are often misused and implicated in antibiotic-associated adverse events. The purpose of this study was to evaluate the impact of Infectious Disease fellow (IDF)-driven antimicrobial stewardship program (ASP) interventions on inpatient FQ use. Methods: This is a retrospective study of all admitted patients who received a FQ for greater than 48 hours from 01/01/2019 to 12/31/2020 in an urban academic center. "Phase 1" (pre-intervention phase) covered 01/01/2019 to 03/31/2019. "Phase 2" (intervention phase) covered 03/03/2020 to 12/23/2020. In "Phase 2", our ASP reviewed FQ use 2-3 days per week and an IDF provided feedback interventions, averaging 30-60 minutes of IDF time per day. We categorized FQ use as "appropriate", "appropriate but not preferred", or "inappropriate", as determined by local clinical guidelines and ASP team opinion. We compared FQ use in both phases, indications for FQ use, and new Clostridioides difficile infections (CDI). Results: A total of 386 patients were included (76 in "Phase 1" and 310 in "Phase 2"). Patient characteristics were similar (Table 1). Overall, 63% of FQ use was empiric; 50% of FQ use was deemed "appropriate", 28% "appropriate but not preferred", and 22% "inappropriate". In "Phase 2", 126 interventions were conducted, with 86% of these accepted. Appropriate FQ use increased significantly in "Phase 2" vs. "Phase 1" (53.5% vs 35.5%, p = 0.008), with a decrease in mean days of FQ use (4.38 days vs 5.87 days, p = 0.021). Table 2 shows "appropriate" FQ use by clinical indication. New CDIs occurred more frequently in "Phase 1" vs. "Phase 2" (6.6% vs 0.6%, p = 0.001). Conclusion: An IDF-driven ASP intervention had a positive impact on appropriate inpatient use of FQs in our hospital. This highlights a promising ASP model which not only improves appropriate use of FQs, but also offers an opportunity for IDF mentorship and use of available resources to promote ASPs. Disclosures: Katie A. McCrink, PharmD, ViiV Healthcare (Employee)


2021 ◽  
Vol 8 (Supplement_1) ◽  
pp. S182-S182
Author(s):  
Xue Fen Valerie Seah ◽  
Yue Ling Rina Ong ◽  
Wei Ming Cedric Poh ◽  
Shahul Hameed Mohamed Siraj ◽  
Kai-Qian Kam ◽  
...  

Abstract Background: Antimicrobial stewardship programs (ASP) aim to improve appropriate antimicrobial use. Post-operative antibiotics are generally not necessary, especially in those without surgical site infection (SSI) risk factors (e.g. obesity). Few studies have described the impact of ASP interventions on patient outcomes, especially in unique populations such as obstetrics. This study aimed to evaluate the impact of ASP interventions on post-elective caesarean (eLSCS) oral antibiotic prophylaxis use and patient outcomes, including SSI rates. Methods: This pre-post quasi-experimental study was conducted over 9 months (2 months pre- and 7 months post-intervention) in all women admitted for eLSCS in our institution. Interventions included dissemination of an eLSCS surgical prophylaxis guideline, which recommended a single antibiotic dose within 60 minutes before skin incision. Post-eLSCS oral antibiotics were actively discouraged in those without SSI risk factors. This was followed by ASP intervention notes (phase 1) for 3 months, with an additional phone call to the ward team for the next 7 months (phase 2). Phase 3 (the next 6 months) constituted speaking to the operating consultant. The primary outcome was the post-operative oral antibiotic prescription rate. Secondary outcomes included rates of 30-day post-operative SSI. Results: A total of 1751 women were reviewed. Appropriateness of pre-operative antibiotic prophylaxis was 98% in our institution. There were 244 women pre-intervention, 274 in post-intervention phase 1, 658 in phase 2 and 575 in phase 3. The pre-intervention post-eLSCS antibiotic prescribing rate was 82% (200), which reduced significantly post-intervention to 54% (148) in phase 1, 50% (331) in phase 2 and 39% (226) in phase 3 (p < 0.001). There was no significant difference in the proportion of patients who developed post-operative SSI pre- versus post-intervention (0.8%, 2 of 242 vs. 1.9%, 28 of 1479, p = 0.420), or between those who received post-operative oral antibiotics and those who did not (1.9%, 17 of 905 vs. 1.5%, 13 of 846, p = 0.582). Conclusion: ASP interventions can reduce post-eLSCS antibiotic prophylaxis rates without adversely impacting patient safety. Disclosures: All Authors: No reported disclosures


PLoS ONE ◽  
2020 ◽  
Vol 15 (11) ◽  
pp. e0241804
Author(s):  
Dong Eun Lee ◽  
Hyun Wook Ryoo ◽  
Sungbae Moon ◽  
Jeong Ho Park ◽  
Sang Do Shin

Improving outcomes after out-of-hospital cardiac arrest (OHCA) requires an integrated approach that strengthens the chain of survival and emergency care systems. This study aimed to identify the change in outcomes over a decade and the effect of a citywide intervention on good neurologic outcomes after OHCA in Daegu. This is a before-and-after intervention study examining the association between a citywide intervention to improve the chain of survival and outcomes after OHCA. The primary outcome was a good neurologic outcome, defined as a cerebral performance category score of 1 or 2. After dividing the study period into 3 phases according to the citywide intervention, trends in outcomes after OHCA by primary electrocardiogram rhythm were assessed. Logistic regression analysis was used to analyze the association between the phases and outcomes. Overall, 6203 patients with OHCA were eligible. Over 10 years (2008–2017), the rate of survival to discharge and the rate of good neurologic outcomes increased from 2.6% to 8.7% and from 1.5% to 6.6%, respectively. Especially for patients with an initial shockable rhythm, these changes in outcomes were more pronounced (survival to discharge: 23.3% in 2008 to 55.0% in 2017; good neurologic outcomes: 13.3% to 46.0%). Compared with phase 1, the adjusted odds ratio (AOR) for good neurologic outcomes was 1.20 (95% confidence interval (CI): 0.78–1.85) for phase 2 and 1.64 (95% CI: 1.09–2.46) for phase 3. For patients with an initial shockable rhythm, the AOR for good neurologic outcomes was 3.76 (95% CI: 1.88–7.52) for phase 2 and 5.51 (95% CI: 2.77–10.98) for phase 3. Citywide improvement was observed in good neurologic outcomes after OHCAs of medical origin, and the citywide intervention was significantly associated with better outcomes, particularly in those with an initial shockable rhythm.
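Adjusted odds ratios like those reported above come from exponentiating logistic regression coefficients. A minimal sketch, assuming a hypothetical per-patient table and covariate set (the abstract does not list the variables actually adjusted for):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical per-patient data: good_outcome (0/1), phase (1-3),
# and assumed confounders such as age, sex and initial rhythm.
df = pd.read_csv("ohca_outcomes.csv")

model = smf.logit("good_outcome ~ C(phase) + age + C(sex) + C(shockable)",
                  data=df)
result = model.fit()

# Exponentiated coefficients give AORs; exponentiated confidence
# bounds give the 95% CIs.
aor = np.exp(result.params).rename("AOR")
ci = np.exp(result.conf_int()).rename(columns={0: "2.5%", 1: "97.5%"})
print(pd.concat([aor, ci], axis=1))
```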


2020 ◽  
Author(s):  
Changju Liao ◽  
Linghong Guo ◽  
Han Wang ◽  
Tengyong Wang ◽  
Yuyang Zhang ◽  
...  

Abstract Background: Falls are a serious public health problem associated with irreversible health consequences and a substantial economic burden. To effectively reduce the incidence of falls and mitigate fall-related injuries, we designed and verified a multifactorial fall intervention model. Methods: The current study was a longitudinal before-and-after controlled investigation comprising 3 phases: clinical characteristics of fall patients were retrospectively identified in phase 1, and a multifactorial fall intervention model was designed in phase 2 and prospectively evaluated in phase 3. Phases 1 and 2 were based on 153,601 hospitalized patients between January 2015 and December 2016. Phase 3 was carried out on 171,776 hospitalized patients between January 2017 and December 2018. The Pearson Chi-squared test was used to compare categorical variables and the Mann-Whitney non-parametric test was used for one-way ordered data. Results: In phase 1, baseline characteristics of 491 fall patients revealed that inpatient falls were highly associated with age, medication and disease. In phase 2, a new multifactorial fall intervention model covering measures for fall prevention, fall-onset management and continuous improvement was developed. Phase 3 recorded a total of 396 falls and demonstrated a markedly lower fall rate (a reduction of 0.09%, p < 0.001) and fall rate per 1000 patient-days (a reduction of 0.07‰, p < 0.001) compared with phase 1. The adjusted incidence rate ratio for falls (phase 1 vs. phase 3) was 1.443 (95% CI: 1.263–1.647). Furthermore, the occurrence and severity of fall injuries in phase 3 were significantly lower than those in phase 1 (Z = -4.426, p < 0.001). More specifically, uninjured falls accounted for 42.42% of falls in phase 3 compared with 32.99% in phase 1. Conclusions: This multifactorial fall intervention model exhibited a favorable effect on reducing the occurrence of falls and fall injuries.
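The fall rate per 1000 patient-days reported above is a simple incidence rate, and the ratio of two such rates gives an (unadjusted) incidence rate ratio. A worked sketch with hypothetical patient-day denominators, since the abstract reports patient counts rather than patient-days:

```python
# Falls and patient-days per phase. Fall counts come from the
# abstract (491 in phase 1, 396 in phase 3); patient-day
# denominators are hypothetical placeholders.
falls_p1, days_p1 = 491, 500_000
falls_p3, days_p3 = 396, 560_000

rate_p1 = 1000 * falls_p1 / days_p1  # falls per 1000 patient-days
rate_p3 = 1000 * falls_p3 / days_p3

irr = rate_p1 / rate_p3  # unadjusted incidence rate ratio, phase 1 vs 3
print(f"{rate_p1:.2f} vs {rate_p3:.2f} falls/1000 patient-days, IRR = {irr:.3f}")
```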

