Using evidence-based debriefing to combat moral distress in critical care nurses: A pilot project

2019 ◽  
Vol 9 (12) ◽  
pp. 1 ◽  
Author(s):  
Nicole M. Fontenot ◽  
Krista A. White

Objective: Moral distress (MD) is a problem for nurses that may cause despair or disempowerment. MD can have consequences such as dissatisfaction or resignation from the nursing profession. Techniques such as evidence-based debriefing may help nurses cope with MD. Creating opportunities for critical care nurses to debrief about their MD might equip them with the tools needed to overcome it. Measuring MD with the Moral Distress Thermometer (MDT) could provide insight into how debriefings help nurses. The purpose of this pilot project was to examine the impact of evidence-based debriefing sessions on critical care nurses' sense of MD. Methods: This pilot project used a quasi-experimental, one-group, before-during-after design. Critical care nurses (N = 21) were recruited from one unit at a large academic medical center. Four debriefing sessions were held, one every 2 weeks. Participants completed the MDT 2 weeks before the first session, at the end of each session they attended, and 1 month after the debriefing sessions ended. Results: Participants felt that debriefing was helpful: it increased their self-awareness, gave them time to commune with colleagues, and encouraged them to improve self-care habits. However, MDT scores did not change significantly from pre-intervention to post-intervention (t(12) = 0.78, p = .450). Conclusions: Debriefing may help nurses gain self-awareness of MD, and it may offer nurses strategies for building moral resilience.

2017 ◽  
Vol 28 (4) ◽  
pp. 351-358 ◽  
Author(s):  
Lesly Kelly ◽  
Michael Todd

Background: Burnout is a concern for critical care nurses in high-intensity environments. Studies have highlighted the importance of a healthy work environment in promoting optimal nurse and patient outcomes, but research examining the relationship between a healthy work environment and burnout is limited. Objective: To examine how healthy work environment components relate to compassion fatigue (eg, burnout, secondary trauma) and compassion satisfaction. Methods: Nurses (n = 105) in 3 intensive care units at an academic medical center completed a survey including the Professional Quality of Life and the American Association of Critical-Care Nurses' Healthy Work Environment standards. Results: Regression models using each Healthy Work Environment component to predict each outcome, adjusting for background variables, showed that the 5 Healthy Work Environment components predicted burnout and that meaningful recognition and authentic leadership predicted compassion satisfaction. Conclusions: Findings on associations between healthy work environment standards and burnout suggest the potential importance of implementing the American Association of Critical-Care Nurses' Healthy Work Environment standards as a mechanism for decreasing burnout.


1995 ◽  
Vol 4 (4) ◽  
pp. 280-285 ◽  
Author(s):  
MC Corley

BACKGROUND: When healthcare organizations constrain nurses from taking actions the nurses believe are appropriate, moral distress may result. OBJECTIVE: To present findings on moral distress of critical care nurses, using an investigator-developed instrument. METHODS: An instrument development design using consensus by three expert judges, test-retest reliability, and factor analysis was used. Study participants (N = 111) were members of a chapter of the American Association of Critical-Care Nurses, critical care nurses employed in a large medical center, and critical care nurses from a private hospital. The 32-item instrument included items on prolonging life, performing unnecessary tests and treatments, lying to patients, and incompetent or inadequate treatment by physicians. RESULTS: Three factors were identified using factor analysis after expert consensus on the items: aggressive care, honesty, and action response. Nurses in the private hospital reported significantly greater moral distress on the aggressive care factor than did nurses in the medical center. Nurses not working in intensive care experienced higher levels of moral distress on the aggressive care factor than did nurses working in intensive care. Of the 111 nurses, 12% had left a nursing position primarily because of moral distress. CONCLUSIONS: Although the mean scores showed somewhat low levels of moral distress, the range of responses revealed that some nurses experienced high levels of moral distress over these issues. Research is needed on the conditions organizations must provide to support the moral integrity of critical care nurses.


2019 ◽  
Vol 6 (Supplement_2) ◽  
pp. S412-S412
Author(s):  
Bhagyashri D Navalkele ◽  
Nora Truhett ◽  
Miranda Ward ◽  
Sheila Fletcher

Abstract Background The high regulatory burden attached to hospital-onset (HO) infections has increased performance pressure on infection prevention programs. Despite the availability of comprehensive prevention guidelines, a major challenge has been communicating with frontline staff to integrate appropriate prevention measures into practice. The objective of our study was to evaluate the impact of an educational intervention on HO CAUTI rates and urinary catheter days. Methods At the University of Mississippi Medical Center, infection prevention (IP) staff report unit-based monthly HO infections via email to the respective unit managers and ordering physician providers. Starting May 2018, IP assessed compliance with CAUTI prevention strategies per SHEA/IDSA practice recommendations (2014). HO CAUTI cases with noncompliance were labeled "preventable" infections, and an educational justification was provided in the email report. No other interventions were introduced during the study period. CAUTI data were collected using ongoing surveillance per NHSN and used to calculate rates per 1,000 catheter days. One-way analysis of variance (ANOVA) was used to compare pre- and post-intervention data. Results Prior to the intervention (July 2017–March 2018), the HO CAUTI rate was 1.43 per 1,000 catheter days. In the post-intervention period (July 2018–March 2019), the HO CAUTI rate decreased to 0.62 per 1,000 catheter days, a statistically significant reduction (P = 0.04). The total number of catheter days decreased, but the difference was not statistically significant (8,604 vs. 7,583; P = 0.06). Of the 14 HO CAUTIs in the post-intervention period, 64% (8/14) were reported as preventable. The preventable causes included inappropriate urine culturing in asymptomatic patients (5), urine culturing as part of a pan-culture without urinalysis (2), and lack of daily assessment of catheter necessity (1).
Conclusion At our institution, regular educational feedback by IP to frontline staff resulted in a reduction of HO CAUTIs. The feedback improved accountability, awareness, and engagement of frontline staff in practicing appropriate CAUTI prevention strategies. Disclosures All authors: No reported disclosures.
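Several abstracts in this listing report device-associated infection rates "per 1,000 catheter days" (or central-line days). As a minimal illustrative sketch of that arithmetic (the function name and the example figures below are hypothetical, not taken from any of the studies):

```python
def infections_per_1000_device_days(infection_count: int, device_days: int) -> float:
    """NHSN-style device-associated infection rate:
    (number of infections / number of device days) x 1,000."""
    if device_days <= 0:
        raise ValueError("device_days must be positive")
    return infection_count / device_days * 1000

# Hypothetical example: 5 infections over 2,000 catheter days
print(infections_per_1000_device_days(5, 2000))  # 2.5 per 1,000 catheter days
```

Normalizing counts to 1,000 device days, rather than patient days, is what lets units with very different catheter utilization be compared on a common scale.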


2012 ◽  
Vol 19 (4) ◽  
pp. 479-487 ◽  
Author(s):  
Debra L Wiegand ◽  
Marjorie Funk

Little is known about the consequences of moral distress. The purpose of this study was to identify clinical situations that caused nurses to experience moral distress, to understand the consequences of those situations, and to determine whether nurses would change their practice based on their experiences. The investigation used a descriptive approach. Open-ended surveys were distributed to a convenience sample of 204 critical care nurses employed at a university medical center. The analysis of participants’ responses used an inductive approach and a thematic analysis. Each line of the data was reviewed and coded, and the codes were collapsed into themes. Methodological rigor was established. Forty-nine nurses responded to the survey. The majority of nurses had experienced moral distress, and the majority of situations that caused nurses to experience moral distress were related to end of life. The nurses described negative consequences for themselves, patients, and families.


2017 ◽  
Vol 4 (suppl_1) ◽  
pp. S89-S89 ◽  
Author(s):  
Gregory Cook ◽  
Shreena Advani ◽  
Saira Rab ◽  
Sheetal Kandiah ◽  
Manish Patel ◽  
...  

Abstract Background A candidemia treatment bundle (CTB) may increase adherence to guideline-recommended candidemia management and improve patient outcomes. The purpose of this study was to evaluate the impact of a best practice alert (BPA) and order set on compliance with all CTB components and on patient outcomes. Methods A single-center, pre-/post-intervention study was completed at Grady Health System from August 2015 to August 2017. The post-CTB intervention period began August 2016. The CTB included a BPA that fires for blood cultures positive for any Candida species, alerting treating clinicians upon opening the patient's electronic health record. The BPA included a linked order set based on treatment recommendations including: infectious diseases (ID) and ophthalmology consultation, repeat blood cultures, empiric echinocandin therapy, early source control, antifungal de-escalation, intravenous to oral (IV to PO) switch, and duration of therapy. The primary outcome of the study was total adherence to the CTB. Secondary outcomes included adherence to the individual components of the CTB, 30-day mortality, and infection-related length of stay (iLOS). Results Forty-five patients with candidemia were identified in the pre-group and 24 in the CTB group; 27 patients in the pre-group and 19 in the CTB group met inclusion criteria. Total adherence to the CTB occurred in one patient in the pre-group and three patients in the CTB group (4% vs. 16%, P = 0.29). ID was consulted for 15 patients in the pre-group and 17 patients in the CTB group (56% vs. 89%, P = 0.02). Source control occurred in three and 11 patients, respectively (11% vs. 58%, P < 0.01). The bundle components of empiric echinocandin use (81% vs. 100%, P = 0.07), ophthalmology consultation (81% vs. 95%, P = 0.37), and IV to PO switch (22% vs. 32%, P = 0.5) also improved in the CTB group. Repeat cultures and antifungal de-escalation were similar between groups.
Thirty-day mortality decreased by 10 percentage points in the CTB group (26% vs. 16%, P = 0.48). Median iLOS decreased from 30 days in the pre-group to 17 days in the CTB group (P = 0.05). Conclusion The CTB, with a BPA and linked order set, improved guideline-recommended management of candidemia, specifically increasing the rates of ID consultation and early source control. There were nonsignificant numerical improvements in mortality and iLOS. Disclosures All authors: No reported disclosures.


2020 ◽  
Vol 41 (S1) ◽  
pp. s256-s258
Author(s):  
Mary Kukla ◽  
Shannon Hunger ◽  
Tacia Bullard ◽  
Kristen Van Scoyoc ◽  
Mary Beth Hovda-Davis ◽  
...  

Background: Central-line–associated bloodstream infection (CLABSI) rates have steadily decreased as evidence-based prevention bundles have been implemented. Bone marrow transplant (BMT) patients are at increased risk for CLABSI due to immunosuppression, prolonged central-line utilization, and frequent central-line accesses. We assessed the impact of an enhanced prevention bundle on BMT non–mucosal-barrier-injury CLABSI rates. Methods: The University of Iowa Hospitals & Clinics is an 811-bed academic medical center that houses the only BMT program in Iowa. During October 2018, we added 3 interventions to the ongoing CLABSI prevention bundle in our BMT inpatient unit: (1) a standardized 2-person dressing-change team, (2) enhanced-quality daily chlorhexidine treatments, and (3) staff and patient line-care stewardship. The bundle included training of nurse champions to execute a team approach to changing central-line dressings. A standard process description and supplies are contained in a cart, and the 2-person team provides 2 sets of sterile hands plus a second person to monitor for breaches in sterile procedure. Site disinfection with chlorhexidine scrub and dry time are monitored. Training on quality chlorhexidine bathing includes evaluation of the preferred product, application per the product's instructions for use, and protection of the central-line site with a waterproof, shoulder-length glove. In addition to routine BMT education, staff and patients are instructed on device stewardship during dressing changes. CLABSIs are monitored using NHSN definitions. We performed an interrupted time-series analysis to determine the impact of our enhanced prevention bundle on CLABSI rates in the BMT unit, using monthly CLABSI rates from January 2017 until the intervention (October 2018) as baseline. Because the BMT unit changed locations in December 2018, we included both time points in our analysis.
As a sensitivity analysis, we assessed the impact of the enhanced prevention bundle in a hematology-oncology unit (March 2019) that did not change locations. Results: During the period preceding bundle implementation, the CLABSI rate was 2.2 per 1,000 central-line days. After the intervention, the rate decreased to 0.6 CLABSIs per 1,000 central-line days (P = .03). The move in unit location did not have a significant impact on CLABSI rates (P = .85). CLABSI rates also decreased from 1.6 per 1,000 central-line days to 0 per 1,000 central-line days (P < .01) in the hematology-oncology unit. Conclusions: An enhanced CLABSI prevention bundle was associated with significant decreases in CLABSI rates in 2 high-risk units. Novel infection prevention bundle elements should be considered for special populations when all other evidence-based recommendations have been implemented. Funding: None. Disclosures: None


2016 ◽  
Author(s):  
Melissa A. Wilson

This dissertation explores the presence of moral distress and effective interventions to lessen its impact on critical care nurses. Manuscript one was completed prior to entering the doctor of philosophy in nursing program but was instrumental in building the foundation for the successive work within this dissertation. An exploratory, descriptive study design was used to examine moral distress and identify situations in which nurses experienced high levels of moral distress. Nurses completed a 38-item moral distress scale and a coping questionnaire, and indicated their preferred methods of institutional support for managing distressing situations. Manuscript two is a formal analysis of the Moral Distress Theory that identified limitations in the existing theoretical model based on a review of the literature. Finally, manuscript three is a study identifying barriers and values during morally distressing situations that can be used to target interventions aimed at lessening the impact of moral distress.


2017 ◽  
Vol 4 (suppl_1) ◽  
pp. S398-S398 ◽  
Author(s):  
Werner Bischoff ◽  
Andrey Bubnov ◽  
Elizabeth Palavecino ◽  
James Beardsley ◽  
John Williamson ◽  
...  

Abstract Background Clostridium difficile infections (CDI) pose a growing threat to hospitalized patients. This study assesses the impact of changing from a nucleic acid amplification test (NAAT) to a stepwise testing algorithm (STA) using an enzyme immunoassay (GDH and toxin A/B) with NAAT confirmation in specific cases. Methods In an 885-bed academic medical center, a 24-month pre-/post design was used to assess the effect of the STA on the following parameters: rates of enterocolitis due to C. difficile (CDE); NHSN C. difficile LabID events; CDI complications; mortality; antimicrobial prescription patterns; cluster occurrences; and testing, treatment, and isolation costs. Inpatient data were extracted from ICD-9/10 diagnosis codes and from infection prevention and laboratory databases. Results The STA significantly decreased the number of CDE ICD-9/10 codes and the HO, CO, and CO-HCFA C. difficile LabID event rates by 65%, 78%, 75%, and 75%, respectively. Similar reductions were noted for associated complications such as NHSN-defined colon surgeries (-61%), megacolon (-64%), and acute kidney failure (-55%). Rates of complications unrelated to CDE remained constant for colon surgeries and acute kidney failure, while the diagnosis of megacolon decreased, though not significantly (-71%; P > 0.05). Inpatient mortality did not change with or without CDE. Significant reductions were observed in the use of oral metronidazole (total: -32%; CDE-specific: -70%) and vancomycin (total: -58%; CDE-specific: -61%). No clusters were detected pre- or post-STA introduction. The need for isolation decreased from 748 to 181 patients post-intervention (-76%; P < 0.05). Annual cost savings exceeded $175,000, driven by decreases in laboratory testing, followed by isolation and antibiotic use. Conclusion The switch from NAAT to an STA did not affect the diagnosis, treatment, or control of clinically relevant CDI in our institution.
Benefits included avoidance of unnecessary antibiotic treatment, reduction in isolation, achievement of publicly reported objectives, and cost savings. Selection of clinically relevant tests can help to improve the hospitalization and treatment of patients and should be considered as part of diagnostic stewardship. Disclosures All authors: No reported disclosures.


2016 ◽  
Vol 37 (11) ◽  
pp. 1361-1366 ◽  
Author(s):  
Elizabeth A. Neuner ◽  
Andrea M. Pallotta ◽  
Simon W. Lam ◽  
David Stowe ◽  
Steven M. Gordon ◽  
...  

OBJECTIVE: To describe the impact of rapid diagnostic microarray technology and antimicrobial stewardship for patients with Gram-positive blood cultures. DESIGN: Retrospective pre-intervention/post-intervention study. SETTING: A 1,200-bed academic medical center. PATIENTS: Inpatients with blood cultures positive for Staphylococcus aureus, Enterococcus faecalis, E. faecium, Streptococcus pneumoniae, S. pyogenes, S. agalactiae, S. anginosus, Streptococcus spp., and Listeria monocytogenes during the 6 months before and after implementation of the Verigene Gram-positive blood culture microarray (BC-GP) with an antimicrobial stewardship intervention. METHODS: Before the intervention, no rapid diagnostic technology or antimicrobial stewardship intervention was used, except for peptide nucleic acid fluorescent in situ hybridization and MRSA agar to identify staphylococcal isolates. After the intervention, all Gram-positive blood cultures underwent BC-GP microarray testing, with an antimicrobial stewardship intervention consisting of real-time notification and pharmacist review. RESULTS: In total, 513 patients with bacteremia were included in this study: 280 patients with S. aureus, 150 patients with enterococci, 82 patients with streptococci, and 1 patient with L. monocytogenes. The number of antimicrobial switches was similar in the pre–BC-GP (52%; 155 of 300) and post–BC-GP (50%; 107 of 213) periods. The time to antimicrobial switch was significantly shorter in the post–BC-GP group than in the pre–BC-GP group: 48±41 hours versus 75±46 hours, respectively (P<.001). The most common antimicrobial switch was de-escalation, and time to de-escalation was significantly shorter in the post–BC-GP group than in the pre–BC-GP group: 53±41 hours versus 82±48 hours, respectively (P<.001).
There was no difference in mortality or hospital length of stay as a result of the intervention. CONCLUSIONS: The combination of a rapid microarray diagnostic test with an antimicrobial stewardship intervention improved time to antimicrobial switch, especially time to de-escalation to optimal therapy, in patients with Gram-positive blood cultures. Infect Control Hosp Epidemiol 2016;1–6


2006 ◽  
Vol 19 (5) ◽  
pp. 275-279 ◽  
Author(s):  
Jennifer E. Stark ◽  
Kimi S. Vesta

Venous thromboembolism (VTE) is among the most preventable causes of hospital death; however, VTE prophylaxis is significantly underused. The purpose of this study was to determine the impact of a pharmacist-initiated screening method on VTE prophylaxis rates. Clinical pharmacists practicing in an internal medicine teaching service at an academic medical center conducted a 6-month pilot project. Consecutive patients admitted to the service were screened for VTE risk factors and bleeding risk factors. Pharmacists made recommendations to the physicians in person, provided monthly educational presentations, and monitored patients daily until discharge to confirm the continued appropriateness of their recommendations. Of the 444 patients screened, 107 were identified as candidates for VTE prophylaxis, and 21 of these patients also had bleeding risk factors. Appropriate use was significantly better after the screening intervention (37% before vs 85% after; P < .05). Moreover, inappropriate use in patients with bleeding risk factors was avoided by the screening intervention (29% before vs 0% after; P < .05). Clear improvements in VTE prophylaxis rates were observed. This pharmacist-initiated screening method presents unique opportunities for pharmacists.

