Online Surveys: A Potential Weapon Against Clinician Non-Compliance

2012 ◽  
Vol 17 (1) ◽  
pp. 38-41 ◽  
Author(s):  
Gregory J. Schears

Abstract Over the last two decades, our understanding of the pathogenesis of central line-associated bloodstream infections has improved significantly. Increased attention has also been focused on reducing healthcare worker exposure to infectious agents. Best practice protocols have been developed to eliminate the unnecessary morbidity, mortality, and costs associated with these infections and exposures. Adoption of these best practices has been incomplete, however, and non-compliance is a major factor preventing our infectious complication rates from reaching zero. Getting at the root cause of non-compliance is complex. Online surveys are uniquely positioned to help us understand the human factors contributing to non-compliance. This article reviews some of the pros and cons associated with the use of online surveys. Using several relevant recent examples, it explores how these surveys can be used to identify the factors that create barriers to compliance. By better understanding all of the issues involved in non-compliance, we will be able to create strategies and engineer products that improve best practice protocol compliance and reduce the human factor contribution to our patients' infectious complications.

Stroke ◽  
2020 ◽  
Vol 51 (Suppl_1) ◽  
Author(s):  
Kari D Moore ◽  
Lynn Hundley ◽  
Polly Hunt ◽  
Bill Singletary ◽  
Allison Merritt ◽  
...  

Background: Evidence shows that systems change interventions improve care and outcomes for stroke patients, but geopolitical boundaries have been a barrier to improving regional systems of care. Despite national, regional, and local efforts, alteplase use for ischemic stroke has remained low and door-to-needle (DTN) times have exceeded 60 minutes. Kentucky created the Stroke Encounter Quality Improvement Project (SEQIP) in 2009 to share best practices and improve stroke systems of care across the Commonwealth. Purpose: The aim was to utilize and share best practice models among 23 SEQIP hospitals in KY to improve tPA utilization, decrease DTN times, and improve outcomes. Methods: Hospitals implemented a statewide quality improvement plan focused on identifying and removing barriers and implementing best practice strategies for thrombolytic therapy. Accountability was maintained through ongoing GWTG data tracking, teleconferences, and face-to-face meetings from January 2009 through December 2018, during which strategies and solutions for best practice were shared. Results: SEQIP’s participating hospitals achieved significant improvement in thrombolytic administration over 10 years. The percentage of all AIS patients receiving tPA increased from 4.61% in 2009 to 8.80% in 2018 (OR=2.0, p<0.0001). Alteplase use in eligible patients arriving within 2 hours and treated by 3 hours improved from 59.6% to 88.5% (OR=5.2, p<0.0001). Alteplase use in eligible patients arriving between 3.5 and 4.5 hours increased from 24.9% to 55.1% (OR=5.0, p<0.0001). Median DTN time decreased from 74 minutes to 49 minutes (p<0.0001). Rates of symptomatic hemorrhage were consistent with NINDS data and remained below 6% from 2009 to 2018. The tPA in-hospital mortality rate decreased from 11.7% in 2009 to 3.6% in 2018 (p=0.00016). The proportion of tPA patients discharged home increased from 28.4% in 2009 to 47.9% in 2018 (p<0.00001), and the proportion able to walk independently at discharge increased from 32.1% to 43.6% (p=0.00359). Conclusions: Geopolitical boundaries can be overcome and collaboration can be sustained among competing hospitals through sharing of best practices to safely increase utilization of tPA in eligible patients, decrease DTN times, and improve outcomes.
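As a quick illustration of how the reported odds ratios relate to the before-and-after treatment proportions, the sketch below recomputes the unadjusted odds ratio for overall tPA use (4.61% in 2009 vs. 8.80% in 2018). The function and variable names are illustrative, and the published ORs may additionally reflect the authors' statistical modeling.

```typescript
// Unadjusted odds ratio from two proportions: OR = [p2 / (1 - p2)] / [p1 / (1 - p1)].
// Proportions are taken from the abstract; names are illustrative, not the authors'.
function oddsRatio(p1: number, p2: number): number {
  const odds = (p: number): number => p / (1 - p);
  return odds(p2) / odds(p1);
}

// Overall tPA use: 4.61% of AIS patients in 2009 vs. 8.80% in 2018.
console.log(oddsRatio(0.0461, 0.088).toFixed(2)); // ≈ 2.00, consistent with the reported OR = 2.0
```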


2020 ◽  
Vol 7 (Supplement_1) ◽  
pp. S446-S446
Author(s):  
Katie Ip ◽  
Leah M Shayer ◽  
Susan M. Lerner ◽  
Leona Kim-Schluger ◽  
Jang Moon

Abstract Background Central line-associated bloodstream infections (CLABSI) have a significant impact on mortality, morbidity, and length of stay. Data collected by the Infection Prevention Department revealed progressive increases in the CLABSI rate on an Abdominal Transplant Unit. Recognizing a drift from best practice, front-line staff, the IP team, and vascular access specialists collaborated to identify opportunities for improving the care of patients with vascular access devices. Methods An increase in the CLABSI rate was observed on the Abdominal Transplant Unit beginning in 2016. An initiative began in 2017 to evaluate whether CLABSI rate reduction was sustainable for at least 1 year and to identify key determinants of this sustainability. Interventions were aimed at infection prevention best practices, care standardization, and team-based monitoring, and included (1) re-education on CLABSI reduction; (2) two-RN dressing changes to validate practice during central line dressing changes; (3) a requirement that blood draws from central lines (in non-emergent situations) be approved by the nurse manager, physician lead, and transplant quality physician; (4) designation of CLABSI prevention nurses as phlebotomists for patients with prior approval; and (5) a daily line review addressing line days, line indication (removing latent lines), and plan of care (transition to permanent access), with this information shared with the unit physician lead and transplant quality team. Audits, timely feedback, and clinician accountability were vital to ensuring compliance with best practices. Results During the intervention, the CLABSI rate dropped from 4.825 to 1.533 per 1,000 CVC days. Conclusion The sustainability plan for this program is to continue line audits, assess line necessity, review the effectiveness of the initiatives, review all new CLABSI data with staff, and implement new changes as necessary. Joint, ongoing multidisciplinary collaboration is essential to reduce CLABSIs and optimize quality in a challenging, high-acuity patient population. Disclosures All Authors: No reported disclosures
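For readers unfamiliar with the metric, the sketch below shows the standard CLABSI rate calculation, infections per 1,000 central-line (CVC) days. The counts in the example are hypothetical; the abstract reports only the resulting rates (4.825 before and 1.533 after the intervention), not the underlying counts.

```typescript
// CLABSI rate = (number of CLABSIs / central-line days) x 1,000.
// The counts below are hypothetical and chosen only to illustrate the arithmetic.
function clabsiRatePer1000(infections: number, centralLineDays: number): number {
  return (infections / centralLineDays) * 1000;
}

console.log(clabsiRatePer1000(4, 830).toFixed(1)); // "4.8" infections per 1,000 CVC days
```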


2021 ◽  
Vol 6 (9) ◽  
pp. e006780
Author(s):  
Ilana Seff ◽  
Luissa Vahedi ◽  
Samantha McNelly ◽  
Elfriede Kormawa ◽  
Lindsay Stark

Although programmes and policies targeting violence against women and girls (VAWG) have increased in the past decade, there is a paucity of evidence on the effectiveness of these interventions. To expand this evidence base, researchers increasingly employ remote data collection (RDC)—including online surveys, mobile applications and telephone interviews—in their evaluations. Although RDC allows evaluations to proceed without in-person interactions—which are restricted during crises such as the COVID-19 pandemic—more information about these methods is needed to understand their potential usefulness and limitations. This scoping review examines remote evaluations of VAWG interventions to describe the landscape of RDC methods, reflect on safety and ethical considerations, and offer best practices for RDC in VAWG research. Fourteen studies met the eligibility criteria, with seven, five, and two studies employing telephone interviews, online surveys, and mobile applications, respectively. Studies commonly stated that participants were asked to use a safe email account or device, but the method for verifying such safety was rarely specified. Best practices around safety included creating a ‘quick escape’ button that participants could use during online data collection when another individual was present, explaining to participants how to erase browsing history and application purchases, and asking participants to specify a safe time for researchers to call. Only eight studies established referral pathways for respondents, as per best practice. None of the eligible studies took place in low- and middle-income countries (LMICs) or humanitarian settings, likely reflecting the additional challenges of using RDC methods in lower resource settings. Findings were used to create a best practice checklist for programme evaluators and Institutional Review Boards using RDC for VAWG interventions. The authors found that opportunities exist for researchers to safely and effectively use RDC methodologies to gather VAWG data, but that further study is needed to gauge the feasibility of these methods in LMICs and humanitarian settings.
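One of the safety practices described above, the ‘quick escape’ button, is straightforward to add to an online survey page. The sketch below is a minimal browser-side example under stated assumptions: it assumes the page contains a button with the id "quick-escape", and the neutral destination URL is illustrative rather than taken from the reviewed studies.

```typescript
// Minimal "quick escape" for an online survey page. Navigating with
// location.replace() swaps out the current history entry, so the browser's
// Back button does not return to the survey. Destination URL is illustrative.
const NEUTRAL_SITE = "https://www.weather.com/";

function quickEscape(): void {
  window.location.replace(NEUTRAL_SITE);
}

// Assumes the survey page contains <button id="quick-escape">Quick escape</button>.
document.getElementById("quick-escape")?.addEventListener("click", quickEscape);

// Also trigger on a double press of the Escape key within half a second.
let lastEscapePress = 0;
document.addEventListener("keydown", (event: KeyboardEvent) => {
  if (event.key === "Escape") {
    const now = Date.now();
    if (now - lastEscapePress < 500) quickEscape();
    lastEscapePress = now;
  }
});
```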


2021 ◽  
pp. 000313482110111
Author(s):  
Nicholas J. Iglesias ◽  
Taylor P. Williams ◽  
Clifford L. Snyder ◽  
Christian Sommerhalder ◽  
Alexander Perez

Background Central line-associated bloodstream infections (CLABSIs) are preventable complications that pose a significant health risk to patients and place a financial burden on hospitals. Central line simulation-based education (SBE) efforts vary widely in the literature. The aim of this study was to perform a value analysis of published central line SBE and to develop a refined method of studying central line SBE. Methods A database search of PubMed Central and the Cumulative Index to Nursing and Allied Health Literature (CINAHL) was performed for articles mentioning “Cost and CLABSI,” “Cost and Central line Associated Bloodstream Infections,” and “Cost and Central Line” in their abstract and article body. Articles chosen for qualitative synthesis mentioned “simulation” in their abstract and article body and were analyzed on the following criteria: infection rate before vs. after SBE, cost of simulation, SBE design including the simulator model used, and learner analysis. Results Of 215 articles identified, 23 were analyzed: 10 (43.48%) discussed the cost of central line simulation with varying criteria for cost reporting, 8 (34.8%) numerically reported central line complication rates (7 CLABSI and 1 pneumothorax), and only 3 (13%) discussed both. Only 1 addressed the true cost of simulation (including space rental, equipment startup costs, and faculty salary) and its longitudinal effect on CLABSIs. Conclusion The current literature on central line SBE lacks value propositions. Given the lack of value-based data in this area, the authors propose a cost reporting standard for use by future studies reporting central line SBE costs.
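To make the idea of a cost reporting standard concrete, the sketch below shows one way such a report could be structured, using the cost components named in the abstract (space rental, equipment startup, and faculty salary). The field and function names are illustrative assumptions, not the authors' published standard.

```typescript
// Illustrative shape for a standardized central line SBE cost report.
// Fields mirror the cost components named in the abstract; names are assumptions.
interface CentralLineSbeCostReport {
  spaceRentalUsd: number;        // simulation space rental or allocated cost
  equipmentStartupUsd: number;   // simulator models, task trainers, consumables
  facultySalaryUsd: number;      // instructor time valued at salary rates
  learnersTrained: number;
  clabsiRateBefore: number;      // infections per 1,000 catheter-days before SBE
  clabsiRateAfter: number;       // infections per 1,000 catheter-days after SBE
}

// One simple value metric derivable from such a report: total cost per learner.
function costPerLearner(report: CentralLineSbeCostReport): number {
  const total = report.spaceRentalUsd + report.equipmentStartupUsd + report.facultySalaryUsd;
  return total / report.learnersTrained;
}
```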


Symmetry ◽  
2020 ◽  
Vol 13 (1) ◽  
pp. 9
Author(s):  
John H. Graham

Best practices in studies of developmental instability, as measured by fluctuating asymmetry, have developed over the past 60 years. Unfortunately, they are haphazardly applied in many of the papers submitted for review. Most often, research designs suffer from lack of randomization, inadequate replication, poor attention to size scaling, lack of attention to measurement error, and unrecognized mixtures of additive and multiplicative errors. Here, I summarize a set of best practices, especially in studies that examine the effects of environmental stress on fluctuating asymmetry.
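For readers new to the field, the sketch below computes two commonly used fluctuating-asymmetry summaries: the mean unsigned right-left difference and a size-scaled variant that divides each difference by the trait's mean size. It is a minimal illustration only, and the names are mine; the best practices discussed here (randomization, replicate measurements to estimate measurement error, and handling of additive versus multiplicative error) require a fuller design than this.

```typescript
// Mean unsigned asymmetry: mean of |R - L| across individuals.
function meanAbsAsymmetry(right: number[], left: number[]): number {
  const diffs = right.map((r, i) => Math.abs(r - left[i]));
  return diffs.reduce((sum, d) => sum + d, 0) / diffs.length;
}

// Size-scaled asymmetry: mean of |R - L| / ((R + L) / 2), reducing size dependence.
function sizeScaledAsymmetry(right: number[], left: number[]): number {
  const scaled = right.map((r, i) => Math.abs(r - left[i]) / ((r + left[i]) / 2));
  return scaled.reduce((sum, s) => sum + s, 0) / scaled.length;
}
```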


2021 ◽  
Vol 10 (Supplement_1) ◽  
pp. S16-S16
Author(s):  
Jennia J Acebo ◽  
María Costta ◽  
Gisella Sánchez ◽  
Erika Villanueva ◽  
Erika Montalvo E ◽  
...  

Abstract Introduction Pediatric cancer patients require central lines for the treatments they receive. Subcutaneous central ports (SCs) and peripherally inserted central catheters (PICCs) are the most frequently used lines. PICCs have gained popularity due to their ease of insertion, which can be invaluable in the pediatric oncology setting for administration of intravenous therapy, parenteral nutrition, and/or blood products. Because central line-associated bloodstream infections increase the morbidity and mortality of cancer patients, as well as the cost of their treatment, active surveillance of these healthcare-associated infections is warranted. Methods This is a retrospective descriptive study of pediatric patients treated via PICCs at the Hospital SOLCA Núcleo Quito between 2009 and 2019. Results During the study period, 70 PICC lines were placed in 66 patients, totaling 1862 catheter-days. The majority of patients (75.7%) were diagnosed with leukemia or lymphoma. Beginning in 2011, all PICCs were placed in the operating room by a surgeon. Ultrasound guidance was used for 39 insertions. Inadequate peripheral venous access was the most common indication for placement (64.2%). Twenty-nine PICCs had complications, of which 13 were infectious and 16 were noninfectious. The most common infectious complication was PICC-related bloodstream infection (13), and the most frequent noninfectious complication was occlusion (10). The overall complication rate was 15.5 per 1000 catheter-days, and the infectious complication rate was 6.9 per 1000 catheter-days. Annual complication rates fluctuated over the study period: the PICC-associated infection rate per 1000 catheter-days was 13.1 in 2009, 12.4 in 2010, 5.0 in 2011, 7.9 in 2012, 0 in 2013, 13.4 in 2014, 4.8 in 2015, 16.2 in 2016, 8.2 in 2017, and 4.3 in 2018. Conclusion Complications related to PICCs in pediatric patients at a tertiary care oncology hospital fluctuated over the years. Our findings indicate the need for further efforts in staff education and training in the insertion, care, and maintenance of PICC lines. Best practice guidelines are also critical to reducing complications, especially occlusion and infection rates, and thereby improving patient outcomes.


2021 ◽  
pp. 112972982110008
Author(s):  
Patrick Kennedy ◽  
Darren Klass ◽  
John Chung

Transradial access is a safe approach for visceral endovascular interventions, with lower complication rates compared to transfemoral access. This report describes an unusual case of ulnar artery thrombosis following splenic artery aneurysm embolization via left transradial approach, resulting in non-target digital ischemia and eventual amputation of the ring and little finger distal phalanges. Technical considerations to reduce the incidence of access complications are also reviewed, along with practice modifications undertaken at our institution following this case to improve outcomes.


SAGE Open ◽  
2016 ◽  
Vol 6 (4) ◽  
pp. 215824401667774 ◽  
Author(s):  
Benjamin Woodward ◽  
Reba Umberger

Central line-associated bloodstream infections (CLABSI) are a very common source of healthcare-associated infection (HAI). The incidence of CLABSI has been significantly reduced through the efforts of nurses, healthcare providers, and infection preventionists. Extrinsic factors, such as recently enacted legislation and mandatory reporting, have not been closely examined in relation to changes in rates of HAI. The following review examines evidence-based practices related to CLABSI and how they are reported, as well as how the Affordable Care Act, mandatory reporting, and pay-for-performance programs have affected these best practices for CLABSI prevention. There is a disconnect in the methods and guidelines for reporting CLABSI among these programs, specifically between local monitoring agencies and the various federal oversight organizations. Future research should focus on addressing the gap in how a CLABSI is defined and on whether programs that incentivize hospitals to reduce CLABSI rates are effective.

