Navigating the Evidence-Based Practice Maze

2012 ◽  
Vol 48 (3) ◽  
pp. 159-166 ◽  
Author(s):  
Valerie L. Mazzotti ◽  
Dawn R. Rowe ◽  
David W. Test

Factors such as the standards-based education movement, mandated participation in statewide testing, and inclusion have forced an increased focus on improving outcomes for students with disabilities. There are many determinants of postschool success for students with disabilities; however, teachers primarily have control over only one: teaching practices and programs. As a result, it is important that teachers choose and implement practices that have proven successful for secondary students with disabilities. This article guides teachers through the process of navigating the evidence-based practice maze to identify evidence-based practices and programs for secondary students with disabilities. Specifically, it addresses the need to (a) follow a research-based framework (i.e., Kohler’s Taxonomy), (b) use practices with the best available research evidence to support effectiveness, and (c) use data-based decision making to guide use of evidence-based practices.

2009 ◽  
Vol 89 (9) ◽  
pp. 918-933 ◽  
Author(s):  
Joe Schreiber ◽  
Perri Stern ◽  
Gregory Marchetti ◽  
Ingrid Provident

Background The physical therapy profession has been perceived as one that bases its practice largely on anecdotal evidence and that uses treatment techniques for which there is little scientific support. Physical therapists have been urged to increase evidence-based practice behaviors as a means to address this perception and to enhance the translation of knowledge from research evidence into clinical practice. However, little attention has been paid to the best ways in which to support clinicians’ efforts toward improving evidence-based practice. Objectives The purpose of this study was to identify, implement, and evaluate the effectiveness of strategies aimed at enhancing the ability of 5 pediatric physical therapists to integrate scientific research evidence into clinical decision making. Design This study was a formative evaluation pilot project. Methods The participants in this study collaborated with the first author to identify and implement strategies and outcomes aimed at enhancing their ability to use research evidence during clinical decision making. Outcome data were analyzed with qualitative methods. Results The participants were able to implement several, but not all, of the strategies and made modest self-reported improvements in evidence-based practice behaviors, such as reading journal articles and completing database searches. They identified several barriers, including a lack of time, other influences on clinical decision making, and a lack of incentives for evidence-based practice activities. Conclusions The pediatric physical therapists who took part in this project had positive attitudes toward evidence-based practice and made modest improvements in this area. It is critical for the profession to continue to investigate optimal strategies to aid practicing clinicians in applying research evidence to clinical decision making.


10.2196/17718 ◽  
2020 ◽  
Vol 22 (8) ◽  
pp. e17718
Author(s):  
Monika Jurkeviciute ◽  
Henrik Eriksson

Background Evidence-based practice refers to building clinical decisions on credible research evidence, professional experience, and patient preferences. However, there is a growing concern that evidence in the context of electronic health (eHealth) is not sufficiently used when forming policies and practice of health care. In this context, using evaluation and research evidence in clinical or policy decisions dominates the discourse. However, the use of additional types of evidence, such as professional experience, is underexplored. Moreover, there might be other ways of using evidence than in clinical or policy decisions. Objective This study aimed to analyze how different types of evidence (such as evaluation outcomes [including patient preferences], professional experiences, and existing scientific evidence from other research) obtained within the development and evaluation of an eHealth trial are used by diverse stakeholders. An additional aim was to identify barriers to the use of evidence and ways to support its use. Methods This study was built on a case of an eHealth trial funded by the European Union. The project included 4 care centers, 2 research and development companies that provided the web-based physical exercise program and an activity monitoring device, and 2 science institutions. The qualitative data collection included 9 semistructured interviews conducted 8 months after the evaluation was concluded. The data analysis concerned (1) activities and decisions that were made based on evidence after the project ended, (2) evidence used for those activities and decisions, (3) in what way the evidence was used, and (4) barriers to the use of evidence. Results Evidence generated from eHealth trials can be used by various stakeholders for decisions regarding clinical integration of eHealth solutions, policy making, scientific publishing, research funding applications, eHealth technology, and teaching. Evaluation evidence has less value than professional experiences to local decision making regarding eHealth integration into clinical practice. Professional experiences constitute the evidence that is valuable to the highest variety of activities and decisions in relation to eHealth trials. When using existing scientific evidence related to eHealth trials, it is important to consider contextual relevance, such as location or disease. To support the use of evidence, it is suggested to create possibilities for health care professionals to gain experience, assess a few rather than a large number of variables, and design for shorter iterative cycles of evaluation. Conclusions Initiatives to support and standardize evidence-based practice in the context of eHealth should consider the complexities in how the evidence is used in order to achieve better uptake of evidence in practice. However, one should be aware that the assumption of fact-based decision making in organizations is misleading. In order to create better chances that the evidence produced would be used, this should be addressed through the design of eHealth trials.


2020 ◽  
pp. 030802262094139
Author(s):  
Helen Jeffery ◽  
Linda Robertson ◽  
Kim L Reay

Introduction Evidence-based practice skills and habits begin during undergraduate education and continue through professional life. It is important that novices learn, during their education programme, the skills required in practice. This study explores strategies experienced occupational therapy supervisors use to encourage novices to be evidence based, and how these might be enhanced. Method Qualitative descriptive methodology was used to explore the views and experiences of 15 experienced supervisors from a range of practice areas and geographical locations, interviewed in four focus groups. Results Evidence-based practice is an element of professional reasoning, not isolated from client-centred practice or from reflective practice. Five sources of evidence to inform competence in professional decision-making were identified: (a) research evidence from literature; (b) local environment, resources and culture; (c) client’s expertise and perspective; (d) expertise of others; and (e) practitioners’ own knowledge and experience. Conclusion Intentional use of all five sources of evidence to inform professional decision-making contributes to habits of evidence-based thinking and practice. Experienced therapists and educators can support evidence-based practice in novices by prompting questioning and developing systems that support scanning for evidence in each area. Collaboration in this endeavour will enhance integration of academic and practice education.


2020 ◽  
Vol 29 (2) ◽  
pp. 688-704
Author(s):  
Katrina Fulcher-Rood ◽  
Anny Castilla-Earls ◽  
Jeff Higginbotham

Purpose The current investigation is a follow-up to a previous study examining child language diagnostic decision making in school-based speech-language pathologists (SLPs). The purpose of this study was to examine the SLPs' perspectives regarding the use of evidence-based practice (EBP) in their clinical work. Method Semistructured phone interviews were conducted with 25 school-based SLPs who previously participated in an earlier study (Fulcher-Rood et al., 2018). SLPs were asked questions regarding their definition of EBP, the value of research evidence, contexts in which they implement scientific literature in clinical practice, and the barriers to implementing EBP. Results SLPs' definitions of EBP differed from current definitions in that SLPs included only the use of research findings. SLPs tended to discuss EBP as it relates to treatment and not assessment. Reported barriers to EBP implementation were insufficient time, limited funding, and restrictions from their employment setting. SLPs found it difficult to translate research findings to clinical practice. SLPs implemented external research evidence when they did not have enough clinical expertise regarding a specific client or when they needed scientific evidence to support a strategy they used. Conclusions SLPs appear to use EBP for specific reasons and not for every clinical decision they make. In addition, SLPs rely on EBP for treatment decisions and not for assessment decisions. Educational systems potentially present other challenges that need to be considered for EBP implementation. Considerations for implementation science and the research-to-practice gap are discussed.


2011 ◽  
Vol 20 (4) ◽  
pp. 121-123
Author(s):  
Jeri A. Logemann

Evidence-based practice requires astute clinicians to blend our best clinical judgment with the best available external evidence and the patient's own values and expectations. Sometimes, we value one more than another during clinical decision-making, though it is never wise to do so, and sometimes other factors that we are unaware of produce unanticipated clinical outcomes. Sometimes, we feel very strongly about one clinical method or another, and hopefully that belief is founded in evidence. Some beliefs, however, are not founded in evidence. The sound use of evidence is the best way to navigate the debates within our field of practice.


Author(s):  
Robyn Swanson

This chapter addresses the use of evidence-based practices (EBPs) by special education practitioners in instruction and assessment while providing music educators guidance toward implementing these practices in instruction and assessment for students with autism spectrum disorder (ASD) within universal design for learning (UDL) inclusive classrooms. Included are behavioral characteristics of students with ASD that music educators need to be cognizant of in inclusive settings; federal education laws and policies that have provided students with disabilities rights to a quality education; and selected special education EBPs and accommodations deemed viable interventions for teaching and assessing PreK-12 standards-based music curriculum for students with ASD. Music educators may find that PreK-12 music assessments aligned to appropriate EBPs and accommodations for students with ASD are beneficial resources when designing and implementing curriculum, instruction, and assessment linked to the 2014 National Core Arts (Music) Standards (NCAS) and their supporting Model Cornerstone Assessments (MCAs).


2016 ◽  
Vol 30 (2) ◽  
pp. 23-31 ◽  
Author(s):  
Sandra L. Kaplan ◽  
Julie K. Tilson ◽  
David Levine ◽  
Steven Z. George ◽  
Deanne Fay ◽  
...  

2017 ◽  
Vol 66 (1) ◽  
pp. 389-405 ◽  
Author(s):  
Karen Eppley ◽  
Patrick Shannon

We have two goals for this article: to question the efficacy of evidence-based practice as the foundation of reading education policy and to propose practice-based evidence as a viable, more socially just alternative. In order to reach these goals, we describe the limits of reading policies of the last half century and argue for the possibilities of policies aimed at more equitable distribution of academic literacies among all social groups, recognition of subaltern groups’ literacies, and representation of the local in regional and global decision making.


2021 ◽  
Vol 11 (3) ◽  
pp. 129
Author(s):  
Gabrielle Wilcox ◽  
Cristina Fernandez Conde ◽  
Amy Kowbel

There are longstanding calls for inclusive education for all, regardless of student need or teacher capacity to meet those needs. Unfortunately, there is little empirical data to support full inclusion for all students and even less information on the role of data-based decision making in inclusive education specifically, even though there is extensive research on the effectiveness of data-based decision making. In this article, we review what data-based decision making is and its role in education, the current state of evidence related to inclusive education, and how data-based decision making can be used to support decisions for students with reading disabilities and for those with intellectual disabilities transitioning to adulthood. What is known about evidence-based practices for supporting reading and transition is reviewed in relation to the realities of implementing these practices in inclusive education settings. Finally, implications for using data-based decisions in inclusive settings are discussed.

