Sources of evidence for professional decision-making in novice occupational therapy practitioners: clinicians’ perspectives

2020
pp. 030802262094139
Author(s): Helen Jeffery, Linda Robertson, Kim L Reay

Introduction Evidence-based practice skills and habits begin during undergraduate education and continue through professional life. It is important that novices learn, during their education programme, the skills required in practice. This study explores strategies that experienced occupational therapy supervisors use to encourage novices to be evidence based, and how these strategies might be enhanced. Method Qualitative descriptive methodology was used to explore the views and experiences of 15 experienced supervisors from a range of practice areas and geographical locations, interviewed in four focus groups. Results Evidence-based practice is an element of professional reasoning, not isolated from client-centred practice or from reflective practice. Five sources of evidence to inform competence in professional decision-making were identified: (a) research evidence from literature; (b) local environment, resources and culture; (c) the client’s expertise and perspective; (d) the expertise of others; and (e) practitioners’ own knowledge and experience. Conclusion Intentional use of all five sources of evidence to inform professional decision-making contributes to habits of evidence-based thinking and practice. Experienced therapists and educators can support evidence-based practice in novices by prompting questioning and by developing systems that support scanning for evidence in each area. Collaboration in this endeavour will enhance the integration of academic and practice education.

1997
Vol 60 (11)
pp. 479-483
Author(s): Katrina Bannigan

Evidence-based health care can be defined as an approach to health care that involves finding and using up-to-date research into the effectiveness of health care interventions to inform decision making (Entwistle et al, 1996). For many occupational therapists, the practicalities of keeping up to date with the best research evidence are difficult; however, through the National Health Service Centre for Reviews and Dissemination (NHS CRD), the NHS Research and Development (R&D) Programme is aiming to improve the availability of high quality research evidence to all health care professionals. The NHS CRD carries out and commissions systematic reviews. Systematic reviews are a means of pulling together large quantities of research information and are considered to be one of the most reliable sources of information about effectiveness (Chalmers and Altman, 1995). The NHS CRD also disseminates the findings of systematic reviews, one method of which is through the Database of Abstracts of Reviews of Effectiveness (DARE). The relevance of systematic reviews to the clinical practice of occupational therapists is explored in this paper using two examples: a poor-quality and a high-quality systematic review identified from the abstracting process for DARE. Both reviews are directly relevant to occupational therapy, being about sensory integration and falls in elderly people, respectively. The implications of these reviews for evidence-based practice in occupational therapy are discussed.


2009
Vol 89 (9)
pp. 918-933
Author(s): Joe Schreiber, Perri Stern, Gregory Marchetti, Ingrid Provident

Background The physical therapy profession has been perceived as one that bases its practice largely on anecdotal evidence and that uses treatment techniques for which there is little scientific support. Physical therapists have been urged to increase evidence-based practice behaviors as a means to address this perception and to enhance the translation of knowledge from research evidence into clinical practice. However, little attention has been paid to the best ways in which to support clinicians’ efforts toward improving evidence-based practice. Objectives The purpose of this study was to identify, implement, and evaluate the effectiveness of strategies aimed at enhancing the ability of 5 pediatric physical therapists to integrate scientific research evidence into clinical decision making. Design This study was a formative evaluation pilot project. Methods The participants in this study collaborated with the first author to identify and implement strategies and outcomes aimed at enhancing their ability to use research evidence during clinical decision making. Outcome data were analyzed with qualitative methods. Results The participants were able to implement several, but not all, of the strategies and made modest self-reported improvements in evidence-based practice behaviors, such as reading journal articles and completing database searches. They identified several barriers, including a lack of time, other influences on clinical decision making, and a lack of incentives for evidence-based practice activities. Conclusions The pediatric physical therapists who took part in this project had positive attitudes toward evidence-based practice and made modest improvements in this area. It is critical for the profession to continue to investigate optimal strategies to aid practicing clinicians in applying research evidence to clinical decision making.


2014
Vol 1 (1)
pp. 13-19
Author(s): George S. Tomlin, Deborah Dougherty

Abstract Contemporary conditions require health professionals both to employ published evidence in their individual practices and as a profession to produce valid evidence of their outcome effectiveness. Heretofore, these two processes of evidence-based practice have often been confounded as one. This theoretical paper separates the two processes into “Evidence-Supported Practice” and “Evidence-Informed Practice.” Each requires a different approach to evidence accumulation and use. Nonetheless, the two processes can and should be interlinked. For external (research) evidence, the research pyramid model values equally the internal and external validity of studies, as both are important for the implementation of external evidence. Furthermore, external evidence must be combined with internal evidence (data generated in the course of interaction with a client) in the decision-making of practitioners. Examples from recent research on occupational therapy practice and literature from several other health professions are cited for illustration. This paper formulates a more comprehensive model for evidence-based practice. From this model follow specific recommendations for practitioners, researchers, and educators in the health professions.


2012
Vol 48 (3)
pp. 159-166
Author(s): Valerie L. Mazzotti, Dawn R. Rowe, David W. Test

Factors such as the standards-based education movement, mandated participation in statewide testing, and inclusion have forced an increased focus on improving outcomes for students with disabilities. There are many determinants of postschool success for students with disabilities; however, teachers primarily have control over only one: teaching practices and programs. As a result, it is important that teachers choose and implement practices that have proven successful for secondary students with disabilities. This article guides teachers through the process of navigating the evidence-based practice maze to identify evidence-based practices and programs for secondary students with disabilities. Specifically, it addresses the need to (a) follow a research-based framework (i.e., Kohler’s Taxonomy), (b) use practices with the best available research evidence to support effectiveness, and (c) use data-based decision making to guide the use of evidence-based practices.


10.2196/17718
2020
Vol 22 (8)
pp. e17718
Author(s): Monika Jurkeviciute, Henrik Eriksson

Background Evidence-based practice refers to building clinical decisions on credible research evidence, professional experience, and patient preferences. However, there is a growing concern that evidence in the context of electronic health (eHealth) is not sufficiently used when forming policies and practice of health care. In this context, using evaluation and research evidence in clinical or policy decisions dominates the discourse. However, the use of additional types of evidence, such as professional experience, is underexplored. Moreover, there might be other ways of using evidence than in clinical or policy decisions. Objective This study aimed to analyze how different types of evidence (such as evaluation outcomes [including patient preferences], professional experiences, and existing scientific evidence from other research) obtained within the development and evaluation of an eHealth trial are used by diverse stakeholders. An additional aim was to identify barriers to the use of evidence and ways to support its use. Methods This study was built on a case of an eHealth trial funded by the European Union. The project included 4 care centers, 2 research and development companies that provided the web-based physical exercise program and an activity monitoring device, and 2 science institutions. The qualitative data collection included 9 semistructured interviews conducted 8 months after the evaluation was concluded. The data analysis concerned (1) activities and decisions that were made based on evidence after the project ended, (2) evidence used for those activities and decisions, (3) in what way the evidence was used, and (4) barriers to the use of evidence. Results Evidence generated from eHealth trials can be used by various stakeholders for decisions regarding clinical integration of eHealth solutions, policy making, scientific publishing, research funding applications, eHealth technology, and teaching. Evaluation evidence has less value than professional experience for local decision making regarding eHealth integration into clinical practice. Professional experience constitutes the evidence that is valuable across the widest variety of activities and decisions in relation to eHealth trials. When using existing scientific evidence related to eHealth trials, it is important to consider contextual relevance, such as location or disease. To support the use of evidence, it is suggested to create possibilities for health care professionals to gain experience, to assess a few rather than a large number of variables, and to design for shorter iterative cycles of evaluation. Conclusions Initiatives to support and standardize evidence-based practice in the context of eHealth should consider the complexities in how evidence is used in order to achieve better uptake of evidence in practice. However, one should be aware that the assumption of fact-based decision making in organizations is misleading. To increase the chances that the evidence produced will be used, this should be addressed in the design of eHealth trials.


2020
Vol 29 (2)
pp. 688-704
Author(s): Katrina Fulcher-Rood, Anny Castilla-Earls, Jeff Higginbotham

Purpose The current investigation is a follow-up from a previous study examining child language diagnostic decision making in school-based speech-language pathologists (SLPs). The purpose of this study was to examine the SLPs’ perspectives regarding the use of evidence-based practice (EBP) in their clinical work. Method Semistructured phone interviews were conducted with 25 school-based SLPs who previously participated in an earlier study by Fulcher-Rood et al. (2018). SLPs were asked questions regarding their definition of EBP, the value of research evidence, contexts in which they implement scientific literature in clinical practice, and the barriers to implementing EBP. Results SLPs’ definitions of EBP differed from current definitions in that SLPs included only the use of research findings. SLPs discussed EBP as it relates to treatment rather than assessment. Reported barriers to EBP implementation were insufficient time, limited funding, and restrictions from their employment setting. SLPs found it difficult to translate research findings to clinical practice. SLPs implemented external research evidence when they did not have enough clinical expertise regarding a specific client or when they needed scientific evidence to support a strategy they used. Conclusions SLPs appear to use EBP for specific reasons and not for every clinical decision they make. In addition, SLPs rely on EBP for treatment decisions rather than assessment decisions. Educational systems potentially present other challenges that need to be considered for EBP implementation. Considerations for implementation science and the research-to-practice gap are discussed.


2011
Vol 20 (4)
pp. 121-123
Author(s): Jeri A. Logemann

Evidence-based practice requires astute clinicians to blend their best clinical judgment with the best available external evidence and the patient’s own values and expectations. Sometimes we value one of these more than another during clinical decision-making, though it is never wise to do so, and sometimes factors that we are unaware of produce unanticipated clinical outcomes. Sometimes we feel very strongly about one clinical method or another, and hopefully that belief is founded in evidence; some beliefs, however, are not. The sound use of evidence is the best way to navigate the debates within our field of practice.

