Evidence-Based Educational Practice

Author(s):  
Tone Kvernbekk

Evidence-based practice (EBP) is a buzzword in contemporary professional debates, for example, in education, medicine, psychiatry, and social policy. It is known as the “what works” agenda, and its focus is on the use of the best available evidence to bring about desirable results or prevent undesirable ones. We immediately see here that EBP is practical in nature, that evidence is thought to play a central role, and also that EBP is deeply causal: we intervene into an already existing practice in order to produce an output or to improve the output. If our intervention brings the results we want, we say that it “works.” How should we understand the causal nature of EBP? Causality is a highly contentious issue in education, and many writers want to banish it altogether. But causation denotes a dynamic relation between factors and is indispensable if one wants to be able to plan the attainment of goals and results. A nuanced and reasonable understanding of causality is therefore necessary to EBP, and this we find in the INUS-condition approach. The nature and function of evidence is much discussed. The evidence in question is supplied by research, as a response to both political and practical demands that educational research should contribute to practice. In general, evidence speaks to the truth value of claims. In the case of EBP, the evidence emanates from randomized controlled trials (RCTs) and presumably speaks to the truth value of claims such as “if we do X, it will lead to result Y.” But what does research evidence really tell us? It is argued here that a positive RCT result will tell you that X worked where the RCT was conducted and that an RCT does not yield general results. Causality and evidence come together in the practitioner perspective. Here we shift from finding causes to using them to bring about desirable results. This puts contextual matters at center stage: will X work in this particular context? It is argued that much heterogeneous contextual evidence is required to make X relevant for new contexts. If EBP is to be a success, research evidence and contextual evidence must be brought together.
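A minimal logical sketch of the INUS-condition approach referred to above (Mackie's account of a cause as an insufficient but necessary part of an unnecessary but sufficient condition) may help fix ideas; the factors A, B, C, and X below are illustrative placeholders, not drawn from the article:

% Sketch of an INUS condition in propositional form (A, B, C, X are illustrative placeholders).
% The outcome Y is produced whenever either of two sufficient conjunctions holds:
\[
  (A \land X) \lor (B \land C) \;\rightarrow\; Y
\]
% X alone is insufficient for Y, but it is a necessary part of the conjunction (A \land X);
% that conjunction is sufficient for Y, yet unnecessary, since (B \land C) also yields Y.
% On this reading, an intervention X "works" only together with supporting factors such as A,
% which is why the abstract insists that contextual evidence matters for EBP.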

2021
Vol 15
Author(s):
Philip M. Newton
Hannah Farukh Najabat-Lattif
Gabriella Santiago
Atharva Salvi

Learning Styles theory promises improved academic performance based on the identification of a personal, sensory preference for informational processing. This promise is not supported by evidence and is in contrast to our current understanding of the neuroscience of learning. Despite this lack of evidence, prior research shows that belief in the Learning Styles “neuromyth” remains high amongst educators of all levels, around the world. This perspective article is a follow-up to prior research aimed at understanding why belief in the neuromyth of Learning Styles remains so high. We evaluated current research papers from the field of health professions education to characterize the perspective that an educator would be given, should they search for evidence on Learning Styles. As in earlier research on Higher Education, we found that the use of Learning Style frameworks persists in education research for the health professions; 91% of 112 recent research papers published on Learning Styles are based upon the premise that Learning Styles are a useful approach to education. This is in sharp contrast to the fundamental principle of evidence-based practice within these professions. Thus, any educator who sought out the research evidence on Learning Styles would be given a consistent but inaccurate endorsement of the value of a teaching technique that is not evidence-based, possibly then propagating the belief in Learning Styles. Here we offer perspectives from both researchers and students on this apparent mismatch between educational practice and clinical practice, along with recommendations and considerations for the future.


Author(s):  
Ron LeFebvre
David Peterson
Mitchell Haas

Evidence-based practice has had a growing impact on chiropractic education and the delivery of chiropractic care. For evidence-based practice to penetrate and transform a profession, the penetration must occur at 2 levels. One level is the degree to which individual practitioners possess the willingness and basic skills to search and assess the literature. Chiropractic education received a significant boost in this realm in 2005 when the National Center for Complementary and Alternative Medicine awarded 4 chiropractic institutions R25 education grants to strengthen their research/evidence-based practice curricula. The second level relates to whether the therapeutic interventions commonly employed by a particular health care discipline are supported by clinical research. A growing body of randomized controlled trials provides evidence of the effectiveness and safety of manual therapies.


2021
pp. 198-200
Author(s):
Sharanya Bose
Subhapriya Mandal
Ravi Prakash B S
Himadri Chakrabarty
Abhijit Chakraborty

Research in the eld of periodontology has observed a huge upheaval in the last two decades unveiling newer alterations in techniques, methodologies, and material science. The recent centre of attention in periodontal research is an evidence-based approach which offers a bridge from science to clinical practice. Research inculcates scientic and inductive thinking and it promotes the development of logical habits of thinking and organization. In terms of research methodology, the article aim to inform the reader on topics relating to randomized controlled trials in periodontal research, evidence-based dentistry, calibration of clinical examiners and statistics relevant to periodontal research.


Author(s):  
John C. Norcross
Thomas P. Hogan
Gerald P. Koocher
Lauren A. Maggio

Moving research evidence from science to service, from the lab bench to the bedside, poses a challenge for evidence-based practices (EBPs). Translation(al) research inclusively refers to the process of successfully moving research-supported discoveries into established practice and policy. This chapter begins with synopses of the empirical research on predicting adoption of EBP and the barriers to its implementation. The chapter then reviews effective methods for disseminating, teaching, and implementing EBPs. Like EBP itself, the new field of implementation science sensitively integrates the best research evidence, clinical expertise, and staff characteristics and preferences into deciding what works in each unique healthcare system.


2020
Vol 5
Author(s):
Philip M. Newton
Ana Da Silva
Sam Berry

Arguments for and against the idea of evidence-based education have occupied the academic literature for decades. Those arguing in favor plead for greater rigor and clarity to determine “what works.” Those arguing against protest that education is a complex, social endeavor and that for epistemological, theoretical and political reasons it is not possible to state, with any useful degree of generalizable certainty, “what works.” While academics argue, policy and practice in Higher Education are beset with problems. Ineffective methods such as “Learning Styles” persist. Teaching quality and teacher performance are measured using subjective and potentially biased feedback. University educators have limited access to professional development, particularly for practical teaching skills. There is a huge volume of higher education research, but it is disconnected from educational practice. Change is needed. We propose a pragmatic model of Evidence-Based Higher Education, empowering educators and others to make judgements about the application of the most useful evidence, in a particular context, including pragmatic considerations of cost and other resources. Implications of the model include a need to emphasize pragmatic approaches to research in higher education, delivering results that are more obviously useful, and a pragmatic focus on practical teaching skills for the development of educators in Higher Education.


Author(s):  
Nick Zepke

Hotly contested debates about evidence-based educational research, policy development and practice have become a feature of the educational landscape in New Zealand, as elsewhere. Advocates argue that applying scientifically established research evidence of what works is the way to improve educational quality and student outcomes. Governments in the United States, the United Kingdom and New Zealand support the development of scientific evidence-based policy and practice that shows what works. Doubters are not questioning the importance of scientific research evidence. Indeed, it seems untenable to deny the centrality of evidence in decision-making about what works in education. Rather, sceptics and opponents question the meanings of key terms like “science”, “evidence” and “quality”. They question the politics behind evidence-based research, the assumptions made about the nature of evidence, science and research methodology, and whether research that aims to provide universal answers actually works. This article canvasses these questions. Written from a sceptical perspective, it draws on experiences from the United States, the United Kingdom and New Zealand.


2019
Vol 57 (3)
pp. 1045-1082
Author(s):
Kathryn E. Joyce
Nancy Cartwright

This article addresses the gap between what works in research and what works in practice. Currently, research in evidence-based education policy and practice focuses on randomized controlled trials. These can support causal ascriptions (“It worked”) but provide little basis for local effectiveness predictions (“It will work here”), which are what matter for practice. We argue that moving from ascription to prediction by way of causal generalization (“It works”) is unrealistic and urge focusing research efforts directly on how to build local effectiveness predictions. We outline various kinds of information that can improve predictions and encourage using methods better equipped for acquiring that information. We compare our proposal with others advocating a better mix of methods, like implementation science, improvement science, and practice-based evidence.


2020
Vol 29 (2)
pp. 688-704
Author(s):
Katrina Fulcher-Rood
Anny Castilla-Earls
Jeff Higginbotham

Purpose: The current investigation is a follow-up to a previous study examining child language diagnostic decision making by school-based speech-language pathologists (SLPs). The purpose of this study was to examine the SLPs' perspectives regarding the use of evidence-based practice (EBP) in their clinical work.
Method: Semistructured phone interviews were conducted with 25 school-based SLPs who had previously participated in an earlier study (Fulcher-Rood et al., 2018). SLPs were asked questions regarding their definition of EBP, the value of research evidence, contexts in which they implement scientific literature in clinical practice, and the barriers to implementing EBP.
Results: SLPs' definitions of EBP differed from current definitions in that SLPs included only the use of research findings. SLPs seemed to discuss EBP as it relates to treatment and not assessment. Reported barriers to EBP implementation were insufficient time, limited funding, and restrictions from their employment setting. SLPs found it difficult to translate research findings to clinical practice. SLPs implemented external research evidence when they did not have enough clinical expertise regarding a specific client or when they needed scientific evidence to support a strategy they used.
Conclusions: SLPs appear to use EBP for specific reasons and not for every clinical decision they make. In addition, SLPs rely on EBP for treatment decisions and not for assessment decisions. Educational systems potentially present other challenges that need to be considered for EBP implementation. Considerations for implementation science and the research-to-practice gap are discussed.


2008
Vol 17 (3)
pp. 110-118
Author(s):  
Joan C. Arvedson

“Food for Thought” provides an opportunity for review of pertinent topics to add to updates in areas of concern for professionals involved with feeding and swallowing issues in infants and children. Given the frequency with which speech-language pathologists (SLPs) make decisions to alter feedings when young infants demonstrate silent aspiration on videofluoroscopic swallow studies (VFSS), the need for increased understanding of cough and its development/maturation is a high priority. In addition, understanding of the role(s) of laryngeal chemoreflexes (LCRs), relationships (or lack of relationships) between cough and esophagitis, gastroesophageal reflux (GER), and chronic salivary aspiration is critical. Decision making regarding management must take into account multiple systems and their interactions in order to provide safe feeding for all children to meet nutrition and hydration needs without being at risk for pulmonary problems. The responsibility is huge and should encourage all to search the literature so that clinical practice is as evidence-based as possible; this often requires adequate understanding of developmentally appropriate neurophysiology and function.

