The Editor’s Note: The beginning of a beautiful friendship?

2021 ◽  
Vol 102 (7) ◽  
pp. 4-4
Author(s):  
Rafael Heller

Since the field of education research emerged, complaints have proliferated about its quality and researchers’ failure to share findings with practitioners. Federal initiatives such as the What Works Clearinghouse have sought to increase the availability of research, but many researchers have continued to be disconnected from practicing K-12 educators. Rafael Heller explains that the research-practice partnerships described in the April 2021 Kappan show promise for bridging the divide.

2021 ◽  
Vol 45 (1) ◽  
pp. 101-128
Author(s):  
Joseph A. Taylor ◽  
Elisabeth Davis ◽  
Laura E. Michaelson

In this chapter, we describe and compare the standards for evidence used by three entities that review studies of education interventions: Blueprints for Healthy Youth Development, Social Programs that Work, and the What Works Clearinghouse. Based on direct comparisons of the evidence frameworks, we identify key differences in the level at which effectiveness ratings are granted (i.e., intervention vs. outcome domain), as well as in how each entity prioritizes intervention documentation, researcher independence, and sustained versus immediate effects. Because such differences in priorities may produce contradictory intervention ratings across entities, we offer recommendations for a common set of standards that would harmonize effectiveness ratings across the three entities while preserving differences that allow for variation in user priorities. These include disentangling study rigor from intervention effectiveness, ceasing vote-counting procedures, adding replication criteria, adding fidelity criteria, assessing baseline equivalence for randomized studies, making quasi-experiments eligible for review, adding criteria for researcher independence, and providing effectiveness ratings at the level of the outcome domain rather than the intervention.


2020 ◽  
Author(s):  
Corey Peltier ◽  
Tiffany K Peltier ◽  
Taylor Werthen ◽  
Andy Heuer

Access to high-quality resources is integral for educators to provide research-aligned mathematics instruction. Identifying the supplemental resources educators use to plan mathematics instruction can inform the ways researchers and organizations disseminate research-based practices. The goal of this study was to identify the frequency with which early childhood educators (i.e., pre-kindergarten through third grade) reported using various resources to plan mathematics instruction. Furthermore, we investigated whether differences emerged based on teacher factors (i.e., general or special education, route to certification, years of experience) and locale (i.e., rural, urban, suburban). We retained data from 917 teachers for analysis. The three resources educators reported using most frequently were colleagues, Teachers Pay Teachers, and Google/Yahoo. The three least frequently reported resources were the typical outlets researchers use to reach teachers: the What Works Clearinghouse, Teaching Exceptional Children, and Teaching Children Mathematics. General and special education teachers differed in their self-reported usage of five resources: colleagues, Google/Yahoo, teaching blogs, Teaching Exceptional Children, and the What Works Clearinghouse. Rural educators reported being less likely than suburban educators to consult colleagues or district specialists when planning instruction. Implications for future research and practice are discussed.


Author(s):  
Tamara J. Moore ◽  
Aran W. Glancy ◽  
Kristina M. Tank ◽  
Jennifer A. Kersten ◽  
Karl A. Smith ◽  
...  

2010 ◽  
Vol 8 (1) ◽  
Author(s):  
Nathaniel M. Rickles ◽  
Todd A. Brown ◽  
Melissa S. McGivney ◽  
Margie E. Snyder ◽  
Kelsey A. White

2016 ◽  
Vol 45 (8) ◽  
pp. 454-459 ◽  
Author(s):  
David B. Malouf ◽  
Juliana M. Taymans

An analysis was conducted of the What Works Clearinghouse (WWC) research evidence base on the effectiveness of replicable education interventions. Most interventions were found to have little or no support from technically adequate research studies, and intervention effect sizes were of questionable magnitude for meeting education policy goals. These findings painted a dim picture of the evidence base on education interventions and indicated a need for new approaches, including a reexamination of federal reliance on experimental impact research as the basis for gauging intervention effectiveness.

