Effects of using curriculum-based measurement (CBM) for progress monitoring in reading and additive reading instruction in second-grade classes

2020 · Vol. 13(1) · pp. 151-166
Author(s): Sven Anderson, Jana Jungjohann, Markus Gebhardt
2017 · Vol. 84(1) · pp. 42-54
Author(s): Joseph Jenkins, Margaret Schulze, Allison Marti, Allen G. Harbaugh

We examined the idea that leaner schedules of progress monitoring (PM) can lighten assessment demands without undermining decision-making accuracy. Using curriculum-based measurement of reading, we compared the decision accuracy of five intermittent PM schedules with that of every-week PM. For participating students with high-incidence disabilities, all receiving special education reading instruction (N = 56), intermittent schedules of PM performed as well as every-week PM. These findings signal a need for research on the relative accuracy and timeliness of curriculum-based measurement decision making under intermittent and weekly PM.
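To make the schedule comparison concrete, here is a minimal Python sketch of the underlying idea: thin a weekly series of CBM reading scores into intermittent schedules and compare the fitted trend lines. All values below (true slope, measurement error, baseline, schedule spacings) are invented for illustration and are not the study's parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

TRUE_SLOPE = 1.5       # hypothetical growth, words correct per week
SEM = 8.0              # hypothetical standard error of measurement
weeks = np.arange(18)  # 18 weeks of every-week monitoring

# One student's observed scores: a true growth line plus probe error.
scores = 40 + TRUE_SLOPE * weeks + rng.normal(0, SEM, size=weeks.size)

def fitted_slope(w, s):
    """OLS trend line through the points a given schedule retains."""
    return np.polyfit(w, s, deg=1)[0]

# Every-week PM keeps all points; intermittent schedules keep subsets.
schedules = {
    "every week": slice(None),
    "every 2nd week": slice(None, None, 2),
    "every 3rd week": slice(None, None, 3),
}
for name, keep in schedules.items():
    print(f"{name:>14}: slope = {fitted_slope(weeks[keep], scores[keep]):+.2f}")
```

Given enough weeks of data, the thinned schedules tend to recover a similar slope to the full weekly series, which mirrors the intuition behind the finding reported above.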


2017 · Vol. 36(1) · pp. 55-73
Author(s): Theodore J. Christ, Christopher David Desjardins

Curriculum-Based Measurement of Oral Reading (CBM-R) is often used to monitor student progress and guide educational decisions. Ordinary least squares regression (OLSR) is the most widely used method to estimate the slope, or rate of improvement (ROI), even though published research demonstrates that OLSR yields ROI estimates with poor validity, reliability, and precision, especially after brief durations of monitoring (6-10 weeks). This study illustrates and examines the use of Bayesian methods to estimate ROI. Conditions included four progress-monitoring durations (6, 8, 10, and 30 weeks), two schedules of data collection (weekly, biweekly), and two ROI growth distributions that broadly corresponded with ROIs for general and special education populations. A Bayesian approach with alternate prior distributions for the ROIs is presented and explored. Results demonstrate that Bayesian estimates of ROI were more precise than OLSR estimates with comparable reliabilities, and that Bayesian estimates were consistently within the plausible range of ROIs, in contrast to OLSR, which often produced unrealistic estimates. Results also showcase the influence the priors had on estimated ROIs and the potential dangers of prior-distribution misspecification.
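The abstract's core contrast, noisy short-series OLSR slopes versus prior-informed Bayesian ROIs, can be sketched with a conjugate-normal shrinkage estimator. This is not the authors' model; the prior mean and variance, the noise level, and the data below are assumptions chosen only to show how a prior pulls an implausible OLS slope back toward a realistic range.

```python
import numpy as np

rng = np.random.default_rng(1)

# Eight weeks of hypothetical CBM-R scores: true ROI of 1 word/week plus noise.
weeks = np.arange(8.0)
scores = 35 + 1.0 * weeks + rng.normal(0, 9.0, size=weeks.size)

# OLS estimate of the ROI and its squared standard error.
X = np.column_stack([np.ones_like(weeks), weeks])
beta, *_ = np.linalg.lstsq(X, scores, rcond=None)
ols_roi = beta[1]
sigma2 = np.sum((scores - X @ beta) ** 2) / (weeks.size - 2)
se2 = sigma2 / np.sum((weeks - weeks.mean()) ** 2)

# Hypothetical normative prior on the ROI: mean 1.0, SD 0.5 (an assumption).
prior_mean, prior_var = 1.0, 0.5 ** 2

# Conjugate-normal posterior mean: a precision-weighted average that
# shrinks the noisy short-series OLS slope toward the plausible range.
bayes_roi = (prior_mean / prior_var + ols_roi / se2) / (1 / prior_var + 1 / se2)

print(f"OLS ROI:   {ols_roi:+.2f}  (SE {se2 ** 0.5:.2f})")
print(f"Bayes ROI: {bayes_roi:+.2f}")
```

When the series is short and the probe error large, the OLS slope's standard error is wide, so the posterior leans heavily on the prior; with 30 weeks of data the likelihood dominates and the two estimates converge.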


2019 · Vol. 55(2) · pp. 113-119
Author(s): Esther R. Lindström, Samantha A. Gesel, Christopher J. Lemons

Students with severe and persistent academic or behavioral challenges may benefit from data-based individualization (DBI). Starting with an evidence-based standard protocol and systematic progress monitoring, teachers can evaluate growth and implement individualized interventions to meet students' needs. Specifically, this article addresses the systematic use of student data to determine the content and pacing of intensive reading instruction. Insights from implementing this approach with struggling first-grade readers in Tier 3 of a response-to-intervention (RTI) framework are provided. Evidence-based standard protocols, strategic data collection and management, and team collaboration are crucial elements for successful implementation.


2017 · Vol. 36(1) · pp. 74-81
Author(s): John M. Hintze, Craig S. Wells, Amanda M. Marcotte, Benjamin G. Solomon

This study examined the diagnostic accuracy of decision making as it is typically conducted with curriculum-based measurement (CBM) approaches to progress monitoring. Using previously published estimates of the standard error of estimate associated with CBM, 20,000 progress-monitoring data sets were simulated to model student reading growth of two words per week across 15 consecutive weeks. Results indicated that an unacceptably high proportion of cases were falsely identified as nonresponsive to intervention when a common 4-point decision rule was applied under typical levels of probe reliability. As reliability and the stringency of the decision rule increased, such errors decreased. The findings are particularly relevant to those who use a multi-tiered response-to-intervention model to evaluate formative changes associated with instructional intervention and to judge responsiveness across multiple tiers of intervention.
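The simulation logic is simple enough to sketch. In the Python snippet below, every simulated student truly grows at the aimline rate of two words per week, so any "nonresponsive" flag from the common 4-point rule (four consecutive points below the aimline) is a false identification. The baseline and error values are placeholders, not the study's published parameters.

```python
import numpy as np

rng = np.random.default_rng(2)

N_SIM, N_WEEKS = 20_000, 15
TRUE_SLOPE = 2.0   # every student truly gains 2 words per week
SEM = 10.0         # placeholder standard error of measurement per probe
BASELINE = 30.0

weeks = np.arange(1, N_WEEKS + 1)
aimline = BASELINE + TRUE_SLOPE * weeks  # goal line matches true growth

def flagged(scores):
    """4-point rule: four consecutive observed points below the aimline."""
    run = 0
    for below in scores < aimline:
        run = run + 1 if below else 0
        if run == 4:
            return True
    return False

false_flags = sum(
    flagged(aimline + rng.normal(0, SEM, N_WEEKS)) for _ in range(N_SIM)
)
print(f"falsely identified as nonresponsive: {false_flags / N_SIM:.1%}")
```

Because each observed point falls below the aimline about half the time at this noise level, a 15-week series produces runs of four low points by chance alone quite often, which is the mechanism behind the high false-identification rate the study reports.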


2018 · Vol. 34(1) · pp. 41-51
Author(s): Francesca G. Jones, Diane Gifford, Paul Yovanoff, Stephanie Al Otaiba, Dawn Levy, ...

As part of standards-based reforms, there is increasing emphasis on ensuring that students with moderate intellectual disabilities (ID), including students with Autism Spectrum Disorders (ASD), learn to read. There is also converging evidence that explicit teaching of letter sounds, phonics, and sight words is effective for this population, but that students' responsiveness varies. A critical part of individualizing reading instruction for students with disabilities is the reliable assessment of progress and mastery of reading skills. However, assessing many students with ID or ASD is challenging because of attention, behavioral, and communication issues related to the testing situation; obtaining consistent results therefore often proves difficult. We hypothesized that alternate assessment presentation formats, used as a testing accommodation, would improve the reliability, validity, and consistency of assessment performance. In this study, three presentation formats (word lists, flash cards, and PowerPoint slides) were used to administer proximal, curriculum-based reading assessments, to determine whether a particular format increased student engagement, reduced the need for prompts, and increased the accuracy of identifying known items on the test. While statistical analyses did not support the hypothesized format-by-student effect, visual analysis of the data did suggest that the number of prompts required varied by student as a function of assessment format. Most noteworthy, estimates of assessment reliability derived from generalizability theory indicated that reliability increased as a function of the format-by-student combination.
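For readers unfamiliar with the generalizability-theory reliability mentioned above, the sketch below estimates variance components for a fully crossed students-by-formats design and computes a relative G coefficient. The design and all scores are invented for illustration; the study's actual G-study layout is not described in this abstract.

```python
import numpy as np

# Hypothetical scores: 6 students (rows) x 3 assessment formats (columns),
# e.g. word list, flash cards, PowerPoint. Invented numbers for illustration.
x = np.array([
    [12, 14, 13],
    [ 8,  9, 11],
    [20, 19, 22],
    [15, 13, 16],
    [ 5,  7,  6],
    [17, 18, 16],
], dtype=float)

n_p, n_f = x.shape
grand = x.mean()

# Mean squares for the two-way crossed design with one score per cell.
ms_p = n_f * np.sum((x.mean(axis=1) - grand) ** 2) / (n_p - 1)
resid = x - x.mean(axis=1, keepdims=True) - x.mean(axis=0) + grand
ms_res = np.sum(resid ** 2) / ((n_p - 1) * (n_f - 1))

# Variance components for the persons x formats random-effects model.
var_res = ms_res
var_p = max((ms_p - ms_res) / n_f, 0.0)

# Relative generalizability coefficient when averaging over n_f formats.
g_coef = var_p / (var_p + var_res / n_f)
print(f"person variance {var_p:.2f}, residual {var_res:.2f}, G = {g_coef:.2f}")
```

The G coefficient rises as true between-student variance dominates the format-by-student residual, which is why a format that interacts less with individual students yields more dependable scores.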

