Back to the future: Considerations in use and reporting of the retrospective pretest

2019 ◽  
Vol 44 (2) ◽  
pp. 184-191
Author(s):  
Laura G. Hill

Retrospective pretests ask respondents to report, after an intervention, on their aptitudes, knowledge, or beliefs as they were before the intervention. A primary reason to administer a retrospective pretest is that in some situations, program participants may, over the course of an intervention, revise or recalibrate their prior understanding of program content, with the result that their posttest scores are lower than their traditional pretest scores even though their understanding or abilities have increased. This phenomenon is called response-shift bias. The existence of response-shift bias is undisputed, but it does not always occur, and use of the retrospective pretest in place of a traditional pretest often introduces new problems. In this commentary, I provide a brief overview of the literature on response-shift bias and discuss common pitfalls in the use and reporting of retrospective pretest results, including a failure to consider the multiple factors that may affect all test scores, as well as claims that retrospective pretests are less biased than traditional pretests, provide more accurate estimates of effects, and are necessarily superior to traditional pretests in program evaluation. I comment on the article by Little et al. (2019) in this issue in light of the literature on retrospective pretests and discuss the need for a theoretical framework to guide research on response-shift bias. The goal of the commentary is to provide readers with an informed and critical lens through which to evaluate and use retrospective pretest methods.
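
As a minimal illustration of the recalibration mechanism described in this abstract, the sketch below simulates participants who overrate themselves before a program and recalibrate afterward, so that a traditional pre-post contrast understates a genuine gain while a retrospective pre-post contrast recovers it. It assumes NumPy and SciPy are available; all scores and parameters are hypothetical.

```python
# Illustrative sketch (not from the commentary): simulated self-ratings showing how
# recalibration during a program can make posttest scores look flat or lower than the
# traditional pretest even when understanding improved. All numbers are hypothetical.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 60

true_pre = rng.normal(4.0, 0.8, n)        # true skill before the program (1-7 scale)
true_post = true_pre + 1.0                 # every simulated participant improves by ~1 point

overestimate = 1.5                         # before the program, participants overrate themselves
traditional_pre = true_pre + overestimate + rng.normal(0, 0.5, n)
posttest = true_post + rng.normal(0, 0.5, n)           # ratings after recalibration
retrospective_pre = true_pre + rng.normal(0, 0.5, n)   # "then" ratings, on the recalibrated scale

for label, pre in [("traditional", traditional_pre), ("retrospective", retrospective_pre)]:
    t, p = stats.ttest_rel(posttest, pre)
    print(f"{label:13s} pre mean={pre.mean():.2f}  post mean={posttest.mean():.2f}  "
          f"paired t={t:.2f}  p={p:.3f}")
# With these hypothetical parameters the traditional pre-post contrast goes flat or negative
# even though everyone improved, while the retrospective contrast recovers the 1-point gain.
```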

2019 ◽  
Vol 14 (1) ◽  
pp. 216-229
Author(s):  
Jill Young ◽  
Leanne Kallemeyn

Practitioners and evaluators face several constraints in conducting rigorous evaluations to determine program effects. Researchers have offered the retrospective pretest/posttest design as a remedy to curb response-shift bias and better estimate program effects. This article presents an example of how After School Matters (ASM) tested the retrospective pretest/posttest design for evaluating out-of-school time (OST) programs for high school youth participants. Differences between traditional pretest and retrospective pretest scores were statistically significant, but effect sizes were negligible, indicating that the two pretests yielded similar results. Interviews with youth led to three key findings with implications for ASM's use of retrospective pretests: response-shift bias was more prominent in the interviews than in the quantitative findings; youth recommended reordering the questions so that the retrospective pretest appears first, to improve comprehension; and acquiescence bias emerged in the interviews. This study demonstrates that the retrospective pretest/posttest design can be an alternative to the traditional pretest/posttest design for OST evaluation at ASM. These findings are important for ASM and other youth-serving organizations, which often have limited capacity to survey youth multiple times within one program session.
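
The following is a hedged sketch of the kind of contrast this study reports, not its actual analysis or data: with simulated scores, a paired t-test between traditional and retrospective pretests can reach statistical significance in a large sample while the paired effect size (Cohen's d_z) remains negligible.

```python
# Hypothetical data only; the ASM instruments and values are not reproduced here.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 2000                                   # large samples make tiny differences "significant"
traditional = rng.normal(3.50, 0.80, n)
retrospective = traditional - 0.05 + rng.normal(0, 0.30, n)   # trivially lower on average

diff = traditional - retrospective
t, p = stats.ttest_rel(traditional, retrospective)
cohens_dz = diff.mean() / diff.std(ddof=1)                    # effect size for paired data

print(f"t({n-1}) = {t:.2f}, p = {p:.4f}, d_z = {cohens_dz:.3f}")
# A significant p-value paired with d_z well under 0.2 mirrors the pattern reported above:
# the two pretests differ statistically but not meaningfully.
```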


1989 ◽  
Vol 7 (2-3) ◽  
pp. 153-166 ◽  
Author(s):  
Mirjam Sprangers

1985 ◽  
Vol 57 (2) ◽  
pp. 525-526 ◽  
Author(s):  
Thomas Nicholson ◽  
Philip A. Belcastro ◽  
Robert S. Gold

Traditional pretest-posttest comparisons of self-report data are distorted by response-shift bias. Administering a retrospective pretest in lieu of the traditional pretest eliminates a form of response-shift bias that distorts the comparability of pretest-posttest measurements. The present study compared the sensitivity of retrospective pretest-posttest and traditional pretest-posttest measurements in detecting a treatment effect for a university stress counseling program. Substituting the retrospective pretest for the traditional pretest as the covariate in the analysis of covariance yielded the same conclusion of no treatment effect.
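
Below is a minimal sketch of the analytic form described in this abstract: an ANCOVA on posttest scores with either pretest as the covariate. It assumes pandas and statsmodels and a hypothetical data frame with columns post, pre_traditional, pre_retrospective, and group (treatment vs. control); the column names and data are illustrative, not the study's.

```python
# Hedged ANCOVA sketch: only the analytic form follows the study design described above.
import pandas as pd
import statsmodels.formula.api as smf

def ancova(df: pd.DataFrame, covariate: str) -> None:
    """Fit posttest ~ group + chosen pretest covariate and report the group coefficient."""
    model = smf.ols(f"post ~ C(group) + {covariate}", data=df).fit()
    print(covariate, model.params.filter(like="group"), sep="\n")

# ancova(df, "pre_traditional")     # traditional pretest as the covariate
# ancova(df, "pre_retrospective")   # retrospective pretest substituted, as in the study
# If both analyses leave the group coefficient non-significant, the substitution yields
# the same conclusion of no treatment effect, as reported above.
```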


1992 ◽  
Vol 70 (3) ◽  
pp. 699-704 ◽  
Author(s):  
Hans Goedhart ◽  
Johan Hoogstraten

To test the effect of training programs, subjects are often given self-report questionnaires. When such self-report instruments are used, response-shift bias is a possible confounder of results. According to response-shift theory, giving subjects pretest information might reduce response-shift bias. In this experiment, pretest information was given prior to a course in leadership. Contrary to expectation, the subjects' ratings were somewhat lower in the pretest-information condition. It was concluded that the pretest information provided was not sufficiently effective. Use of an appropriate amount of reliable pretest information is recommended; otherwise this information may function as a 'minicourse' in itself and thereby confound the expected results.


1987 ◽  
Vol 61 (2) ◽  
pp. 579-585 ◽  
Author(s):  
Mirjam Sprangers ◽  
Johan Hoogstraten

The influence of response-style effects on subjects' self-ratings was assessed using a bogus-pipeline technique. The design was a pretest-posttest design that included retrospective preratings. Subjects were 73 psychology freshmen fulfilling a course requirement. Subjects were led to believe that the veracity of their self-reports could be checked by means of objective measures. The bogus-pipeline induction at pretesting lowered self-reported preratings and consequently eliminated response-shift bias, defined as a significant mean difference between conventional and retrospective preratings. The results show no contradiction of earlier research. Since a response shift occurred in the non-bogus-pipeline condition, it was concluded that use of a bogus-pipeline technique can improve the accuracy of self-reported pretest scores and thereby eliminate response-shift-bias effects. Data furthermore show that the retrospective pretest is rather robust to procedural differences.


Author(s):  
Jens Beckert ◽  
Richard Bronk

This chapter provides a theoretical framework for considering how imaginaries and narratives interact with calculative devices to structure expectations and beliefs in the economy. It analyses the nature of uncertainty in innovative market economies and examines how economic actors use imaginaries, narratives, models, and calculative practices to coordinate and legitimize action, determine value, and establish sufficient conviction to act despite the uncertainty they face. Placing the themes of the volume in the context of broader trends in economics and sociology, the chapter argues that, in conditions of widespread radical uncertainty, there is no uniquely rational set of expectations, and there are no optimal strategies or objective probability functions; instead, expectations are often structured by contingent narratives or socially constructed imaginaries. Moreover, since expectations are not anchored in a pre-existing future reality but have an important role in creating the future, they become legitimate objects of political debate and crucial instruments of power in markets and societies.


1980 ◽  
Vol 4 (1) ◽  
pp. 93-106 ◽  
Author(s):  
George S. Howard

Author(s):  
Shelley Anne Doucet ◽  
Diane MacKenzie ◽  
Elaine Loney ◽  
Anne Godden-Webster ◽  
Heidi Lauckner ◽  
...  

Background: The Dalhousie Health Mentors Program (DHMP) is a community-based, pre-licensure interprofessional education initiative that aims to prepare health professional students for collaborative practice in the care of patients with chronic conditions. This program evaluation explores the students' 1) learning and plans to incorporate skills into future practice; 2) ratings of program content, delivery, and assignments; 3) perspectives of curricular factors that inadvertently acted as barriers to learning; and 4) suggestions for program improvement.
Methods: All students (N = 745) from the 16 participating health programs were invited to complete an online mixed-methods program evaluation survey at the conclusion of the 2012–2013 DHMP. A total of 295 students (40% response rate) responded to the Likert-type questions, which were analyzed using descriptive and non-parametric statistics. Of these students, 204 (69%) provided responses to 10 open-ended questions, which were analyzed thematically.
Findings: While the majority of respondents agreed that they achieved the DHMP learning objectives, the mixed-methods approach identified curriculum integration, team composition, and effectiveness of learning assignments as factors that unintentionally acted as barriers to learning, and yielded three key student recommendations for program improvement.
Conclusions: Educators and program planners need to be aware that even well-intended learning activities may result in unintended experiences that hamper interprofessional learning.
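
As an illustrative sketch of the descriptive and non-parametric analysis mentioned in the Methods, the snippet below summarizes and compares hypothetical Likert-type ratings grouped by health program; it assumes SciPy is available, and the program names and values are invented rather than taken from the DHMP survey.

```python
# Hypothetical Likert-type ratings (1-5) per health program; not DHMP data.
import numpy as np
from scipy import stats

ratings = {
    "nursing":     np.array([4, 5, 3, 4, 4, 5, 2, 4]),
    "medicine":    np.array([3, 3, 4, 2, 3, 4, 3, 3]),
    "occ_therapy": np.array([5, 4, 4, 5, 3, 4, 5, 4]),
}

for program, scores in ratings.items():                 # descriptive summary per program
    iqr = np.subtract(*np.percentile(scores, [75, 25]))
    print(f"{program:12s} median={np.median(scores):.1f}  IQR={iqr:.1f}")

h, p = stats.kruskal(*ratings.values())                 # non-parametric test across programs
print(f"Kruskal-Wallis H = {h:.2f}, p = {p:.3f}")
```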


1988 ◽  
Vol 62 (1) ◽  
pp. 11-16 ◽  
Author(s):  
Mirjam Sprangers ◽  
Johan Hoogstraten

In an earlier study by Sprangers and Hoogstraten, a bogus-pipeline induction removed response-style effects in the self-reported pretest. Response-shift bias, defined as a significant mean difference between conventional pre- and retrospective preratings, was consequently eliminated. It was concluded that response-style effects in the pretesting are a likely cause of response-shift bias. The present experiment was designed to examine whether these results are stable and generalizable to a different educational training. The present replication used the same bogus-pipeline procedure; the experimental training was a First Aid instructional film. Subjects were 53 freshmen in psychology who were fulfilling a course requirement. Contrary to expectation, the bogus-pipeline induction did not lower self-reported preratings. A response shift did not occur in either the bogus-pipeline or the non-bogus-pipeline condition. It was concluded that a construct not susceptible to removal of response-style effects is not susceptible to response-shift bias either. Results are consonant with the response-style explanation of response-shift bias and do not contradict the earlier study. Data furthermore show that the administration of an objective pretest had no effect on subsequent objective posttreatment scores.

