The Immediate Feedback Assessment Technique: A Learner-centered Multiple-choice Response Form

2013, Vol 35 (4), pp. 111-131
Author(s):  
David DiBattista

The Immediate Feedback Assessment Technique (IFAT) is a new multiple-choice response form that has advantages over more commonly used response techniques. The IFAT, which is commercially available at reasonable cost and can be used conveniently with large classes, has an answer-until-correct format that provides students with immediate, corrective, item-by-item feedback. Advantages of this learner-centered response form are that it: (a) actively promotes learning; (b) allows students’ partial knowledge to be rewarded with partial credit; (c) is strongly preferred by students over other response techniques; and (d) lets instructors more easily maintain the security of multiple-choice (MC) items so that they can be reused from one semester to the next. The IFAT’s major shortcoming is that grading must be done manually because it does not yet have a compatible optical scanning device. Helpful suggestions are presented for instructors who may be considering using the IFAT for the first time.

RÉSUMÉ (translated from French): The Immediate Feedback Assessment Technique (IFAT) is a new multiple-choice examination form that has several advantages. The IFAT, available at a reasonable price and well suited to courses with many students, uses a format in which students select among the available alternatives, one at a time, until the correct answer is indicated. Correction is therefore automatic, and the correct answer is communicated immediately. The IFAT has several advantages: (a) it promotes learning; (b) students can earn partial credit for partial knowledge; (c) students prefer this form to other multiple-choice formats; and (d) instructors can more easily keep their questions and alternatives secure and reuse them in subsequent terms. The IFAT’s main shortcoming is that grading is manual, since there is not yet an optical scanner compatible with the form. Helpful suggestions are given here for instructors considering this technique for the first time.
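The answer-until-correct format described above lends itself to a simple partial-credit scoring rule: the fewer attempts a student needs before uncovering the correct answer, the more credit the item earns. A minimal sketch, assuming a hypothetical 1 / 0.5 / 0.25 / 0 weighting (the abstract does not specify a scale; instructors choose their own):

```python
def ifat_item_score(attempts_used, full_credit=1.0):
    """Credit for one answer-until-correct item.

    The 1 / 0.5 / 0.25 / 0 weights are a hypothetical scale for
    illustration only; the IFAT itself leaves the partial-credit
    scheme up to the instructor.
    """
    weights = {1: 1.0, 2: 0.5, 3: 0.25}  # attempts needed -> weight
    return full_credit * weights.get(attempts_used, 0.0)


def ifat_test_score(attempts_per_item):
    """Total test score given the number of attempts needed per item."""
    return sum(ifat_item_score(a) for a in attempts_per_item)


# A student answering three items on the 1st, 2nd, and 4th try:
# ifat_test_score([1, 2, 4]) -> 1.0 + 0.5 + 0.0 = 1.5
```

Because credit depends only on the attempt count recorded on the form, a scheme like this is easy to apply during the manual grading the IFAT currently requires.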

2002, Vol 90 (1), p. 226
Author(s):  
Michael L. Epstein ◽  
Gary M. Brosvic

A multiple-choice testing system that provides immediate affirming or corrective feedback and permits allocation of partial credit for proximate knowledge is suggested as an alternative to essay examinations.


2001, Vol 88 (3), pp. 889-894
Author(s):  
Michael L. Epstein ◽  
Beth B. Epstein ◽  
Gary M. Brosvic

Performance on two multiple-choice testing procedures was examined during unit tests and a final examination. The Immediate Feedback Assessment Technique provided immediate response feedback in an answer-until-correct style of responding; the Scantron form served as the point of comparison. Students in introductory psychology courses completed one of the two formats during unit tests, whereas all students used the Scantron form on the final examination. Students tested with Immediate Feedback forms on the unit tests correctly answered more of the final examination questions repeated from earlier unit tests than did students tested with Scantron forms. Students tested with Immediate Feedback forms also correctly answered more final examination questions that they had previously answered incorrectly on the unit tests than did students tested with Scantron forms.


2017, Vol 16 (1), ar7
Author(s):  
Xiaoying Xu ◽  
Jennifer E. Lewis ◽  
Jennifer Loertscher ◽  
Vicky Minderhout ◽  
Heather L. Tienson

Multiple-choice assessments provide a straightforward way for instructors of large classes to collect data related to student understanding of key concepts at the beginning and end of a course. By tracking student performance over time, instructors receive formative feedback about their teaching and can assess the impact of instructional changes. The evidence of instructional effectiveness can in turn inform future instruction, and subsequent instruction can generate new evidence. In this study, we analyzed student responses on an optimized pretest and posttest administered during four different quarters in a large-enrollment biochemistry course. Student performance and the effect of instructional interventions related to three fundamental concepts—hydrogen bonding, bond energy, and pKa—were analyzed. After instructional interventions, a larger proportion of students demonstrated knowledge of these concepts compared with data collected before instructional interventions. Student responses trended from inconsistent to consistent and from incorrect to correct. The instructional effect was particularly pronounced in the later three quarters for hydrogen bonding and bond energy. This study supports the use of multiple-choice instruments to assess the effectiveness of instructional interventions, especially in large classes, by providing instructors with quick and reliable feedback on student knowledge of each specific fundamental concept.
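The per-concept pre/post tracking described above reduces to tallying correct-response fractions by concept and test phase. A minimal sketch, assuming a hypothetical flat `(concept, phase, correct)` record layout (the study’s actual data format is not given):

```python
from collections import defaultdict


def concept_fractions(responses):
    """Fraction of correct answers per concept on the pretest and posttest.

    `responses` is an iterable of (concept, phase, correct) tuples,
    where phase is "pre" or "post" and correct is a bool. This record
    layout is an assumption made for illustration.
    """
    # concept -> phase -> [correct_count, total_count]
    tallies = defaultdict(lambda: {"pre": [0, 0], "post": [0, 0]})
    for concept, phase, correct in responses:
        tallies[concept][phase][0] += int(correct)
        tallies[concept][phase][1] += 1
    return {
        concept: {phase: hits / n for phase, (hits, n) in t.items() if n}
        for concept, t in tallies.items()
    }


data = [
    ("hydrogen bonding", "pre", False), ("hydrogen bonding", "pre", True),
    ("hydrogen bonding", "post", True), ("hydrogen bonding", "post", True),
]
# concept_fractions(data)["hydrogen bonding"] -> {"pre": 0.5, "post": 1.0}
```

Comparing the resulting pre and post fractions per concept gives the kind of quick, concept-level feedback on instructional changes that the abstract describes.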


PEDIATRICS, 1961, Vol 28 (1), p. 106
Author(s):  
A. ASHLEY WEECH

This compact manual written by the Executive Secretary and the Director of Testing Services of the National Board of Medical Examiners presents in eight brief chapters the methods used by the Board in preparing its objective tests. There is also an appendix containing a sample test that could provide helpful practice to someone about to be exposed for the first time to this type of examination. That such practice is desirable is evident from the fact that foreign students who have previously encountered only essay-type examinations will on the average earn significantly lower grades on a multiple-choice examination than a comparably educated group of American students who have been exposed many times in high school, college and medical school to the techniques of objective testing.


2016, Vol 125 (5), pp. 1046-1055
Author(s):  
Huaping Sun ◽  
Yan Zhou ◽  
Deborah J. Culley ◽  
Cynthia A. Lien ◽  
Ann E. Harman ◽  
...  

Abstract

Background: As part of the Maintenance of Certification in Anesthesiology Program® (MOCA®), the American Board of Anesthesiology (Raleigh, North Carolina) developed the MOCA Minute program, a web-based intensive longitudinal assessment involving weekly questions with immediate feedback and links to learning resources. This observational study tested the hypothesis that individuals who participate in the MOCA Minute program perform better on the MOCA Cognitive Examination (CE) than those who do not.

Methods: Two separate cohorts of individuals eligible for the July 2014 and January 2015 CEs were invited to participate in this pilot. The CE scores for each cohort were compared between those who did and did not participate, controlling for the factors known to affect performance. For the first cohort, examination performance on topics covered and not covered by the MOCA Minute was analyzed separately.

Results: Six hundred sixteen diplomates in July 2014 and 684 diplomates in January 2015 took the CE for the first time. In multiple regression analysis, those actively participating scored 9.9 points (95% CI, 0.8 to 18.9) and 9.3 points (95% CI, 2.3 to 16.3) higher, respectively, than those not enrolled. Compared with the group that did not enroll in the MOCA Minute, those who enrolled but did not actively participate demonstrated no improvement in scores. MOCA Minute participation was associated with improvement both on questions covering topics included in the MOCA Minute and on questions not covering these topics.

Conclusions: This analysis provides evidence that voluntary active participation in a program featuring frequent knowledge assessments accompanied by targeted learning resources is associated with improved performance on a high-stakes CE.
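The point estimates and confidence intervals above come from a multiple regression that adjusts for covariates. As a simplified illustration of the unadjusted version of such a comparison, here is a sketch of a two-group mean score difference with a normal-approximation 95% interval (the scores are hypothetical, and this does not reproduce the study’s adjusted analysis):

```python
from statistics import NormalDist, mean, variance


def score_diff_ci(group_a, group_b, level=0.95):
    """Unadjusted difference in mean scores between two groups,
    with a normal-approximation confidence interval.

    Simplified sketch only: the study itself used multiple regression
    to control for factors known to affect performance, which a plain
    two-group comparison like this does not do.
    """
    diff = mean(group_a) - mean(group_b)
    # Standard error of the difference (Welch-style, unpooled variances)
    se = (variance(group_a) / len(group_a)
          + variance(group_b) / len(group_b)) ** 0.5
    z = NormalDist().inv_cdf(0.5 + level / 2)  # ~1.96 for a 95% interval
    return diff, (diff - z * se, diff + z * se)


participants = [210, 225, 218, 230]      # hypothetical CE scores
non_participants = [205, 212, 208, 215]  # hypothetical CE scores
d, (lo, hi) = score_diff_ci(participants, non_participants)
```

With realistic sample sizes, an interval whose lower bound stays above zero, as in the study’s reported CIs, indicates a score advantage for the participating group.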

