The optimal number of options for multiple-choice questions on high-stakes tests: application of a revised index for detecting nonfunctional distractors

2018, Vol 24 (1), pp. 141-150
Author(s): Mark R. Raymond, Craig Stevens, S. Deniz Bucak

Best of Five MCQs for the Acute Medicine SCE is a new revision resource designed specifically for this high-stakes exam. Containing over 350 Best of Five multiple-choice questions, this dedicated guide will help candidates prepare successfully. The content mirrors the SCE in Acute Medicine blueprint to ensure candidates are fully prepared for all the topics that may appear in the exam. Topics range from managing acute problems in cardiology or neurology to managing acute conditions such as poisoning. All answers have full explanations and further reading to ensure high-quality self-assessment and quick recognition of areas that require further study.


2006, Vol 26 (8), pp. 662-671
Author(s): Marie Tarrant, Aimee Knierim, Sasha K. Hayes, James Ware

2014, Vol 6 (4), pp. 709-714
Author(s): Brian Sanjay Heist, Jed David Gonzalo, Steven Durning, Dario Torre, David Michael Elnicki

Abstract Background: Clinical vignette multiple-choice questions (MCQs) are widely used in medical education, but the clinical reasoning (CR) strategies employed when approaching these questions have not been well described. Objectives: The aims of the study were (1) to identify CR strategies and test-taking (TT) behaviors of physician trainees while solving clinical vignette MCQs; and (2) to examine the relationships between CR strategies and TT behaviors and performance on a high-stakes clinical vignette MCQ examination. Methods: Thirteen postgraduate year 1 trainees completed 6 clinical vignette MCQs using a think-aloud protocol. Thematic analysis employing elements of grounded theory was performed on the transcribed data to identify CR strategies and TT behaviors. Participants' CR strategies and TT behaviors were then compared with their US Medical Licensing Examination Step 2 Clinical Knowledge scores. Results: Twelve CR strategies and TT behaviors were identified. Individuals with low performance on Step 2 Clinical Knowledge demonstrated more premature closure and faulty knowledge, and comparatively less ruling out of alternatives and admission of knowledge deficits. High performers on Step 2 Clinical Knowledge demonstrated more ruling out of alternatives and admission of knowledge deficits, and less premature closure, faulty knowledge, and closure prior to reading the alternatives. Conclusions: Different patterns of CR strategies and TT behaviors may be used by high and low performers during high-stakes clinical vignette MCQ examinations.


Author(s): Talip Karanfil, Steve Neufeld

High-stakes and high-volume English language proficiency tests typically rely on multiple-choice questions (MCQs) to assess reading and listening skills. Due to the Covid-19 pandemic, more institutions are using MCQs via online assessment platforms, which facilitate shuffling the order of options within test items to minimize cheating. There is scant research on the role that the order and sequence of options play in MCQs, so this study examined the results of a paper-based, high-stakes English proficiency test administered in two versions. Each version had identical three-option MCQs but with different ordering of options. The test-takers were chosen to ensure a very similar profile of language ability and level across the groups who took the two versions. The findings indicate that one in four questions exhibited significantly different levels of difficulty and discrimination between the two versions. The study identifies order dominance and sequence priming as two factors that influence the outcomes of MCQs, both of which can accentuate or diminish the power of attraction of the correct and incorrect options. These factors should be carefully considered both when designing MCQs for high-stakes language proficiency tests and when shuffling options in paper-based or computer-based testing.
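The version-to-version comparison described here can be illustrated with a standard two-proportion z-test on item difficulty (the proportion of examinees answering correctly). This is a generic sketch, not the study's actual analysis; the function names and the toy response vectors are illustrative assumptions.

```python
import math

def item_difficulty(correct):
    # Item difficulty (facility): proportion of examinees answering correctly.
    return sum(correct) / len(correct)

def two_proportion_z(correct_a, correct_b):
    # Pooled two-proportion z statistic for the difference in an item's
    # difficulty between two test versions (1 = correct, 0 = incorrect).
    n1, n2 = len(correct_a), len(correct_b)
    p1, p2 = sum(correct_a) / n1, sum(correct_b) / n2
    pooled = (sum(correct_a) + sum(correct_b)) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

def two_sided_p(z):
    # Two-sided p-value from the standard normal distribution.
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

# Toy example: the same item answered by matched groups on two versions.
version_a = [1] * 70 + [0] * 30   # 70% correct on version A
version_b = [1] * 50 + [0] * 50   # 50% correct on version B
z = two_proportion_z(version_a, version_b)
```

With these toy numbers the difference is significant at conventional levels, which is the kind of per-item check that flags a question as behaving differently across the two orderings.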


2005, Vol 39 (9), pp. 890-894
Author(s): Martin R Fischer, Sibyl Herrmann, Veronika Kopp

2018, Vol 18 (1), pp. 68
Author(s): Deena Kheyami, Ahmed Jaradat, Tareq Al-Shibani, Fuad A. Ali

Objectives: The current study aimed to carry out a post-validation item analysis of multiple-choice questions (MCQs) in medical examinations in order to evaluate correlations between item difficulty, item discrimination and distractor effectiveness, so as to determine whether questions should be included, modified or discarded. In addition, the optimal number of options per MCQ was analysed. Methods: This cross-sectional study was performed in the Department of Paediatrics, Arabian Gulf University, Manama, Bahrain. A total of 800 MCQs and 4,000 distractors were analysed between November 2013 and June 2016. Results: The mean difficulty index ranged from 36.70% to 73.14%. The mean discrimination index ranged from 0.20 to 0.34. The mean distractor efficiency ranged from 66.50% to 90.00%. Of the items, 48.4%, 35.3%, 11.4%, 3.9% and 1.1% had zero, one, two, three and four nonfunctional distractors (NFDs), respectively. Using three or four rather than five options in each MCQ resulted in 95% or 83.6% of items having zero NFDs, respectively. The distractor efficiency was 91.87%, 85.83% and 64.13% for difficult, acceptable and easy items, respectively (P < 0.005). Distractor efficiency was 83.33%, 83.24% and 77.56% for items with excellent, acceptable and poor discrimination, respectively (P < 0.005). The average Kuder-Richardson formula 20 reliability coefficient was 0.76. Conclusion: A considerable number of the MCQ items were within acceptable ranges. However, some items needed to be discarded or revised. Using three or four rather than five options in MCQs is recommended to reduce the number of NFDs and improve the overall quality of the examination.
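The four indices this abstract reports can be computed from a matrix of examinee responses. The following is a minimal Python sketch, not the study's code; it assumes the conventional definitions (difficulty as percentage correct, discrimination from upper/lower 27% score groups, a distractor counted as nonfunctional if chosen by fewer than 5% of examinees, and KR-20 for reliability).

```python
def difficulty_index(correct):
    # Difficulty (facility) index: percentage of examinees answering correctly.
    return 100.0 * sum(correct) / len(correct)

def discrimination_index(correct, total_scores, frac=0.27):
    # Difference in item facility between the top and bottom score groups
    # (conventionally the upper and lower 27% of total-test scores).
    n = len(correct)
    k = max(1, int(n * frac))
    order = sorted(range(n), key=lambda i: total_scores[i])
    low, high = order[:k], order[-k:]
    return (sum(correct[i] for i in high) - sum(correct[i] for i in low)) / k

def nonfunctional_distractors(choices, key, options, threshold=0.05):
    # A distractor is nonfunctional (NFD) if fewer than 5% of examinees
    # select it; `choices` holds each examinee's chosen option label.
    n = len(choices)
    return [o for o in options if o != key
            and sum(c == o for c in choices) / n < threshold]

def kr20(item_matrix):
    # Kuder-Richardson formula 20 reliability for dichotomous items.
    # Rows are items, columns are examinees (1 = correct, 0 = incorrect).
    n_items = len(item_matrix)
    totals = [sum(col) for col in zip(*item_matrix)]
    mean_t = sum(totals) / len(totals)
    var_t = sum((t - mean_t) ** 2 for t in totals) / len(totals)
    pq = sum((sum(it) / len(it)) * (1 - sum(it) / len(it))
             for it in item_matrix)
    return (n_items / (n_items - 1)) * (1 - pq / var_t)
```

Applying the NFD check across all distractors of an item is what yields counts like "zero, one, two, three or four NFDs" per item, and aggregating the share of functioning distractors gives the distractor-efficiency percentages reported above.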


2006, Vol 6 (6), pp. 354-363
Author(s): Marie Tarrant, Aimee Knierim, Sasha K. Hayes, James Ware
