Multiple Choice Questions in Medical Education: How to Construct High Quality Questions

Author(s):  
Abdus Salam ◽  
Rabeya Yousuf ◽  
Sheikh Muhammad Abu Bakar

Multiple choice questions (MCQs) are the most widely used objective test items. Students often learn what we assess, not what we teach, although teaching and assessment are two sides of the same coin. Assessment in medical education is therefore very important to ensure that qualified, competent doctors are being produced. A good test is one that assesses higher-level thinking skills, yet many in-house MCQs are found to be faulty, assessing only lower-level thinking skills. The main problems in constructing good MCQs are that (i) very few faculty members have formal training in question construction, (ii) most questions are prepared at the last minute, leaving little time for vetting to review their quality, and (iii) medical schools lack commitment to a standard question format and underestimate the use of a test blueprint. When constructing good MCQs, emphasis should be placed on ensuring that the stem is meaningful and presents a definite problem, contains only relevant material, and avoids negative phrasing. All options should be plausible, clear and concise, mutually exclusive, logically ordered, and free from clues, and 'all of the above' and 'none of the above' should be avoided. A well-constructed MCQ can test any higher level of the cognitive domain. Efforts must be made to prepare and use a test blueprint as a guide to constructing good MCQs. This paper offers medical teachers a window to a comprehensive understanding of the different types and aspects of MCQs, and of how to construct a test blueprint and good MCQs that test higher-order thinking skills in future medical graduates, thereby ensuring that competent doctors are produced.
International Journal of Human and Health Sciences Vol. 04 No. 02 April’20 Page: 79-88
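A test blueprint of the kind recommended above is essentially a grid crossing content areas (weighted, e.g., by teaching time) with cognitive levels, each cell giving the number of items to write. A minimal sketch of that idea follows; the topic names, level labels, and weights are hypothetical examples, not taken from the article:

```python
# Minimal test-blueprint sketch: distribute a total item count across
# content areas x cognitive levels in proportion to their weights.
# Topic names, level labels, and weights below are hypothetical.

def allocate_items(topics, level_weights, total_items):
    """Return {topic: {level: item_count}} where each cell is
    total_items * topic_weight * level_weight, rounded to the
    nearest whole item (simple rounding, kept short for clarity)."""
    blueprint = {}
    for topic, t_weight in topics.items():
        row = {}
        for level, l_weight in level_weights.items():
            row[level] = round(total_items * t_weight * l_weight)
        blueprint[topic] = row
    return blueprint

topics = {"Cardiology": 0.40, "Endocrinology": 0.35, "Nephrology": 0.25}
level_weights = {"Recall": 0.20, "Application": 0.50, "Analysis": 0.30}

for topic, row in allocate_items(topics, level_weights, 100).items():
    print(topic, row)
```

With weights that sum to 1 on each axis, the grid makes it immediately visible when a paper over-samples recall-level items at the expense of application and analysis.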

Author(s):  
J. K. Stringer ◽  
Sally A. Santen ◽  
Eun Lee ◽  
Meagan Rawls ◽  
Jean Bailey ◽  
...  

Abstract
Background: Analytic thinking skills are important to the development of physicians, so educators and licensing boards utilize multiple-choice questions (MCQs) to assess this knowledge and these skills. MCQs are written under two assumptions: that they can be written as higher or lower order according to Bloom's taxonomy, and that students will perceive questions to be at the same taxonomical level as intended. This study seeks to understand students' approach to questions by analyzing differences in their perception of the Bloom's level of MCQs in relation to their knowledge and confidence.
Methods: A total of 137 students responded to practice endocrine MCQs. Participants indicated their answer to each question, their interpretation of it as higher or lower order, and their degree of confidence in their response.
Results: Although there was no significant association between students' average performance on the content and their question classification (higher or lower), individual students who were less confident in their answer were more than five times as likely (OR = 5.49) to identify a question as higher order than their more confident peers. Students who responded incorrectly to an MCQ were four times as likely to identify it as higher order than peers who responded correctly.
Conclusions: The results suggest that higher-performing, more confident students rely on identifying patterns (even if the question was intended to be higher order). In contrast, less confident students engage in higher-order, analytic thinking even if the question is intended to be lower order. A better understanding of the processes through which students interpret MCQs will help us better understand the development of clinical reasoning skills.
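The odds ratios reported in this abstract (e.g., OR = 5.49) compare the odds of classifying a question as higher order between two groups of responses. A minimal sketch of the computation from a 2×2 table follows; the counts are hypothetical, not the study's data:

```python
def odds_ratio(exposed_yes, exposed_no, control_yes, control_no):
    """Odds ratio from a 2x2 contingency table:
    (odds of the outcome in the exposed group) divided by
    (odds of the outcome in the control group)."""
    return (exposed_yes / exposed_no) / (control_yes / control_no)

# Hypothetical counts, NOT the study's data: among low-confidence
# responses, 44 classified the question as higher order and 20 did not;
# among high-confidence responses, 18 did and 45 did not.
print(f"OR = {odds_ratio(44, 20, 18, 45):.2f}")  # prints "OR = 5.50"
```

An OR above 1 means the first group has higher odds of the outcome; an OR of 5.5 means its odds are five and a half times those of the comparison group, which is how the abstract's "more than five times as likely" figure should be read.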


Pythagoras ◽  
2009 ◽  
Vol 0 (69) ◽  
Author(s):  
Belinda Huntley ◽  
Johann Engelbrecht ◽  
Ansie Harding

In this study we propose a taxonomy for assessment in mathematics, which we call the assessment component taxonomy, to identify those components of mathematics that can be successfully assessed using alternative assessment formats. Based on the literature on assessment models and taxonomies in mathematics, this taxonomy consists of seven mathematics assessment components, hierarchically ordered by cognitive level as well as by the nature of the mathematical tasks associated with each component. Using a model that we developed earlier for measuring the quality of mathematics test items, we investigate which of the assessment components can be successfully assessed in the provided response question (PRQ) format, in particular multiple choice questions (MCQs), and which are better assessed in the constructed response question (CRQ) format. The results of this study show that MCQs can be constructed to evaluate higher-order levels of thinking and learning. The conclusion is that MCQs can be successfully used as an assessment format in undergraduate mathematics, more so in some assessment components than in others. The inclusion of the PRQ assessment format in all seven assessment components can reduce the large marking loads associated with continuous assessment practices in undergraduate mathematics, without compromising the validity of the assessment.


2018 ◽  
Vol 4 (2) ◽  
pp. 208
Author(s):  
Muhammad Erfan ◽  
Tursina Ratu

Higher Order Thinking Skills (HOTS) are essential for prospective teachers in the 21st century. HOTS in the cognitive domain include the abilities of analyzing (C4), evaluating (C5), and creating (C6). To master HOTS, one must first know what level of thinking skills one has. This study therefore aims to measure the cognitive thinking skills of students of the Physics Education Study Program, Faculty of Teacher Training and Education, University of Samawa. The measurement used a test instrument in the form of an essay test, and the results for each cognitive domain were divided into three categories (low, medium, and high). For lower-order thinking skills (LOTS), 55% of students were in the low category, 11% in the medium category, and 34% in the high category, while for HOTS, 100% were in the low category. It can be concluded that the students' cognitive thinking skills are still at the level of lower-order thinking skills.


2021 ◽  
pp. 160-171
Author(s):  
Iryna Lenchuk ◽  
Amer Ahmed

This article describes the results of Action Research conducted in an ESP classroom at Dhofar University in Oman. Following the call of Oman Vision 2040 to emphasize educational practices that promote the development of higher-order cognitive processes, this study raises the following question: can an online multiple choice question (MCQ) quiz tap into the higher-order cognitive skills of apply, analyze, and evaluate? This question was also critical at the time of the COVID-19 pandemic, when Omani universities switched to the online learning mode. The researchers administered an online MCQ quiz to 35 undergraduate students enrolled in an ESP course for Engineering and Sciences. The results showed that MCQ quizzes can be developed to tap into higher-order thinking skills when the stem of the MCQ is developed as a task or a scenario. The study also revealed that students performed better on MCQs that tap into low-level cognitive skills. This result can be attributed to the prevalent practice in Oman of developing assessment tools that tap only into the level of Bloom's taxonomy that involves retrieving memorized information. The significance of the study lies in its pedagogical applications: it calls for the use of teaching and assessment practices that target the development of higher-order thinking skills, in line with the country's strategic direction reflected in Oman Vision 2040.


2016 ◽  
Vol 7 (2) ◽  
pp. 44
Author(s):  
Kurnia Ningsih

This research aims to describe MIPA teachers' ability to design knowledge assessments through an analysis of the achievement aspects of knowledge assessment. The research used a descriptive method, with SMP MIPA teachers in Pontianak City who have taught for more than 5 years and hold an undergraduate degree as the population. The sample, selected using a purposive sampling technique, consisted of 12 teachers who submitted MIPA test items. The research instrument was the document of test items designed by the teachers in the form of multiple-choice tests. The data were analyzed descriptively through data reduction, systematic data display, and conclusion drawing. The results showed that, of the 12 test instruments made by the teachers with 380 questions in total, the questions covered the knowledge aspect (17.37%), the understanding aspect (67.90%), the implementation aspect (8.68%), and the analysis aspect (6.05%). No questions were made for the evaluation and creation aspects.
Keywords: teachers' ability, designing knowledge assessment.
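A percentage breakdown like the one above can be produced mechanically by tallying each item's assigned cognitive level. A minimal sketch follows; the item labels are hypothetical, not the study's data:

```python
from collections import Counter

def bloom_distribution(item_levels):
    """Given one cognitive-level label per test item, return the
    percentage share of each level, rounded to two decimals."""
    counts = Counter(item_levels)
    total = len(item_levels)
    return {level: round(100 * n / total, 2) for level, n in counts.items()}

# Hypothetical labels for a small 20-item paper, not the study's data:
items = (["knowledge"] * 3 + ["understanding"] * 14
         + ["application"] * 2 + ["analysis"] * 1)
print(bloom_distribution(items))
```

Tallies like this make gaps visible at a glance: any level absent from the output (here, evaluation and creation) received no items at all, which is exactly the pattern the study reports.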

