Scoring multiple choice questions

Author(s):  
John J. Barnard

This article briefly touches on how different measurement theories can be used to score responses to multiple choice questions (MCQs). How missing data are treated may have a profound effect on a person’s score and is dealt with most elegantly in modern theories. The issue of guessing a correct answer has been a topic of discussion for many years. It is asserted that test takers almost never have no knowledge whatsoever of the content of an appropriate test and therefore tend to make educated guesses rather than random guesses. Problems with the classical correction for guessing are highlighted, and the Rasch approach of using fit statistics to identify possible guessing is briefly discussed. The three-parameter ‘logistic’ item response theory (IRT) model includes a ‘guessing’ item parameter to indicate the chance that a test taker guessed the correct answer to an item. However, it is pointed out that it is a person who guesses, not an item, and therefore a guessing parameter should be a person parameter. Option probability theory (OPT) purports to overcome this problem by requiring test takers to indicate their degree of certainty that a particular option is the correct one. Realistic allocations of these probabilities indicate the degree of guessing and hence yield more precise measures of ability.
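
For readers unfamiliar with the three-parameter logistic model the abstract critiques, here is a minimal sketch of its response function; the parameter names a, b, c are the conventional ones, not taken from the article. Note that the guessing parameter c belongs to the item, which is precisely the modelling choice the author questions.

```python
import math

def p_correct_3pl(theta, a, b, c):
    """Probability that a test taker with ability theta answers correctly.

    a: item discrimination, b: item difficulty,
    c: pseudo-guessing parameter (lower asymptote) -- an *item* parameter.
    """
    return c + (1.0 - c) / (1.0 + math.exp(-a * (theta - b)))

# Even a test taker far below the item's difficulty retains probability c
# of a correct answer, regardless of who is doing the guessing:
print(p_correct_3pl(theta=-3.0, a=1.2, b=0.5, c=0.25))  # ~0.26, close to c
```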

Author(s):  
Kelly Cline ◽  
Holly Zullo ◽  
David A Huckaby

Common student errors and misconceptions can be addressed through the method of classroom voting, in which the instructor presents a multiple-choice question to the class, and after a few minutes for consideration and small-group discussion, each student votes on the correct answer using a clicker or a phone. If a large number of students have voted for one particular incorrect answer, the instructor can recognize and address the issue. In order to identify multiple-choice questions that are especially effective at provoking common errors and misconceptions, we recorded the percentages of students voting for each option on each of a collection of 172 questions used in 25 sections of integral calculus, taught by 7 instructors at 4 institutions over the course of 12 years. We restricted our analysis to the 115 questions that were voted on by at least 5 different classes. We present the six questions that caused the largest percentages of students to vote for a particular incorrect answer, discuss how we used these questions in the classroom, and examine the common features of these questions. Further, we look for correlations between question characteristics and the mean percentage of students voting for common errors on these questions, and we find that questions based on general cases have higher percentages of students voting for common errors.
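
A hedged sketch of the kind of tally this analysis implies: for each question, average across classes the share of votes going to its most popular incorrect option, keeping only questions voted on by at least 5 classes. The data structure and toy numbers below are illustrative, not the authors' code or data.

```python
votes = {
    # question_id -> one dict of option vote shares per class that used it
    "Q1": [{"A": 0.10, "B": 0.55, "C": 0.35},
           {"A": 0.15, "B": 0.60, "C": 0.25}],
}
correct = {"Q1": "C"}

def mean_peak_incorrect(qid):
    """Mean (over classes) of the vote share of the top incorrect option."""
    peaks = []
    for shares in votes[qid]:
        wrong = {opt: s for opt, s in shares.items() if opt != correct[qid]}
        peaks.append(max(wrong.values()))
    return sum(peaks) / len(peaks)

# The paper restricts attention to questions voted on by >= 5 classes.
eligible = [q for q in votes if len(votes[q]) >= 5]
print(mean_peak_incorrect("Q1"))  # 0.575: option B is a strong common error
```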


2018 ◽  
Vol 7 (3.33) ◽  
pp. 257
Author(s):  
Jae-Young Lee

To alleviate the time-consuming and tedious task of writing multiple choice questions, we propose a system that generates multiple choice questions from sentences containing multiple component keywords and then reorders the selected questions using an array of random numbers rather than repeated random-function calls, reducing the relocation time. The system first uses special idioms to search for a group of informative sentences containing multiple component keywords. In this paper, the idiom is a CRm-type idiom that has several components on its right-hand side within a main informative sentence; the sentences that follow are further informative sentences containing the component keywords. To build a question, the system randomly selects an informative sentence containing a component keyword and converts it into a question. The selected component keyword serves as the correct answer, and three other component keywords serve as distractors. To produce many different questions about the same content, with varying positions of the question and its options, the system uses the random-number array to reduce the relocation time.
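
An illustrative sketch of the two steps described above, using an invented sentence and keyword list: (1) turn an informative sentence into a question whose answer is one component keyword, with the other keywords as distractors; (2) reorder the options with a single precomputed permutation array instead of repeated random-function calls.

```python
import random

sentence = "Photosynthesis requires sunlight, water, and carbon dioxide."
keywords = ["sunlight", "water", "carbon dioxide", "oxygen"]

answer = random.choice(keywords[:3])          # a keyword present in the sentence
stem = sentence.replace(answer, "_____")      # blank out the answer to form the stem
distractors = [k for k in keywords if k != answer][:3]

options = [answer] + distractors
perm = random.sample(range(len(options)), len(options))  # one permutation array
shuffled = [options[i] for i in perm]         # relocation via the array, in one pass

print(stem)
for label, opt in zip("ABCD", shuffled):
    print(f"{label}. {opt}")
print("Correct:", "ABCD"[shuffled.index(answer)])
```

Reusing fresh permutation arrays over the same stem and option set yields many variants of the same content with the options in different positions, which is the relocation the abstract describes.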


1998 ◽  
Vol 25 (1) ◽  
pp. 31-35 ◽  
Author(s):  
Helen C. Harton ◽  
Laura R. Green ◽  
Craig Jackson ◽  
Bibb Latané

This demonstration illustrates principles of group dynamics and dynamic social impact and can be used in classes in social psychology or group dynamics. Students discuss their answers to multiple-choice questions with neighbors and answer them again. Discussion consistently leads to the consolidation (reduced diversity), clustering (spatial self-organization), correlation (emergent linkages), and continuing diversity of responses. “Truth” does not necessarily win, showing that the social reality of the group may be more important than objective reality.
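
A toy simulation (not the authors' classroom procedure) of the dynamic: agents on a ring each hold one of four answers and repeatedly adopt the local majority, as when students confer with neighbors. Diversity drops (consolidation) and identical answers become spatially contiguous (clustering), whether or not the "true" answer wins.

```python
import random
from collections import Counter

random.seed(1)
answers = [random.choice("ABCD") for _ in range(40)]  # initial votes on a ring

def step(a):
    """Each agent adopts the majority answer among itself and its two neighbors."""
    n = len(a)
    return [Counter([a[(i - 1) % n], a[i], a[(i + 1) % n]]).most_common(1)[0][0]
            for i in range(n)]

for _ in range(5):           # five rounds of neighbor discussion
    answers = step(answers)

print("".join(answers))      # visible clusters of identical answers
print(Counter(answers))      # fewer distinct answers than at the start
```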


PEDIATRICS ◽  
1977 ◽  
Vol 59 (5) ◽  
pp. 788-788
Author(s):  
Howard C. Mofenson ◽  
Joseph Greensher

The Personal Assessment for Continuing Education (PACE), part I, appears to be living up to its expectation of providing a guide to areas that need reinforcement, clarification, and updating. Since its distribution in our area, we have had a number of calls from colleagues inquiring about question 75 of the part I multiple-choice questions: "Treatment of acute kerosene poisoning should include which of the following?" The correct answer (E) includes "(4) Administration of ipecac." It came as a surprise to many practicing pediatricians that emesis was now being advised for a petroleum distillate ingestion.


Author(s):  
Muhibbatul Laili

This study of the construct validity of summative multiple-choice Arabic test items aims to improve the quality of the items so that they are valid in the construction aspect. The multiple-choice items were reviewed against a modified construction-criteria instrument based on the Guttman scale: items are marked S if they match an indicator and TS if they do not. In a review of 40 items, the following were found to be unsuitable: 25 items whose stems were not formulated clearly and firmly, two questions that pointed to the correct answer, and one question with confusing pictures. The remaining items satisfied all construct-criteria indicators for multiple-choice questions. Based on these results, question writers must correct and pay attention to these aspects to produce a valid test kit.
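
A minimal sketch of the review procedure, with invented indicator names standing in for the study's instrument: each item is checked against construction indicators and marked S (matches) or TS (does not match).

```python
# Indicator names below are illustrative; the study's actual instrument
# is a modified construction-criteria checklist on the Guttman scale.
indicators = ["stem formulated clearly and firmly",
              "no clue pointing to the correct answer",
              "pictures unambiguous"]

def review(item_flags):
    """item_flags: dict indicator -> bool (True if the item satisfies it)."""
    return {ind: ("S" if item_flags[ind] else "TS") for ind in indicators}

item_7 = {"stem formulated clearly and firmly": False,
          "no clue pointing to the correct answer": True,
          "pictures unambiguous": True}
print(review(item_7))  # the first indicator is marked TS, so item 7 needs revision
```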


2018 ◽  
Vol 2018 ◽  
pp. 1-12 ◽  
Author(s):  
Andreas Melzer ◽  
Ulrich Gergs ◽  
Josef Lukas ◽  
Joachim Neumann

Multiple-choice questions are widely used in clinical education. Usually, the students have to mark the one and only correct answer from a set of five alternatives. Here, in a voluntary exam at the end of an obligatory pharmacology exam, we tested a format in which more than one alternative could be correct (N=544 students from three year groups). Moreover, the students were asked to rate each item. The students were unaware how many correct answers each question contained. Finally, a questionnaire had to be filled out about the difficulty of the new test compared with the one-out-of-five test. In the obligatory final exam, all groups performed similarly. From the results, we conclude that the new rating scales posed a better challenge and could be adapted to assess student knowledge and confidence in more depth than previous multiple-choice questions.
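
The abstract does not give the scoring rule, so the following is only an assumed per-alternative scheme for such a "one or more correct" format: each of the five alternatives is judged true or false and scored independently, which removes the all-or-nothing character of one-out-of-five marking.

```python
# Hypothetical scoring for a multiple-true-false item; the key and the
# student's ratings are invented for illustration.
key =      [True, False, True, False, False]   # which alternatives are correct
response = [True, False, False, False, True]   # student's true/false judgments

score = sum(r == k for r, k in zip(response, key)) / len(key)
print(f"partial credit: {score:.2f}")  # 0.60: 3 of 5 alternatives judged correctly
```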


Author(s):  
Reggie Kwan ◽  
Kenneth Wong ◽  
Philip Tsang ◽  
Francis Yu

Using virtual and physical resources to enhance learning and teaching is the cornerstone of Hybrid Learning. This chapter deals with how an online assessment system, as part of a hybrid learning initiative, can be used for learning and not just assessment. The system is built on an item response theory (IRT) model. It helps teachers gauge the competency level of each individual student while providing students with feedback and an individualized study path immediately after each sequence of multiple choice questions they attempt.
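
The chapter does not publish its code, so this is a hedged sketch of the kind of IRT machinery such a system could use: maximum-likelihood ability estimation under the Rasch (one-parameter) model from a sequence of responses, followed by selecting the next question whose difficulty best matches the current ability estimate.

```python
import math

def p(theta, b):
    """Rasch probability that an examinee of ability theta answers an
    item of difficulty b correctly."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def estimate_theta(responses, steps=50):
    """responses: list of (difficulty, score 0/1).
    Newton-Raphson on the Rasch log-likelihood, which is concave in theta."""
    theta = 0.0
    for _ in range(steps):
        grad = sum(x - p(theta, b) for b, x in responses)
        hess = -sum(p(theta, b) * (1 - p(theta, b)) for b, _ in responses)
        theta -= grad / hess
    return theta

# Toy response pattern: correct on easier items, wrong on harder ones.
answered = [(-1.0, 1), (0.0, 1), (0.5, 0), (1.0, 0)]
theta = estimate_theta(answered)

# Pick the next item closest in difficulty to the ability estimate.
item_bank = [-0.5, 0.2, 0.8, 1.5]
next_item = min(item_bank, key=lambda b: abs(b - theta))
print(f"ability ~ {theta:.2f}, next item difficulty {next_item}")
```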


Informatica ◽  
2017 ◽  
Vol 28 (4) ◽  
pp. 609-628
Author(s):  
Ali Fahmi ◽  
Cengiz Kahraman
