CORRELATION OF DIFFICULTY INDEX AND DISCRIMINATING INDEX IN MEDICAL STUDENT’S ASSESSMENT.

2021, pp. 9-10
Author(s): Bhoomika R. Chauhan, Jayesh Vaza, Girish R. Chauhan, Pradip R. Chauhan

Multiple choice questions are nowadays used in competitive examinations and formative assessment to assess students' eligibility and certification. Item analysis is the process of collecting, summarizing and using information from students' responses to assess the quality of test items. The goal of the study was to identify the relationship between the item difficulty index and the item discriminating index in medical students' assessment. 400 final year medical students from various medical colleges responded to 200 items constructed for the study. The responses were assessed and analysed for item difficulty index and item discriminating power, and the two indices were then analysed by statistical methods to identify any correlation. The discriminating power of items with a difficulty index of 40%-50% was the highest. Summary and Conclusion: Items with a good difficulty index, in the range of 30%-70%, are good discriminators.
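As an illustration only (not taken from the study), a minimal Python sketch of how a per-item difficulty index, an upper/lower-group discrimination index, and the correlation between the two might be computed from a 0/1 response matrix. The array layout, the 27% group split and the percentage scaling are assumptions.

    import numpy as np

    def item_indices(responses, group_frac=0.27):
        # responses: 0/1 array, shape (n_students, n_items); 1 = correct answer
        responses = np.asarray(responses, dtype=float)
        n_students, _ = responses.shape
        totals = responses.sum(axis=1)                    # total score per student
        order = np.argsort(totals)                        # ranked low to high
        k = max(1, int(round(group_frac * n_students)))   # size of each group
        low, high = responses[order[:k]], responses[order[-k:]]
        difficulty = responses.mean(axis=0) * 100         # % answering correctly
        discrimination = high.mean(axis=0) - low.mean(axis=0)
        return difficulty, discrimination

    # Pearson correlation between the two indices across items:
    # diff, disc = item_indices(responses)
    # r = np.corrcoef(diff, disc)[0, 1]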

Author(s): Suryakar Vrushali Prabhunath, Surekha T. Nemade, Ganesh D. Ghuge

Introduction: Multiple Choice Questions (MCQs) are one of the most preferred tools of assessment in medical education, as part of both formative and summative assessment. The performance of MCQs as an assessment tool can be statistically analysed by item analysis. Thus, the aim of this study was to assess the quality of MCQs by item analysis and to identify valid test items for inclusion in the question bank for further use. Materials and methods: A formative assessment of Ist MBBS students was carried out with 40 MCQs as part of an internal examination in Biochemistry. Item analysis was done by calculating the difficulty index (P), the discrimination index (d) and the number of non-functional distractors. Results: The difficulty index (P) of 65% (26) of items was well within the acceptable range, 7.5% (3) of items were too difficult, and 27.5% (11) of items were too easy. The discrimination index (d) of 70% (28) of items fell in the recommended category, whereas 10% (4) of items had an acceptable and 20% (8) a poor discrimination index. Out of 120 distractors, 88.33% (106) were functional and 11.66% (14) were non-functional. After considering the difficulty index, discrimination index and distractor effectiveness, 42.5% (17) of items were found ideal for inclusion in the question bank. Conclusion: Item analysis remains an essential tool to be practised regularly to improve the quality of assessment methods, as well as a tool for obtaining feedback for instructors. Key Words: Difficulty index, Discrimination index, Item analysis, Multiple choice questions, Non-functional distractors
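A hedged sketch of how items might be binned by difficulty and discrimination and how non-functional distractors could be counted. The cut-offs used (30-70% difficulty acceptable, 0.15/0.25/0.35 discrimination bands, distractors chosen by fewer than 5% of examinees counted as non-functional) are commonly cited conventions and are assumptions here, not values confirmed by the abstract.

    def classify_item(p, d):
        # p: difficulty index in percent; d: discrimination index
        if p < 30:
            difficulty = "too difficult"
        elif p <= 70:
            difficulty = "acceptable"
        else:
            difficulty = "too easy"
        if d >= 0.35:
            discrimination = "excellent"
        elif d >= 0.25:
            discrimination = "good"
        elif d >= 0.15:
            discrimination = "acceptable"
        else:
            discrimination = "poor"
        return difficulty, discrimination

    def non_functional_distractors(option_counts, key, cutoff=0.05):
        # option_counts: dict mapping option label -> number of examinees choosing it
        total = sum(option_counts.values())
        return [opt for opt, n in option_counts.items()
                if opt != key and n / total < cutoff]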


Author(s): Richa Garg, Vikas Kumar, Jyoti Maria

Background: Assessment is a dominant motivator to direct and drive students' learning. Different methods of assessment are used to assess medical knowledge in undergraduate medical education. Multiple choice questions (MCQs) are being used increasingly due to their higher reliability, validity, and ease of scoring. Item analysis enables identifying good MCQs based on the difficulty index (DIF I), discrimination index (DI), and distractor efficiency (DE). Methods: Students of second year MBBS appeared in a formative assessment test comprising 50 "one best response type" MCQs of 50 marks, without negative marking. All MCQs had a single stem with four options: one correct answer and three incorrect alternatives (distractors). Three question paper sets were prepared by shuffling the sequence of questions, and one of the three sets was given to each student to avoid copying from neighbouring students. In total, 50 MCQs and 150 distractors were analysed, and indices such as DIF I, DI, and DE were calculated. Results: The total scores of 87 students ranged from 17 to 48 (out of 50). The mean difficulty index (DIF I) was 71.6±19.4%; 28% of MCQs were average and "recommended" (DIF I 30-70%). The mean discrimination index (DI) was 0.3±0.17; 16% of MCQs were "good" and 50% were "excellent", while the rest were "discard/poor" according to DI criteria. The mean distractor efficiency (DE) was 63.4±33.3%, and 90% of the items had a DE between 33% and 100%. MCQs with a lower difficulty index (<70) had higher distractor efficiency (93.8% vs. 6.2%, p=0.004). Conclusions: Item analysis provided the data needed to improve question formulation and helped in revising and improving the quality of both the items and the test. Questions with a lower difficulty index (<70) were significantly associated with a higher discrimination index (>0.15) and higher distractor efficiency.
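The reported association between lower difficulty index and higher distractor efficiency (p=0.004) is essentially a 2x2 comparison; a minimal sketch of how such a test could be run with scipy is shown below. The contingency-table layout and the counts are placeholders, and the abstract does not state which statistical procedure the authors actually used.

    from scipy.stats import chi2_contingency

    # Rows: items with DIF I < 70 vs >= 70; columns: higher vs lower DE.
    # The counts below are placeholders, not the study's data.
    table = [[30, 2],
             [5, 13]]
    chi2, p_value, dof, expected = chi2_contingency(table)
    print(f"chi-square = {chi2:.2f}, p = {p_value:.3f}")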


2014, Vol 2 (4), pp. 148
Author(s): Hamza Mohammad Abdulghani, Farah Ahmad, Abdulmajeed Aldrees, Mahmoud S. Khalil, Gominda G. Ponnamperuma

Author(s): Manju K. Nair, Dawnji S. R.

Background: Carefully constructed, high quality multiple choice questions can serve as effective tools to improve the standard of teaching. This item analysis was performed to find the difficulty index, discrimination index and number of non-functional distractors in single best response type questions. Methods: 40 single best response type questions with four options, each carrying one mark for the correct response, were taken for item analysis. There was no negative marking, and the maximum score was 40. Based on the scores, the evaluated answer scripts were arranged with the highest score on top and the lowest score at the bottom, and only the upper third and lower third were included. The response to each item was entered in Microsoft Excel 2010, and the difficulty index, discrimination index and number of non-functional distractors per item were calculated. Results: 40 multiple choice questions and 120 distractors were analysed in this study. 72.5% of items were good, with a difficulty index between 30% and 70%; 25% of items were difficult and 2.5% were easy. 27.5% of items showed excellent discrimination between high scoring and low scoring students. One item had a negative discrimination index (-0.1). There were 9 items with non-functional distractors. Conclusions: This study emphasises the need to improve the quality of multiple choice questions. Repeated evaluation by item analysis and modification of non-functional distractors may be performed to enhance the standard of teaching in Pharmacology.
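A sketch of the upper-third/lower-third tabulation the abstract describes, assuming 0/1 item scores: with H and L the numbers of correct responses in the upper and lower groups of size n, the difficulty index is (H + L) / (2n) x 100 and the discrimination index is (H - L) / n. Function and variable names are illustrative.

    import numpy as np

    def thirds_analysis(responses):
        # responses: 0/1 array, shape (n_students, n_items); 1 = correct answer
        m = np.asarray(responses, dtype=float)
        order = np.argsort(m.sum(axis=1))            # students ranked by total score
        n = m.shape[0] // 3                          # size of each third
        lower, upper = m[order[:n]], m[order[-n:]]
        H, L = upper.sum(axis=0), lower.sum(axis=0)
        difficulty_index = (H + L) / (2 * n) * 100   # percent
        discrimination_index = (H - L) / n
        return difficulty_index, discrimination_index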


Author(s): Abhijeet S. Ingale, Purushottam A. Giri, Mohan K. Doibale

Background: Item analysis is the process of collecting, summarizing and using information from students' responses to assess the quality of test items. However, it is said that MCQs emphasize recall of factual information rather than conceptual understanding and interpretation of concepts, and there is more to writing good MCQs than writing good questions. The objectives of the study were to assess the item and test quality of multiple choice questions, to address the learning difficulties of students, and to identify the low achievers in the test. Methods: One hundred MBBS students from a Government medical college were examined. A test comprising thirty MCQs was administered. All items were analysed for difficulty index, discrimination index and distractor efficiency. Data were entered in MS Excel 2007 and analysed in SPSS 21 with statistical tests of significance. Results: The difficulty index of the majority (80%) of items was within the acceptable range, 63% of items showed an excellent discrimination index, and distractor efficiency was overall satisfactory. Conclusions: Multiple choice questions of average difficulty that also have high discriminating power and good distractor efficiency should be incorporated into students' examinations.
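For the distractor efficiency figure reported here, a minimal sketch for a four-option item: DE is the share of the three distractors that are functional, i.e. chosen by at least 5% of examinees. The 5% cut-off, the four-option layout and the data structure are assumptions, since the abstract does not give the formula.

    def distractor_efficiency(option_counts, key, cutoff=0.05):
        # option_counts: dict mapping option label -> number of examinees choosing it
        total = sum(option_counts.values())
        distractors = {opt: n for opt, n in option_counts.items() if opt != key}
        functional = sum(1 for n in distractors.values() if n / total >= cutoff)
        return functional / len(distractors) * 100   # DE as a percentage

    # Example: distractor 'D' drawing only 2 of 100 responses is non-functional,
    # so 2 of 3 distractors are functional and DE = 66.7%.
    print(distractor_efficiency({'A': 20, 'B': 60, 'C': 18, 'D': 2}, key='B'))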


2020, Vol 10 (2)
Author(s): Imtiaz Uddin, Iftikhar Uddin, Izaz Ur Rehman, Muhammad Siyar, Usman Mehboob

Background: MCQ-type assessment in medical education is replacing the old theory-style examination, but there are concerns regarding the quality of the multiple choice questions. Objectives: To determine the quality of multiple choice questions by item analysis. Material and Methods: The study was cross-sectional and descriptive. Fifty multiple choice questions from the 2015 final internal evaluation exams in Pharmacology at Bacha Khan Medical College were analyzed. The quality of each MCQ item was assessed by the difficulty index (Dif.I), discrimination index (D.I) and distractor efficiency (D.E). Results: 66% of the MCQs were of moderate difficulty, 4% were easy and 30% were highly difficult. The reasons for the highly difficult MCQs were analyzed as item writing flaws (41%), irrelevant difficulty (36%) and C2 level (23%). The discrimination index showed that the majority of MCQs, 52%, were at the excellent level (DI greater than 0.25), 32% were good (DI = 0.15-0.25) and 16% were poor. MCQs with a distractor effectiveness (DE) of 4, 3, 2 and 1 were 52%, 34%, 14% and 0%, respectively. Conclusion: Item analysis gives us different parameters, with reasons, to recheck the MCQ pool and the teaching programme. A high proportion of difficult MCQs and a sizable number with poor discriminative indices were the findings of this study and need to be addressed.
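If DE here denotes the number of functional distractors per item (an assumption; the abstract does not define it), a tally across items might look like the following sketch, again using the conventional >=5% response cut-off for a functional distractor.

    from collections import Counter

    def tally_functional_distractors(items, cutoff=0.05):
        # items: list of (option_counts, key) pairs, one per MCQ
        tally = Counter()
        for option_counts, key in items:
            total = sum(option_counts.values())
            functional = sum(1 for opt, n in option_counts.items()
                             if opt != key and n / total >= cutoff)
            tally[functional] += 1
        return tally   # e.g. Counter({4: 26, 3: 17, 2: 7})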


2018, Vol 18 (1), pp. 68
Author(s): Deena Kheyami, Ahmed Jaradat, Tareq Al-Shibani, Fuad A. Ali

Objectives: The current study aimed to carry out a post-validation item analysis of multiple choice questions (MCQs) in medical examinations in order to evaluate correlations between item difficulty, item discrimination and distractor effectiveness, so as to determine whether questions should be included, modified or discarded. In addition, the optimal number of options per MCQ was analysed. Methods: This cross-sectional study was performed in the Department of Paediatrics, Arabian Gulf University, Manama, Bahrain. A total of 800 MCQs and 4,000 distractors were analysed between November 2013 and June 2016. Results: The mean difficulty index ranged from 36.70% to 73.14%, the mean discrimination index from 0.20 to 0.34, and the mean distractor efficiency from 66.50% to 90.00%. Of the items, 48.4%, 35.3%, 11.4%, 3.9% and 1.1% had zero, one, two, three and four non-functional distractors (NFDs), respectively. Using three or four rather than five options in each MCQ resulted in 95% or 83.6% of items having zero NFDs, respectively. The distractor efficiency was 91.87%, 85.83% and 64.13% for difficult, acceptable and easy items, respectively (P <0.005). Distractor efficiency was 83.33%, 83.24% and 77.56% for items with excellent, acceptable and poor discrimination, respectively (P <0.005). The average Kuder-Richardson formula 20 reliability coefficient was 0.76. Conclusion: A considerable number of the MCQ items were within acceptable ranges. However, some items needed to be discarded or revised. Using three or four rather than five options in MCQs is recommended to reduce the number of NFDs and improve the overall quality of the examination.
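The reported reliability coefficient is Kuder-Richardson formula 20 (KR-20), which applies to dichotomously (0/1) scored items. A sketch of its computation from a response matrix like the earlier examples is below; using the sample variance (ddof=1) of total scores is one common convention and is an assumption here.

    import numpy as np

    def kr20(responses):
        # responses: 0/1 array, shape (n_students, n_items); 1 = correct answer
        x = np.asarray(responses, dtype=float)
        k = x.shape[1]                              # number of items
        p = x.mean(axis=0)                          # proportion correct per item
        q = 1.0 - p
        total_variance = x.sum(axis=1).var(ddof=1)  # variance of total test scores
        return (k / (k - 1)) * (1 - (p * q).sum() / total_variance)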

