Application of a utility analysis to evaluate a novel assessment tool for clinically oriented physiology and pharmacology

2016
Vol 40 (3)
pp. 304-312
Author(s):  
Nicholas Cramer ◽  
Abdo Asmar ◽  
Laurel Gorman ◽  
Bernard Gros ◽  
David Harris ◽  
...  

Multiple-choice questions are a gold-standard tool in medical school for assessing knowledge and are the mainstay of licensing examinations. However, multiple-choice items can be criticized for their limited ability to test higher-order learning or integrative thinking across multiple disciplines. Our objective was to develop a novel assessment that would address understanding of pathophysiology and pharmacology, evaluate learning at the levels of application, evaluation, and synthesis, and allow students to demonstrate clinical reasoning. The rubric assesses student write-ups of clinical case problems. The method is based on the physician's traditional postencounter Subjective, Objective, Assessment and Plan note. Students were required to correctly identify subjective and objective findings in authentic clinical case problems, to ascribe pathophysiological as well as pharmacological mechanisms to these findings, and to justify a list of differential diagnoses. A utility analysis was undertaken to evaluate the new assessment tool by appraising its reliability, validity, feasibility, cost effectiveness, acceptability, and educational impact using a mixed-methods approach. The Subjective, Objective, Assessment and Plan assessment tool scored highly in terms of validity and educational impact and had acceptable levels of statistical reliability, but it was limited in terms of acceptance, feasibility, and cost effectiveness due to high time demands on expert graders and workload concerns from students. We conclude by making suggestions for improving the tool and recommend deploying the instrument for low-stakes summative assessment or formative assessment.
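The abstract reports acceptable statistical reliability without naming the coefficient used. As a purely illustrative sketch, the snippet below computes Cronbach's alpha over a matrix of rubric-item scores; the function name, the four-item rubric, and all score values are hypothetical and not taken from the study.

import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for a (write-ups x rubric items) score matrix."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                          # number of rubric items
    item_var = scores.var(axis=0, ddof=1)        # variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of the total scores
    return (k / (k - 1)) * (1.0 - item_var.sum() / total_var)

# Hypothetical scores for 5 write-ups on a 4-item rubric (not study data)
ratings = np.array([[3, 4, 3, 4],
                    [2, 2, 3, 2],
                    [4, 4, 4, 5],
                    [3, 3, 2, 3],
                    [5, 4, 5, 4]])
print(f"Cronbach's alpha = {cronbach_alpha(ratings):.2f}")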

Author(s):  
Sri G. Thrumurthy ◽  
Tania Samantha De Silva ◽  
Zia Moinuddin ◽  
Stuart Enoch

Specifically designed to help candidates revise for the MRCS exam, this book features 350 Single Best Answer multiple-choice questions covering the whole syllabus. Containing everything candidates need to pass the MRCS Part A SBA section of the exam, it focuses intensively on the application of basic sciences (applied surgical anatomy, physiology, and pathology) to the management of surgical patients. The high level of detail included within the questions and their explanations allows effective self-assessment of knowledge and quick identification of key areas requiring further attention. Varying approaches to Single Best Answer multiple-choice questions are used, giving effective exam practice and guidance on revision and exam technique. These include clinical case questions; 'positively worded' questions, requiring selection of the most appropriate of several relatively correct answers; 'two-step' or 'double-jump' questions, requiring several cognitive steps to arrive at the correct answer; and 'factual recall' questions, prompting basic recall of facts.


Author(s):  
Ademir Garcia Reberti ◽  
Nayme Hechem Monfredini ◽  
Olavo Franco Ferreira Filho ◽  
Dalton Francisco de Andrade ◽  
Carlos Eduardo Andrade Pinheiro ◽  
...  

Abstract: The Progress Test is an objective assessment, consisting of 60 to 150 multiple-choice questions, designed to assess the cognitive skills expected at the end of the undergraduate course. The test is applied to all students on the same day, so that results can be compared across grades and the development of knowledge throughout the course can be analyzed. This study aimed to carry out a systematic literature review on the Progress Test in medical schools in Brazil and around the world, examining the benefits of its implementation for the learning of the student, the teacher, and the institution. The study was carried out from July 2018 to April 2019 and addressed articles published from January 2002 to March 2019. The keywords used were "Progress Test in Medical Schools" and "Item Response Theory in Medicine" in the PubMed, SciELO, and Lilacs platforms. There was no language limitation in article selection, but the search was carried out in English. A total of 192,026 articles were identified, and after applying advanced search filters, 11 articles were included in the study. The Progress Test (PTMed) has been applied in medical schools, either alone or in groups of partner schools, since the late 1990s. The test results build the students' performance curves, which allow weaknesses and strengths to be identified across the several areas of knowledge related to the course. The Progress Test is not exclusively an instrument for assessing student performance; it is also important as an assessment tool for academic management, and it is therefore crucial that institutions take an active role in preparing and analyzing this assessment data. Assessments designed to test clinical competence in medical students need to be valid and reliable; for the evaluative method to be valid, the subject must be extensively reviewed and studied, aiming at improvements and adjustments in test performance.
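The review's keyword "Item Response Theory in Medicine" points to the IRT models typically used to build Progress Test performance curves. The sketch below assumes the simplest one-parameter (Rasch) model, since the review does not specify which model the included studies used; it shows how the probability of a correct answer depends on student ability and item difficulty, with all values hypothetical.

import numpy as np

def rasch_probability(theta, b):
    """One-parameter (Rasch) IRT model: P(correct | ability theta, item difficulty b)."""
    return 1.0 / (1.0 + np.exp(-(theta - b)))

theta = np.linspace(-3, 3, 7)            # hypothetical ability levels (logit scale)
for difficulty in (-1.0, 0.0, 1.5):      # three illustrative item difficulties
    print(f"b = {difficulty:+.1f}:", np.round(rasch_probability(theta, difficulty), 2))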


2019
Vol 5 (1)
pp. e000495
Author(s):  
Danielle L Cummings ◽  
Matthew Smith ◽  
Brian Merrigan ◽  
Jeffrey Leggit

Background: Musculoskeletal (MSK) complaints comprise a large proportion of outpatient visits. However, multiple studies show that medical school curricula often fail to adequately prepare graduates to diagnose and manage common MSK problems. Current standardised exams inadequately assess trainees' MSK knowledge, and other MSK-specific exams, such as Freedman and Bernstein's (1998) exam, have limitations in implementation. We propose a new 30-question multiple-choice exam for graduating medical students and primary care residents. Results highlight individual deficiencies and identify areas for curriculum improvement. Methods/Results: We developed a bank of multiple-choice questions based on 10 critical topics in MSK medicine. The questions were validated with subject-matter experts (SMEs) using a modified Delphi method to obtain consensus on the importance of each question. Based on the SME input, we compiled 30 questions into the assessment. In the large-scale pilot test (167 post-clerkship medical students), the average score was 74% (range 53%-90%, SD 7.8%). In addition, detailed explanations and references were created for each question to allow an individual or group to review and enhance learning. Summary: The proposed MSK30 exam evaluates clinically important topics and offers an assessment tool for the clinical MSK knowledge of medical students and residents. It fills a gap in current curricula and improves on previous MSK-specific assessments through better clinical relevance and consistent grading. Educators can use the results of the exam to guide curriculum development and individual education.
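The pilot statistics (mean 74%, range 53%-90%, SD 7.8%) are standard descriptive summaries. A minimal sketch of how such figures would be computed from raw scores is given below; the score vector is hypothetical, not the actual data from the 167 students.

import numpy as np

# Hypothetical fraction-correct scores on the 30-item exam;
# the real pilot data (n = 167) are not reproduced here.
scores = np.array([0.70, 0.77, 0.53, 0.83, 0.74, 0.90, 0.67, 0.80])
print(f"mean  = {scores.mean():.1%}")
print(f"range = {scores.min():.0%} - {scores.max():.0%}")
print(f"SD    = {scores.std(ddof=1):.1%}")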


2012
Vol 11 (1)
pp. 47-57
Author(s):  
Joyce M. Parker ◽  
Charles W. Anderson ◽  
Merle Heidemann ◽  
John Merrill ◽  
Brett Merritt ◽  
...  

We present a diagnostic question cluster (DQC) that assesses undergraduates' thinking about photosynthesis. This assessment tool is not designed to identify individual misconceptions. Rather, it is focused on students' abilities to apply basic concepts about photosynthesis by reasoning with a coordinated set of practices based on a few scientific principles: conservation of matter, conservation of energy, and the hierarchical nature of biological systems. Data on students' responses to the cluster items and uses of some of the questions in multiple-choice, multiple-true/false, and essay formats are compared. A cross-over study indicates that the multiple-true/false format shows promise as a machine-gradable format that identifies students who have a mixture of accurate and inaccurate ideas. In addition, interviews with students about their choices on three multiple-choice questions reveal the fragility of students' understanding. Collectively, the data show that many undergraduates lack both a basic understanding of the role of photosynthesis in plant metabolism and the ability to reason with scientific principles when learning new content. Implications for instruction are discussed.


2021
pp. 160-171
Author(s):  
Iryna Lenchuk ◽  
Amer Ahmed

This article describes the results of Action Research conducted in an ESP classroom of Dhofar University, located in Oman. Following the call of Oman Vision 2040 to emphasize educational practices that promote the development of higher-order cognitive processes, this study raises the following question: can an online multiple-choice question (MCQ) quiz tap into the higher-order cognitive skills of apply, analyze, and evaluate? This question was also critical at the time of the COVID-19 pandemic, when Omani universities switched to the online learning mode. The researchers administered an online MCQ quiz to 35 undergraduate students enrolled in an ESP course for Engineering and Sciences. The results showed that MCQ quizzes can be developed to tap into higher-order thinking skills when the stem of the MCQ is written as a task or a scenario. The study also revealed that students performed better on MCQs that tap into low-level cognitive skills. This result can be attributed to the prevalent practice in Oman of developing assessment tools that tap only into the level of Bloom's taxonomy involving the cognitive process of retrieving memorized information. The significance of the study lies in its pedagogical applications. The study calls for the use of teaching and assessment practices that target the development of higher-order thinking skills, aligned with the country's strategic direction reflected in Oman Vision 2040.


Author(s):  
Dwi Milla Mufida ◽  
Dwi Astuti ◽  
Neli Purwani

Evaluation is the process of collecting data to determine whether there are educational goals that have not been achieved, and the test is one assessment instrument in education. In terms of construction, tests are divided into two types: standardized tests and non-standardized tests. French teachers must know how to write good-quality tests; a good-quality test is one with a sound structure and short, clear instructions. This research aims to describe: 1) the quality of the programmed formative tests written by French teachers for class X in high schools in Semarang in the 2016/2017 academic year, and 2) the types of multiple-choice questions and short open-answer questions used. The data of this research are the questions of the programmed formative tests written by French teachers for class X in high schools in Semarang. This is a qualitative descriptive study, and the data were collected through documentation. The multiple-choice questions were analyzed with respect to subject matter, construction, and language, while the short open-answer questions were analyzed against general test-construction rules. The results show that the quality of the multiple-choice and short open-answer questions in the programmed formative tests written by the teachers for class X in high schools in Semarang in the 2016/2017 academic year falls within the 'good' criterion.


2021
Vol 2 (1)
pp. 123-131
Author(s):  
Salha U. Amil

This study investigated perceptions of multiple-choice questions (MCQs), their challenges, and their implications among Grade 12 senior high school students of Mindanao State University-Sulu. A descriptive survey method was used. Random sampling was used to select 100 Grade 12 students, 50 from the GAS strand and 50 from the STEM strand. The researcher prepared a survey questionnaire to obtain the necessary data. Means were used to analyze the perceived challenges and implications, while an independent-samples t-test was used to test the hypotheses at an alpha level of 0.05. The highlights of the study were as follows: the multiple-choice question is a test format used only as an assessment tool during quarterly examinations, and MCQs were found challenging foremost because of the need to answer critically under time pressure; consequently, students need to manage their time in order to answer every subject. The challenge of MCQs also lies in the difficulty of the subject. The results further revealed that MCQs tend to push students to answer each subject under limited time, so that they experience test anxiety; there is therefore a need to look into the level of difficulty of the subject matter, which requires rigorous student preparation before the exam. Moreover, MCQs develop students' analysis in problem solving, give them real-time experience of taking major examinations, and develop time management in studying and strategies in answering, as they encourage students to answer with persistence and resilience. The study recommended that the school administration encourage the applicability of this test format in other colleges of the institution that require licensure examinations, set up programs and interventions for the improvement of this endeavour, and create further research studies.
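The analysis named here is an independent-samples t-test at alpha = 0.05, presumably comparing the GAS and STEM strands. A minimal sketch of that test is shown below, assuming SciPy is available; the Likert ratings are hypothetical and the grouping by strand is an assumption for illustration, not data from the study.

import numpy as np
from scipy import stats

ALPHA = 0.05
# Hypothetical 5-point Likert ratings of perceived MCQ challenge per strand
gas_ratings  = np.array([4, 3, 5, 4, 4, 3, 5, 4, 3, 4])
stem_ratings = np.array([3, 4, 4, 3, 4, 3, 3, 4, 4, 3])
t_stat, p_value = stats.ttest_ind(gas_ratings, stem_ratings)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
print("Reject H0" if p_value < ALPHA else "Fail to reject H0")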


2021
Vol 77
pp. S85-S89
Author(s):  
Dharmendra Kumar ◽  
Raksha Jaipurkar ◽  
Atul Shekhar ◽  
Gaurav Sikri ◽  
V. Srinivas

2021
Vol 2 (1)
pp. 48-56
Author(s):  
Yavuz Selim Kıyak ◽  
Işıl İrem Budakoğlu ◽  
Serdar Kula ◽  
Özlem Coşkun

This study introduces ContExtended Questions (CEQ), a tool for both teaching and assessing clinical reasoning, particularly in the preclinical years, together with the web-based program used to implement it. CEQ consists of text-based, case-based multiple-choice questions that provide patient data in a fixed and predetermined sequence. It enables examinees to develop and reshape their illness scripts by using feedback after every question; the feedback operates to transform the examinee's failure into a “productive failure”. The preliminary results of the randomized controlled experiment on teaching clinical reasoning to preclinical students through CEQ are quite satisfactory. In the medical education literature, this would be the first time that students with no or very limited clinical experience developed their illness scripts simply by taking formative multiple-choice tests; the approach could be termed “test-only learning”. The complete results of the experiment, and then further experiments in other contexts and domains, are necessary to establish a more powerful assessment tool and software. Furthermore, by changing the content of the questions, it is possible to use CEQ in every period of medical education and health professions education.
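The abstract describes the tool only at a high level: patient data revealed in a fixed, predetermined sequence, with feedback after every question. A minimal, hypothetical sketch of such a question sequence is given below; the CEQItem structure and run_ceq function are illustrative assumptions, not the authors' software.

from dataclasses import dataclass

@dataclass
class CEQItem:
    case_update: str       # new patient data revealed at this step
    stem: str              # the multiple-choice question
    options: list          # answer options
    correct: int           # index of the keyed option
    feedback: str          # shown after every answer, right or wrong

def run_ceq(items, answers):
    """Walk the fixed, predetermined sequence, giving feedback after each question."""
    score = 0
    for item, answer in zip(items, answers):
        print(item.case_update)
        print(item.stem)
        score += int(answer == item.correct)
        print("Feedback:", item.feedback)   # feedback regardless of correctness
    return score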

