Pass/Fail Scoring of USMLE Step 1 and the Need for Residency Selection Reform

2020 ◽  
pp. 019459982095116
Author(s):  
Parsa P. Salehi ◽  
Babak Azizzadeh ◽  
Yan Ho Lee

The Federation of State Medical Boards and the National Board of Medical Examiners recently announced a change in the United States Medical Licensing Examination Step 1 scoring convention to take effect, at the earliest, on January 1, 2022. There are many reasons for this change, including decreasing medical student stress and incentivizing students to learn freely rather than focusing solely on Step 1 performance. The question remains how this will affect the future of the otolaryngology–head and neck surgery match. With numeric Step 1 scores eliminated, other factors, such as research, may take on increased importance in the application process. Such a shift may discriminate against students from less well-known medical schools, international medical graduates, and students from low socioeconomic backgrounds, who often have fewer academic resources and less access to research. Residency programs should anticipate such unintended consequences of the change and work on solutions heading into 2022.

2017 ◽  
Vol 41 (1) ◽  
pp. 149-153 ◽  
Author(s):  
Steven A. Haist ◽  
Agata P. Butler ◽  
Miguel A. Paniagua

The aim of this review is to highlight recent and potential future enhancements to the United States Medical Licensing Examination (USMLE) program. The USMLE program is co-owned by the National Board of Medical Examiners (NBME) and the Federation of State Medical Boards. The USMLE includes four examinations: Step 1, Step 2 Clinical Knowledge, Step 2 Clinical Skills, and Step 3; every graduate of a Liaison Committee on Medical Education-accredited allopathic medical school and every international medical graduate must pass this examination series to practice medicine in the United States. From 2006 to 2009, the program underwent an in-depth review that resulted in five accepted recommendations. These recommendations have been the primary driver of many recent enhancements, such as an increased emphasis on foundational science and changes in the clinical skills examination, including more advanced assessment of communication skills. They will continue to inform future changes, such as access to references (e.g., a map of metabolic pathways) or decision-making tools for use during the examination. The NBME also provides assessment services globally to medical schools, students, residency programs, and residents. In 2015, more than 550,000 assessments were provided through the subject examination program, NBME self-assessment services, and customized assessment services.


2021 ◽  
Vol 0 (0) ◽  
Author(s):  
Brandon Craig ◽  
Xiaolin Wang ◽  
Jeanne Sandella ◽  
Tsung-Hsun Tsai ◽  
David Kuo ◽  
...  

Abstract Context The Comprehensive Osteopathic Medical Licensing Examination of the United States of America (COMLEX-USA) is a three-level examination used as a pathway to licensure for students in osteopathic medical education programs. COMLEX-USA Level 2 includes a written assessment of Fundamental Clinical Sciences for Osteopathic Medical Practice (Level 2-Cognitive Evaluation [L2-CE]) delivered in a computer-based format and a separate performance evaluation (Level 2-Performance Evaluation [L2-PE]) administered through live encounters with standardized patients. L2-PE was designed to augment L2-CE, and the two examinations are expected to measure related yet distinct constructs. Objectives To explore the concurrent validity of L2-CE with L2-PE. Methods First-attempt test scores were obtained from the National Board of Osteopathic Medical Examiners database for 6,639 candidates who took L2-CE between June 2019 and May 2020 and were matched to the candidates' L2-PE scores. The sample represented all colleges of osteopathic medicine and 97.5% of candidates who took L2-CE during the complete 2019–2020 test cycle. We calculated disattenuated correlations between the total score for L2-CE, the L2-CE scores for the seven competency domains (CD1 through CD7), and the L2-PE scores for the Humanistic Domain (HM) and Biomedical/Biomechanical Domain (BM). All scores were on continuous scales. Results Pearson correlations ranged from 0.10 to 0.88 and were all statistically significant (p<0.01). The L2-CE total score was most strongly correlated with CD2 (0.88) and CD3 (0.85). Pearson correlations between the L2-CE competency domain subscores ranged from 0.17 to 0.70, and correlations involving either HM or BM ranged from 0.10 to 0.34, the strongest being between BM and the L2-CE total score (0.34) and between HM and BM (0.28). The largest increase between corresponding Pearson and disattenuated correlations occurred for pairs of scores with lower reliabilities, such as CD5 and CD6, which had a Pearson correlation of 0.17 and a disattenuated correlation of 0.68. The smallest increase was observed for pairs of scores with larger reliabilities, such as the L2-CE total score and HM, which had a Pearson correlation of 0.23 and a disattenuated correlation of 0.28. The reliability of L2-CE was 0.87; the reliabilities of HM and BM were 0.81 and 0.73, respectively, and the reliabilities of the L2-CE competency domain scores ranged from 0.22 to 0.74. The small to moderate correlations between the L2-CE total score and the two L2-PE domain scores support the expectation that these examinations measure related but distinct constructs. The correlations between L2-PE and the L2-CE competency domain subscores reflect the distribution of items defined by the L2-PE blueprint, providing evidence that the examinations are performing as designed. Conclusions This study provides evidence supporting the validity of the blueprints for constructing the COMLEX-USA Level 2-CE and Level 2-PE examinations in concert with the purpose and nature of the examinations.
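The disattenuated correlations reported above follow the standard Spearman correction for attenuation, which divides an observed Pearson correlation by the square root of the product of the two scores' reliabilities. A minimal sketch, assuming that standard formula rather than the authors' actual code, approximately reproduces the L2-CE total vs. HM figure from the abstract:

```python
# Minimal sketch (not the authors' code): Spearman's correction for attenuation.
import math

def disattenuated_correlation(r_xy: float, rel_x: float, rel_y: float) -> float:
    """Divide the observed Pearson r by the geometric mean of the two reliabilities."""
    return r_xy / math.sqrt(rel_x * rel_y)

# Values reported in the abstract: L2-CE total vs. HM,
# observed r = 0.23, reliabilities 0.87 and 0.81.
r_corrected = disattenuated_correlation(0.23, 0.87, 0.81)
print(round(r_corrected, 2))  # ~0.27, close to the reported 0.28 (inputs are rounded)
```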


2014 ◽  
Vol 100 (1) ◽  
pp. 9-14
Author(s):  
Frances E. Cain ◽  
Phil Davignon ◽  
Thomas R. Henzel ◽  
Andrea Ciccone ◽  
Aaron Young

ABSTRACT State medical boards have long recognized the importance of evaluating the ongoing knowledge and competence of licensed physicians under a variety of circumstances. State boards may need to evaluate physicians before granting or renewing a license, as part of a disciplinary process, or following a period of inactivity for either disciplinary or non-disciplinary reasons. The Post-Licensure Assessment System (PLAS), a joint program of the Federation of State Medical Boards (FSMB) and the National Board of Medical Examiners (NBME), has assisted state boards in evaluating physicians' basic medical knowledge in all of these circumstances by providing the Special Purpose Examination (SPEX). Although SPEX has been administered since 1988, no nationally published study has summarized the characteristics of physicians taking the exam or their pass rates. To address this gap, we examined the characteristics and examination outcomes of physicians who took SPEX between 2003 and 2011. Our research demonstrates that the majority of examinees take SPEX for non-disciplinary reasons and that those who take it for disciplinary reasons have lower pass rates. Future research should focus on evaluating the ultimate outcomes for physicians taking SPEX, including their ability to attain and retain a license to practice medicine.


Author(s):  
Malcolm M MacFarlane

This paper explores the marginalization experienced by International Medical Graduates (IMGs) in the Canadian Residency Matching Service (CaRMS) Match. This marginalization occurs despite all IMGs being Canadian citizens or permanent residents and having objectively demonstrated competence equivalent to that expected of a graduate of a Canadian medical school through examinations such as the MCCQE1 and the National Assessment Collaboration OSCE. The paper examines how the current CaRMS Match works, evidence of marginalization, and the ethnicity and human rights implications of the current CaRMS system. A brief history of postgraduate medical education and the residency selection process is provided, along with a brief legal analysis of the authority for making CaRMS eligibility decisions. Current CaRMS practices are situated in the context of provincial fairness legislation, and rationalizations and rationales for the current CaRMS system are explored. The paper examines objective indicators of IMG competence, as well as relevant legislation regarding international credential recognition and labour mobility. The issues are placed in the context of current immigration and education policies and best practices. An international perspective is provided through comparison with the United States National Resident Matching Program. Suggestions are offered for changes to the current CaRMS system to bring the process more in line with legislation and current Canadian values, such that "A Canadian is a Canadian."


2018 ◽  
Vol 7 (9) ◽  
pp. 31-36
Author(s):  
Raja Subhiyah

The National Board of Medical Examiners (NBME®) applies rigorous standards to the examinations taken by candidates in the United States Medical Licensing Examination (USMLE®) program, which is used for licensing physicians in the United States. These standards apply to all stages of exam development, administration, and scoring, and cover the following processes:
• Validity of score inferences: testing appropriate content and asking the right questions; item format and test design; the process by which the test is developed; and documentation.
• Accuracy of scores: reliability, test targeting, information at the cut score, and standard errors of measurement.
• Determination and application of cut scores: standard-setting methods, the modified Angoff procedure, and misclassification error.
The main focus is on the last standard, although the first two are also briefly discussed. Different methods for establishing a passing standard are discussed, and the method used for the USMLE is described. Misclassification errors are also presented, along with ways to minimize them.
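To make the standard-setting discussion concrete, below is a minimal sketch of a generic modified Angoff computation. The panel ratings are hypothetical and this is not the NBME's actual procedure: each judge estimates the probability that a minimally competent examinee answers each item correctly, and the recommended cut score is the average of the judges' summed expectations.

```python
# Minimal sketch of a generic modified Angoff calculation
# (hypothetical ratings; not the NBME's actual procedure or data).
from statistics import mean

# ratings[judge][item] = judged probability that a minimally competent
# examinee answers the item correctly
ratings = [
    [0.70, 0.55, 0.80, 0.60],  # judge 1
    [0.65, 0.60, 0.75, 0.70],  # judge 2
    [0.75, 0.50, 0.85, 0.65],  # judge 3
]

expected_scores = [sum(judge) for judge in ratings]  # expected raw score per judge
cut_score = mean(expected_scores)                    # panel's recommended raw cut score
print(f"Recommended raw cut score: {cut_score:.2f} out of {len(ratings[0])} items")
```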


2020 ◽  
Vol 12 (02) ◽  
pp. e251-e254
Author(s):  
Saif A. Hamdan ◽  
Alan T. Makhoul ◽  
Brian C. Drolet ◽  
Jennifer L. Lindsey ◽  
Janice C. Law

Abstract Background It was recently announced that scoring for the United States Medical Licensing Examination (USMLE) Step 1 will be reported as pass/fail as early as 2022. The general perception among program directors (PDs) across specialties has largely been negative, but the perspective within ophthalmology remains uncharacterized. Objective This article characterizes ophthalmology residency PDs' perspectives regarding the impact of pass/fail USMLE Step 1 scoring on the residency application process. Methods A validated 19-item anonymous survey was electronically distributed to 111 PDs of Accreditation Council for Graduate Medical Education-accredited ophthalmology training programs. Results Fifty-six PDs (50.5%) completed the survey. The median age of respondents was 48 years, and the majority were male (71.4%); the average tenure as PD was 7.1 years. Only 6 (10.7%) PDs reported that the change of USMLE Step 1 to pass/fail was a good idea. Most PDs (92.9%) indicated that the change will make it more difficult to objectively compare applicants, and many (69.6%) did not agree that it would improve medical student well-being. The majority (82.1%) indicated that there will be an increased emphasis on Step 2 Clinical Knowledge (CK) scores, and many (70.4%) felt that medical school reputation will become more important in application decisions. Conclusion Most ophthalmology PDs who responded to the survey do not support binary Step 1 scoring. Many raised concerns about an overemphasis shifting to Step 2 CK, an uncertain impact on student well-being, and the potential to disadvantage certain groups of medical students, including international medical graduates. These concerns highlight the need for reform in the ophthalmology application process.


PRiMER ◽  
2021 ◽  
Vol 5 ◽  
Author(s):  
Eron Drake ◽  
Julie P. Phillips ◽  
Iris Kovar-Gough

Introduction: The United States Medical Licensing Examination (USMLE) Step 1 will transition to a pass/fail format in 2022. This change is likely to result in an increased focus on Step 2 Clinical Knowledge (CK) scores, so academic advisors must provide evidence-based guidance to help students prepare. While prior research has examined the utility of academic indicators for predicting student performance on the USMLE exams, no significant scholarly effort has described or evaluated students' study approaches. The goal of this study was to understand what strategies and resources students used when preparing for the Step 2 CK exam and to investigate the relationships between these approaches and performance. Methods: Students at a single US medical school were surveyed about their Step 2 CK preparation. We analyzed self-reported exam preparation strategies and the use of specific resources to determine their relationship with Step 2 CK score. Results: Student performance on Step 2 CK was correlated with performance on previous exams, including school-specific examinations, National Board of Medical Examiners clerkship shelf exams, and Step 1. Two study strategies were positively correlated with Step 2 CK score in preliminary analyses: completing more practice questions and completing a larger proportion of a question bank. In hierarchical regression, only the number of practice questions completed remained predictive after controlling for demographic variables and Step 1 performance. Conclusions: Faculty and staff can optimize students' Step 2 CK performance by encouraging them to work through case-based, clinically focused questions. Further study is needed to better describe optimal preparation strategies.
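As an illustration of the hierarchical approach described (entering demographics and Step 1 performance first, then the study-strategy predictor), here is a minimal sketch on simulated data; the variable names and data are hypothetical, not the authors' dataset or code.

```python
# Minimal sketch of a hierarchical (nested-model) regression on simulated data;
# variable names and values are hypothetical, not the study's dataset.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "step1": rng.normal(230, 15, n),
    "age": rng.normal(26, 2, n),
    "questions_completed": rng.normal(2500, 800, n),
})
df["step2ck"] = 40 + 0.7 * df["step1"] + 0.005 * df["questions_completed"] + rng.normal(0, 8, n)

block1 = smf.ols("step2ck ~ step1 + age", data=df).fit()                        # controls only
block2 = smf.ols("step2ck ~ step1 + age + questions_completed", data=df).fit()  # add strategy
f_stat, p_value, df_diff = block2.compare_f_test(block1)  # test the added predictor
print(f"Delta R^2 = {block2.rsquared - block1.rsquared:.3f}, F = {f_stat:.2f}, p = {p_value:.4f}")
```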

