Examining concurrent validity between COMLEX-USA Level 2-Cognitive Evaluation and COMLEX-USA Level 2-Performance Evaluation

2021 ◽  
Vol 0 (0) ◽  
Author(s):  
Brandon Craig ◽  
Xiaolin Wang ◽  
Jeanne Sandella ◽  
Tsung-Hsun Tsai ◽  
David Kuo ◽  
...  

Abstract Context The Comprehensive Osteopathic Medical Licensing Examination of the United States of America (COMLEX-USA) is a three-level examination used as a pathway to licensure for students in osteopathic medical education programs. COMLEX-USA Level 2 includes a written assessment of Fundamental Clinical Sciences for Osteopathic Medical Practice (Level 2-Cognitive Evaluation [L2-CE]) delivered in a computer-based format and a separate performance evaluation (Level 2-Performance Evaluation [L2-PE]) administered through live encounters with standardized patients. L2-PE was designed to augment L2-CE, and the two examinations are expected to measure related yet distinct constructs. Objectives To explore the concurrent validity of L2-CE with L2-PE. Methods First-attempt test scores were obtained from the National Board of Osteopathic Medical Examiners database for 6,639 candidates who took L2-CE between June 2019 and May 2020 and were matched to the students’ L2-PE scores. The sample represented all colleges of osteopathic medicine and 97.5% of candidates who took L2-CE during the complete 2019–2020 test cycle. We calculated disattenuated correlations between the total score for L2-CE, the L2-CE scores for the seven competency domains (CD1 through CD7), and the L2-PE scores for the Humanistic Domain (HM) and Biomedical/Biomechanical Domain (BM). All scores were on continuous scales. Results Pearson correlations ranged from 0.10 to 0.88 and were all statistically significant (p<0.01). L2-CE total score was most strongly correlated with CD2 (0.88) and CD3 (0.85). Pearson correlations between the L2-CE competency domain subscores ranged from 0.17 to 0.70, and correlations that included either HM or BM ranged from 0.10 to 0.34, with the strongest of those being between BM and L2-CE total score (0.34) and between HM and BM (0.28). The largest increase between corresponding Pearson and disattenuated correlations was for pairs of scores with lower reliabilities, such as CD5 and CD6, which had a Pearson correlation of 0.17 and a disattenuated correlation of 0.68. The smallest increase was observed for pairs of scores with higher reliabilities, such as L2-CE total score and HM, which had a Pearson correlation of 0.23 and a disattenuated correlation of 0.28. The reliability was 0.87 for L2-CE, 0.81 for HM, and 0.73 for BM; the reliabilities for the L2-CE competency domain scores ranged from 0.22 to 0.74. The small to moderate correlations between the L2-CE total score and the two L2-PE domain scores support the expectation that these examinations measure related but distinct constructs. The correlations between the L2-PE and L2-CE competency domain subscores reflect the distribution of items defined by the L2-PE blueprint, providing evidence that the examinations are performing as designed. Conclusions This study provides evidence supporting the validity of the blueprints used to construct the COMLEX-USA Level 2-CE and Level 2-PE examinations, in concert with the purpose and nature of the examinations.
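The disattenuated correlations reported above follow Spearman's correction for attenuation: the observed Pearson correlation is divided by the square root of the product of the two scores' reliabilities. A minimal sketch reproducing the L2-CE total/HM figures from the abstract (the function name is illustrative, not from the study):

```python
from math import sqrt

def disattenuate(r_xy: float, rel_x: float, rel_y: float) -> float:
    """Spearman's correction for attenuation: estimate the true-score
    correlation from an observed correlation and the two reliabilities."""
    return r_xy / sqrt(rel_x * rel_y)

# Values reported in the abstract: L2-CE total score (reliability 0.87)
# and the L2-PE Humanistic Domain (reliability 0.81) correlate 0.23.
print(round(disattenuate(0.23, 0.87, 0.81), 2))
# ~0.27 with these rounded inputs; the abstract reports 0.28
```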

2018 ◽  
Vol 7 (9) ◽  
pp. 31-36
Author(s):  
Raja Subhiyah

The National Board of Medical Examiners (NBME®) applies rigorous standards to the examinations candidates take for the United States Medical Licensing Examination (USMLE®), which is used for the purpose of licensing physicians in the United States. These standards apply to all stages of examination development, administration, and scoring, and cover the following processes:
• Validity of inferences from the items: the content tested must be appropriate (asking the correct questions), the item format, the test design, the process by which the test is constructed, and its documentation.
• Accuracy of scores: reliability, targeting, information at the cut score, and standard errors.
• Determination and application of cut points: methods, the modified Angoff procedure, and misclassification error.
The main focus is on the last standard, although the first two are also briefly discussed. Different methods of establishing a passing standard are discussed, and the method used for the USMLE is described. Misclassification errors are also presented, along with ways to minimize them.
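For readers unfamiliar with the modified Angoff procedure mentioned above, the core idea is that each judge estimates, for every item, the probability that a minimally competent candidate answers it correctly; the recommended cut score is the sum of those estimates averaged across judges. A minimal sketch with hypothetical ratings (not NBME data; real applications add discussion rounds and impact data):

```python
# Hypothetical Angoff-style ratings: one probability estimate per item,
# per judge, for a minimally competent candidate.
judge_ratings = [
    [0.70, 0.55, 0.80, 0.60],  # judge 1
    [0.65, 0.60, 0.75, 0.55],  # judge 2
    [0.75, 0.50, 0.85, 0.65],  # judge 3
]

n_judges = len(judge_ratings)
n_items = len(judge_ratings[0])

# Average each item's estimates across judges, then sum over items.
item_means = [sum(judge[i] for judge in judge_ratings) / n_judges
              for i in range(n_items)]
cut_score = sum(item_means)
print(f"Recommended cut score: {cut_score:.2f} of {n_items} items")
```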


2020 ◽  
pp. 019459982095116
Author(s):  
Parsa P. Salehi ◽  
Babak Azizzadeh ◽  
Yan Ho Lee

The Federation of State Medical Boards and the National Board of Medical Examiners recently announced a change in the United States Medical Licensing Examination Step 1 scoring convention to take effect, at the earliest, on January 1, 2022. There are many reasons for this change, including decreasing medical student stress and incentivizing students to learn freely without solely focusing on Step 1 performance. The question remains how this will affect the future of the otolaryngology–head and neck surgery match. By eradicating Step 1 grades, other factors, such as research, may garner increased importance in the application process. Such a shift may discriminate against students from less well-known medical schools, international medical graduates, and students from low socioeconomic backgrounds, who have fewer academic resources and access to research. Residency programs should try to anticipate such unintended consequences of the change and work on solutions heading into 2022.


PRiMER ◽  
2021 ◽  
Vol 5 ◽  
Author(s):  
Eron Drake ◽  
Julie P. Phillips ◽  
Iris Kovar-Gough

Introduction: The United States Medical Licensing Examination (USMLE) Step 1 will transition to a pass-fail format in 2022. This is likely to result in an increased focus on Step 2 Clinical Knowledge (CK) scores. Thus, academic advisors must provide evidence-based guidance to help students prepare. While prior research has examined the utility of academic indicators for predicting student performance on the USMLE exams, no significant scholarly effort has described or evaluated students' study approaches. The research study's goal was to understand what strategies and resources students used when preparing for the Step 2 CK exam and to investigate the relationship(s) between these approaches and performance. Methods: Students at a single US medical school were surveyed about their Step 2 CK preparation. We analyzed self-reported exam preparation strategies and the use of specific resources to determine their relationship with Step 2 CK score. Results: Student performance on Step 2 CK was correlated with performance on previous exams, including school-specific examinations, National Board of Medical Examiners clerkship shelf exams, and Step 1. Two study strategies were positively correlated with Step 2 CK score in preliminary analyses: working through more practice questions, and the proportion of a question bank completed. In hierarchical regression, only working through more practice questions remained predictive after controlling for demographic variables and Step 1 performance. Conclusions: Faculty and staff can optimize students' Step 2 CK performance by encouraging them to work through case-based, clinically focused questions. Further study is needed to better describe optimal preparation strategies.
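The hierarchical regression described in the Results can be pictured as two nested models: demographic variables and Step 1 score enter first, the study-strategy predictor is added second, and the gain in explained variance is tested. A sketch using statsmodels with hypothetical file and column names (not the study's actual variables):

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical survey data; the file and columns are illustrative only.
df = pd.read_csv("step2ck_prep_survey.csv")

# Block 1: control variables (demographics and Step 1 score).
base = smf.ols("step2ck_score ~ age + C(gender) + step1_score", data=df).fit()

# Block 2: add the study-strategy predictor (practice questions completed).
full = smf.ols(
    "step2ck_score ~ age + C(gender) + step1_score + practice_questions",
    data=df,
).fit()

# Incremental R-squared and the F-test comparing the nested models.
print(f"Delta R^2 = {full.rsquared - base.rsquared:.3f}")
print(full.compare_f_test(base))  # (F statistic, p-value, df difference)
```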


2018 ◽  
Vol 104 (3) ◽  
pp. 11-18
Author(s):  
Dorothy T. Horber ◽  
John R. Gimpel

ABSTRACT To ensure the Comprehensive Osteopathic Medical Licensing Examination of the United States of America (COMLEX-USA) remains relevant and current in meeting the needs of the state licensing boards and other constituents, the National Board of Osteopathic Medical Examiners (NBOME) has developed a new blueprint for an enhanced, competency-based examination program to be implemented with the COMLEX-USA Level 3 examination in late 2018. This article summarizes the evidence-based design processes on which the new blueprint is built, how it differs from the previous blueprint, and the evidence supporting its validity for the primary and intended purpose of COMLEX-USA — osteopathic physician licensure. It concludes with the changes being implemented by the NBOME to ensure COMLEX-USA remains current and meets the needs of its stakeholders, the state licensing boards.


1993 ◽  
Vol 264 (6) ◽  
pp. S11 ◽  
Author(s):  
R G Carroll

Step 1 of the United States Medical Licensing Examination has now replaced the National Board of Medical Examiners part 1 examination as the initial step in medical licensure. The new examination format places an increased emphasis on clinical vignettes to test basic physiological concepts. Suggestions from a question-writing workshop were incorporated into cardiovascular items in a midcourse physiology examination. Item analysis indicated that the overall examination difficulty was not different from examinations administered in the preceding years. Within the examination, however, the vignette questions had significantly higher discrimination indexes than other question types. The slope of the difficulty-to-discrimination relationship was significantly higher in 1992 than in previous years. This means that questions of equivalent difficulty were more discriminating in 1992 than in previous years. Finally, this report suggests a method of identifying questions with both high discrimination and high difficulty index for the development of an (informal) test item bank. The clinical vignette question types are a valuable method of examining basic physiological principles.
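The difficulty and discrimination indexes used in item analyses like this one are commonly computed as the proportion of examinees answering an item correctly and the item's point-biserial correlation with the rest-of-test score. A minimal sketch on a hypothetical 0/1 response matrix (not the 1992 examination data):

```python
import numpy as np

# Hypothetical scored responses: rows are examinees, columns are items.
responses = np.random.default_rng(0).integers(0, 2, size=(200, 40))
total = responses.sum(axis=1)

# Difficulty: proportion correct per item (the classical p-value).
difficulty = responses.mean(axis=0)

# Discrimination: point-biserial correlation of each item with the total
# computed from the remaining items (avoids item-total overlap).
discrimination = np.array([
    np.corrcoef(responses[:, i], total - responses[:, i])[0, 1]
    for i in range(responses.shape[1])
])

# Flag items that are both difficult and discriminating, as the report
# suggests for seeding an informal item bank (thresholds are arbitrary).
candidates = np.flatnonzero((difficulty < 0.6) & (discrimination > 0.3))
print(candidates)
```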


2009 ◽  
Vol 95 (2) ◽  
pp. 22-29
Author(s):  
David B. Swanson ◽  
Katherine Z. Holtzman ◽  
David A. Johnson

ABSTRACT Developing test content for the USMLE involves significant effort from the physician volunteers and staff associated with the program. The bedrock of this process takes place in the test materials development committees (TMDCs), where physicians and content experts write multiple-choice questions for all three USMLE Steps. Ongoing assessment of the item pool by the respective Step Committees initiates item-writing assignments that bolster or maintain content in specific areas. Staff at the National Board of Medical Examiners (NBME) then assist item writers to ensure a consistent style and structure for all USMLE test items. All test materials are crafted to complement an overall examination blueprint. Multiple levels of review and pretesting ensure that all test items making their way onto examination forms as live or 'scored' material are appropriate, statistically sound, and presented in test forms balanced to be consistent with the content outline and examination blueprint.
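One way to picture the final assembly step described above is a check that a draft form's content distribution matches the blueprint's target percentages within a tolerance. A hypothetical sketch, not the NBME's actual tooling or targets:

```python
from collections import Counter

# Hypothetical blueprint targets: proportion of the form per content area.
blueprint = {"cardiovascular": 0.15, "respiratory": 0.10, "renal": 0.10,
             "behavioral": 0.10, "other": 0.55}

# Content tags of the items selected for a draft form (illustrative).
draft_form = (["cardiovascular"] * 30 + ["respiratory"] * 20 +
              ["renal"] * 18 + ["behavioral"] * 22 + ["other"] * 110)

counts = Counter(draft_form)
n_items = len(draft_form)
tolerance = 0.02  # allow +/- 2 percentage points per content area

for area, target in blueprint.items():
    actual = counts[area] / n_items
    flag = "" if abs(actual - target) <= tolerance else "  <-- out of range"
    print(f"{area:15s} target {target:.0%}  actual {actual:.0%}{flag}")
```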

