Program Director Perceptions of Proficiency in the Core Entrustable Professional Activities

2017 · Vol 9 (5) · pp. 588-592
Author(s): R. Ellen Pearlman, Melissa Pawelczak, Andrew C. Yacht, Salaahuddin Akbar, Gino A. Farina

ABSTRACT
Background: The Association of American Medical Colleges describes 13 core entrustable professional activities (EPAs) that every graduating medical student should be expected to perform proficiently on day 1 of residency, regardless of chosen specialty. Studies have shown wide variability in program director (PD) confidence in interns' abilities to perform these core EPAs. Little is known about how United States Medical Licensing Examination (USMLE) scores compare with proficiency in EPAs.
Objective: We determined whether PDs from a large health system felt confident in their postgraduate year 1 residents' abilities to perform the 13 core EPAs, and compared perceived EPA proficiency with USMLE Step 1 and Step 2 scores.
Methods: The PDs were asked to rate their residents' proficiency in each EPA and to provide residents' USMLE scores. Timing coincided with the reporting period for resident milestones.
Results: Surveys were completed on 204 of 328 residents (62%). PDs reported that 69% of residents (140 of 204) were prepared for EPA 4 (orders/prescriptions), 61% (117 of 192) for EPA 7 (form clinical questions), 68% (135 of 198) for EPA 8 (handovers), 63% (116 of 185) for EPA 11 (consent), and 38% (49 of 129) for EPA 13 (patient safety). EPA ratings and USMLE Step 1 and Step 2 scores were negatively correlated (r(101) = −0.23, P = .031).
Conclusions: PDs felt that a significant percentage of residents were not adequately prepared in order writing, forming clinical questions, handoffs, informed consent, and promoting a culture of patient safety. We found no positive association between USMLE scores and EPA ratings.

2018 · Vol 10 (3)
Author(s): Benjamin Valley, Christopher Camp, Brian Grawe

Admission to orthopaedic surgery residency is a highly competitive process. Traditionally, objective measures such as United States Medical Licensing Examination (USMLE) Step 1 score, class rank, and AOA status have been major determinants in the ranking process. However, these traditional objective measures show mixed correlation with clinical success in orthopaedic surgery residency. Several studies have examined cognitive factors and their correlation with success in residency. However, success in residency clearly requires more than cognitive ability: it demands complex social interactions that are influenced by non-cognitive variables such as personality and work ethic. This review aims to summarize the current understanding of non-cognitive factors influencing performance in orthopaedic surgical residency.


2016 · Vol 8 (3) · pp. 358-363
Author(s): Jeanne M. Sandella, John R. Gimpel, Larissa L. Smith, John R. Boulet

ABSTRACT
Background: The Comprehensive Osteopathic Medical Licensing Examination (COMLEX-USA) and the United States Medical Licensing Examination (USMLE) are recognized by all state medical licensing boards in the United States, and the Federation of State Medical Boards has supported the validity of both examinations for medical licensure. Many osteopathic medical students take both examinations.
Objective: The purpose of this study was to investigate performance on COMLEX-USA Level 1 and USMLE Step 1 of students from colleges of osteopathic medicine where the majority of students took both examinations.
Methods: Data were collected on the entering classes of 2010 and 2011. Relationships between the COMLEX-USA Level 1 and the USMLE Step 1 were quantified using Pearson correlations. The correlation between pass/fail outcomes on the 2 examinations was evaluated using the phi coefficient. A contingency table was constructed to look at first-attempt outcomes (pass/fail).
Results: Data for 2010 and 2011 were collected from 3 osteopathic medical schools, with 795 of 914 students (87%) taking both examinations. The correlation between first-attempt COMLEX-USA Level 1 and USMLE Step 1 scores was statistically significant across and within all 3 schools. The overall correlation was r(795) = 0.84 (P < .001). Pass/fail status on the 2 examinations was moderately correlated (ϕ = 0.39, P < .01).
Conclusions: Our study found a strong association between COMLEX-USA Level 1 and USMLE Step 1 performance. Additional studies to accurately compare scores on these examinations are warranted.
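The phi coefficient reported above (ϕ = 0.39) is computed from a 2 × 2 pass/fail contingency table. A minimal sketch in Python; the counts used below are hypothetical for illustration, since the study's table is not reproduced here:

```python
import math

def phi_coefficient(a, b, c, d):
    """Phi coefficient for a 2x2 contingency table laid out as:

                     pass USMLE   fail USMLE
    pass COMLEX-USA      a            b
    fail COMLEX-USA      c            d
    """
    numerator = a * d - b * c
    denominator = math.sqrt((a + b) * (c + d) * (a + c) * (b + d))
    return numerator / denominator

# Hypothetical counts, for illustration only
phi = phi_coefficient(700, 30, 25, 40)
```

A phi of 1.0 would mean pass/fail status on one examination perfectly predicts status on the other; the study's 0.39 indicates only moderate agreement despite the strong score correlation.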


10.2196/20182 · 2020 · Vol 6 (2) · pp. e20182
Author(s): Benjamin Liu

In recent years, US medical students have been increasingly absent from medical school classrooms. They skip class to maximize their competitiveness for a good residency program by achieving high scores on the United States Medical Licensing Examination (USMLE) Step 1. As a US medical student, I know that most of these class-skipping students are using external learning resources, which are perceived to be more efficient than traditional lectures. Now that the USMLE Step 1 is adopting a pass/fail grading system, it may be tempting to expect students to return to traditional basic science lectures. Unfortunately, my experience tells me this will not happen. Instead, US medical schools must adapt their curricula. These new curricula should focus on clinical decision making, team-based learning, and new medical decision technologies, while leveraging the validated ability of external resources to teach the basic sciences. In doing so, faculty will not only increase student engagement but also modernize the curricula to meet new standards for effective medical learning.


2020 · Vol 12 (02) · pp. e251-e254
Author(s): Saif A. Hamdan, Alan T. Makhoul, Brian C. Drolet, Jennifer L. Lindsey, Janice C. Law

Abstract
Background: Scoring for the United States Medical Licensing Examination (USMLE) Step 1 was recently announced to be reported as binary as early as 2022. The general perception among program directors (PDs) in all specialties has largely been negative, but the perspective within ophthalmology remains uncharacterized.
Objective: This article characterizes ophthalmology residency PDs' perspectives regarding the impact of pass/fail USMLE Step 1 scoring on the residency application process.
Methods: A validated 19-item anonymous survey was electronically distributed to 111 PDs of Accreditation Council for Graduate Medical Education-accredited ophthalmology training programs.
Results: Fifty-six PDs (50.5%) completed the survey. The median age of respondents was 48 years, and the majority were male (71.4%); the average tenure as PD was 7.1 years. Only 6 (10.7%) PDs reported that the change of the USMLE Step 1 to pass/fail was a good idea. Most PDs (92.9%) indicated that this will make it more difficult to objectively compare applicants, and many (69.6%) did not agree that the change would improve medical student well-being. The majority (82.1%) indicated that there will be an increased emphasis on Step 2 Clinical Knowledge (CK) scores, and many (70.4%) felt that medical school reputation will be more important in application decisions.
Conclusion: Most ophthalmology PDs who responded to the survey do not support binary Step 1 scoring. Many raised concerns regarding a shifted overemphasis on Step 2 CK, the uncertain impact on student well-being, and the potential to disadvantage certain groups of medical students, including international medical graduates. These concerns highlight the need for reform in the ophthalmology application process.




2012 · Vol 102 (6) · pp. 517-528
Author(s): Anthony V. D’Antoni, Anthony C. DiLandro, Eileen D. Chusid, Michael J. Trepal

Background: In 2010, the New York College of Podiatric Medicine general anatomy course was redesigned to emphasize clinical anatomy. Over a 2-year period, United States Medical Licensing Examination (USMLE)-style items were used in lecture assessments with two cohorts of students (N = 200). Items were single-best-answer and extended-matching formats. Psychometric properties of items and assessments were evaluated, and anonymous student post-course surveys were administered.
Methods: Mean grades for each assessment were recorded over time and compared between cohorts using analysis of variance. Correlational analyses were used to investigate the relationship between final course grades and lecture examinations. Post-course survey response rates for the cohorts were 71 of 97 (73%) and 81 of 103 (79%).
Results: The USMLE-style items had strong psychometric properties. Point biserial correlations were 0.20 and greater, and the range of students answering the items correctly was 25% to 75%. Examinations were highly reliable, with Kuder-Richardson 20 coefficients of 0.71 to 0.76. Students (>80%) reported that single-best-answer items were easier than extended-matching items. Students (>76%) believed that the items on the quizzes/examinations were similar to those found on USMLE Step 1. Most students (>84%) believed that they would do well on the anatomy section of their boards (American Podiatric Medical Licensing Examination [APMLE] Part I).
Conclusions: Students valued USMLE-style items. These data, coupled with the psychometric data, suggest that USMLE-style items can be successfully incorporated into a basic science course in podiatric medical education. Outcomes from students who recently took the APMLE Part I suggest that incorporation of USMLE-style items into the general anatomy course was a successful measure and prepared them well. (J Am Podiatr Med Assoc 102(6): 517-528, 2012)
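The Kuder-Richardson 20 (KR-20) reliability reported above (0.71 to 0.76) is calculated from dichotomous (0/1) item scores. A minimal sketch of the formula, using a small made-up response matrix for illustration (not the study's data):

```python
def kr20(responses):
    """KR-20 reliability for dichotomous item responses.

    responses: one row per examinee, each row a list of 0/1 item scores.
    KR-20 = (k / (k - 1)) * (1 - sum(p_i * q_i) / var_total)
    where p_i is the proportion answering item i correctly, q_i = 1 - p_i,
    and var_total is the (population) variance of examinees' total scores.
    """
    k = len(responses[0])  # number of items
    n = len(responses)     # number of examinees
    # Sum of p*q over items
    pq_sum = 0.0
    for i in range(k):
        p = sum(row[i] for row in responses) / n
        pq_sum += p * (1 - p)
    # Population variance of total scores
    totals = [sum(row) for row in responses]
    mean = sum(totals) / n
    var_total = sum((t - mean) ** 2 for t in totals) / n
    return (k / (k - 1)) * (1 - pq_sum / var_total)

# Made-up responses: 4 examinees x 3 items, illustration only
scores = [[1, 1, 1], [1, 1, 0], [1, 0, 0], [0, 0, 0]]
reliability = kr20(scores)  # → 0.75
```

Values above roughly 0.7, as in the study, are conventionally taken to indicate acceptable internal consistency for classroom examinations.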


2018 · Vol 66 (3) · pp. 237-243
Author(s): Laura M. Wagner, Mary A. Dolansky, Robert Englander

2015 · Vol 7 (4) · pp. 610-616
Author(s): Mei Liang, Laurie S. Curtin, Mona M. Signer, Maria C. Savoia

ABSTRACT
Background: Over the past decade, the number of unfilled positions in the National Resident Matching Program (NRMP) Main Residency Match has declined by one-third, while the number of unmatched applicants has grown by more than 50%, largely due to a rise in the number of international medical school students and graduates (IMGs). Although only half of IMG participants historically have matched to a first-year position, the Match experiences of unmatched IMGs have not been studied.
Objective: We examined differences in interview and ranking behaviors between matched and unmatched IMGs participating in the 2013 Match and explored strategic errors made by unmatched IMGs when creating rank order lists.
Methods: Rank order lists of IMGs who failed to match were analyzed in conjunction with their United States Medical Licensing Examination (USMLE) Step 1 scores and responses on the 2013 NRMP Applicant Survey. IMGs were categorized as “strong,” “solid,” “marginal,” or “weak” based on the perceived competitiveness of their USMLE Step 1 scores compared to other IMG applicants who matched in the same specialty. We examined ranking preferences and strategies by Match outcome.
Results: Most unmatched IMGs were categorized as “marginal” or “weak.” However, unmatched IMGs who were non-US citizens presented more competitive USMLE Step 1 scores compared with unmatched IMGs who were US citizens. Unmatched IMGs were more likely than matched IMGs to rank programs at which they did not interview and to rank programs based on their perceived likelihood of matching.
Conclusions: The interview and ranking behaviors of IMGs can have far-reaching consequences on their Match experience and outcomes.


2014 · Vol 6 (2) · pp. 280-283
Author(s): Albert S. Lee, Lynn Chang, Eric Feng, Scott Helf

Abstract
Background: The Comprehensive Osteopathic Medical Licensing Examination of the United States (COMLEX-USA) Level 1 and United States Medical Licensing Examination (USMLE) Step 1 scores are important factors in the selection of medical students into US residency programs.
Objectives: The goals of this study were to investigate the correlation between the COMLEX-USA Level 1 and the USMLE Step 1 and to assess the accuracy of existing formulas in predicting USMLE scores from COMLEX-USA scores.
Methods: A retrospective study of 1016 paired COMLEX-USA Level 1 and USMLE Step 1 scores was conducted. Formulas by Sarko et al and by Slocum and Louder were used to estimate USMLE Step 1 scores from COMLEX-USA Level 1 scores, and a paired t test between calculated USMLE Step 1 scores and actual USMLE Step 1 scores was performed.
Results: During 2006-2012, 1016 of 1440 students (71%) at the College of Osteopathic Medicine of the Pacific took both the USMLE Step 1 and the COMLEX-USA Level 1. The USMLE Step 1 scores were higher than those predicted by Slocum and Louder and by Sarko et al by an average of 14.16 ± 11.69 (P < .001) and 7.80 ± 12.48 (P < .001), respectively. A Pearson coefficient of 0.83 was observed. Regression analysis yielded the following formula: USMLE Step 1 = 0.2392 × COMLEX-USA Level 1 + 82.563 (R² = 0.69577).
Conclusions: The USMLE Step 1 scores, on average, were higher than those predicted by the formulas derived by Slocum and Louder and by Sarko et al. Residency program directors should use caution when using formulas to derive USMLE Step 1 scores from COMLEX-USA Level 1 scores.
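The regression formula reported above can be applied directly to estimate a USMLE Step 1 score from a COMLEX-USA Level 1 score. A minimal sketch; the input score of 600 is an arbitrary illustration, and any such conversion carries the caution the authors note:

```python
def predict_usmle_step1(comlex_level1):
    """Estimate USMLE Step 1 from COMLEX-USA Level 1 using the regression
    reported in this study (R^2 = 0.69577, Pearson r = 0.83)."""
    return 0.2392 * comlex_level1 + 82.563

# Arbitrary example input: a COMLEX-USA Level 1 score of 600
estimate = round(predict_usmle_step1(600))  # → 226
```

With roughly 70% of variance explained, individual predictions still carry substantial error, which is the study's point: program directors should not treat converted scores as equivalent to actual USMLE results.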

