A 5-Year Study of Emergency Medicine Intern Objective Structured Clinical Examination (OSCE) Performance Does Not Correlate With Emergency Medicine Faculty Evaluation of Resident Performance

2013 · Vol 62 (5) · pp. S181
Author(s): R. Shih, M. Silverman, C. Mayer

2013 · Vol 5 (4) · pp. 582-586
Author(s): James G. Ryan, David Barlas, Simcha Pollack

Abstract

Background: Medical knowledge (MK) in residents is commonly assessed by the in-training examination (ITE) and by faculty evaluations of resident performance.

Objective: We assessed the reliability of faculty clinical evaluations of residents and the relationship between faculty assessments of resident performance and ITE scores.

Methods: We conducted a cross-sectional, observational study at an academic emergency department with a postgraduate year (PGY)-1 to PGY-3 emergency medicine residency program, comparing summative quarterly faculty evaluation data for MK and overall clinical competency (OC) with annual ITE scores, accounting for PGY level. We also assessed the reliability of faculty evaluations with a random-effects intraclass correlation analysis.

Results: We analyzed data for 59 emergency medicine residents over a 6-year period. Faculty evaluations of MK and OC were highly reliable (κ = 0.99) and remained reliable after stratification by year of training (mean κ = 0.68–0.84). Assessments of resident performance (MK and OC) and ITE scores increased with PGY level. MK and OC correlated highly with PGY level, and ITE scores correlated moderately with PGY level. OC and MK correlated moderately with ITE score. When residents were grouped by PGY level, there was no significant correlation between faculty-assessed MK and ITE score.

Conclusions: Resident clinical performance and ITE scores both increase with PGY level, but ITE scores do not predict a resident's clinical performance relative to peers at the same PGY level.
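The abstract describes two statistical steps: a random-effects intraclass correlation to establish the reliability of faculty ratings, and correlations between faculty ratings and ITE scores, pooled and within PGY level. The sketch below illustrates that analysis shape only; the data are simulated, the resident count, rating scale, and number of raters are hypothetical, and the one-way ICC(1) formula used here is just one common choice for a random-effects reliability estimate, not necessarily the authors' exact procedure.

```python
# Illustrative sketch on simulated data; it does not reproduce the study's results.
import numpy as np
import pandas as pd
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical structure: 59 residents, each rated by 8 faculty members.
n_residents, n_raters = 59, 8
pgy = rng.integers(1, 4, size=n_residents)                    # PGY-1 to PGY-3
true_mk = 3.0 + 0.8 * pgy + rng.normal(0, 0.3, n_residents)   # ability rises with PGY
ratings = true_mk[:, None] + rng.normal(0, 0.4, (n_residents, n_raters))
ite = 60 + 5 * pgy + rng.normal(0, 6, n_residents)            # ITE score, also rises with PGY

# One-way random-effects ICC(1): (MSB - MSW) / (MSB + (k - 1) * MSW)
k = n_raters
grand_mean = ratings.mean()
msb = k * ((ratings.mean(axis=1) - grand_mean) ** 2).sum() / (n_residents - 1)
msw = ((ratings - ratings.mean(axis=1, keepdims=True)) ** 2).sum() / (n_residents * (k - 1))
icc = (msb - msw) / (msb + (k - 1) * msw)
print(f"ICC(1) for faculty MK ratings: {icc:.2f}")

# Correlate mean faculty MK rating with ITE score, pooled and within each PGY level.
df = pd.DataFrame({"pgy": pgy, "mk": ratings.mean(axis=1), "ite": ite})
rho, p = stats.spearmanr(df["mk"], df["ite"])
print(f"Pooled: rho={rho:.2f}, p={p:.3f}")
for level, grp in df.groupby("pgy"):
    rho, p = stats.spearmanr(grp["mk"], grp["ite"])
    # Within a single training year the correlation is typically much weaker.
    print(f"PGY-{level}: rho={rho:.2f}, p={p:.3f}")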

