Branding and Recruitment: A Primer for Residency Program Leadership

2018
Vol 10 (3)
pp. 249-252
Author(s):  
Eric Shappell ◽  
Nahzinine Shakeri ◽  
Abra Fant ◽  
Jeremy Branzetti ◽  
Michael Gisondi ◽  
...  
2020
Vol 95 (9)
pp. 1428-1434
Author(s):  
Liselotte N. Dyrbye ◽  
Andrea N. Leep Hunderfund ◽  
Richard C. Winters ◽  
Susan M. Moeschler ◽  
Brianna E. Vaa Stelling ◽  
...  

2019
Vol 19 (1)
Author(s):  
Amy R. Schwartz ◽  
Mark D. Siegel ◽  
Alfred Ian Lee

Abstract Background The Accreditation Council for Graduate Medical Education requires each residency program to have a Program Evaluation Committee (PEC) but does not specify how the PEC should be designed. We sought to develop a PEC that promotes resident leadership and provides actionable feedback. Methods Participants were residents and faculty in the Traditional Internal Medicine residency program at Yale School of Medicine (YSM). One resident and one faculty member facilitated a 1-h structured group discussion to obtain resident feedback on each rotation. PEC co-facilitators summarized the feedback in written form, then met with faculty Firm Chiefs overseeing each rotation and with residency program leadership to discuss feedback and generate action plans. This PEC process was implemented in all inpatient and outpatient rotations over a 4-year period. Upon conclusion of the second and fourth years of the PEC initiative, surveys were sent to faculty Firm Chiefs to assess their perceptions regarding the utility of the PEC format in comparison to other, more traditional forms of programmatic feedback. PEC residents and faculty were also surveyed about their experiences as PEC participants. Results The PEC process identified many common themes across inpatient and ambulatory rotations. Positives included a high caliber of teaching by faculty, highly diverse and educational patient care experiences, and a strong emphasis on interdisciplinary care. Areas for improvement included educational curricula on various rotations, interactions between medical and non-medical services, technological issues, and workflow problems. In survey assessments, PEC members viewed the PEC process as a rewarding mentorship experience that provided residents with an opportunity to engage in quality improvement and improve facilitation skills. 
Firm Chiefs were more likely to review and make rotation changes in response to PEC feedback than to traditional written resident evaluations, but preferred to receive both forms of feedback rather than either alone. Conclusions The PEC process at YSM has transformed our program’s approach to feedback delivery by engaging residents in the feedback process and providing them with mentored quality improvement and leadership experiences while generating actionable feedback for program-wide change. This has led to PEC groups evaluating additional aspects of residency education.


2020
Vol 20 (1)
Author(s):  
Alexis E. Pelletier-Bui ◽  
Caitlin Schrepel ◽  
Liza Smith ◽  
Xiao Chi Zhang ◽  
Adam Kellogg ◽  
...  

Abstract Background The objective of this study was to determine the advising and emergency medicine (EM) residency selection practices for special population applicant groups for whom traditional advice may not apply. Methods A survey was distributed on the Council of Residency Directors in EM and Clerkship Directors in EM Academy listservs. Multiple choice, Likert-type scale, and fill-in-the-blank questions addressed the average EM applicant and special population groups (osteopathic; international medical graduate (IMG); couples; at-risk; re-applicant; dual-accreditation applicant; and military). Percentages and 95% confidence intervals [CI] were calculated. Results One hundred four surveys were completed. Of respondents involved in the interview process, 2 or more standardized letters of evaluation (SLOEs) were recommended for osteopathic (90.1% [95% CI 84–96]), IMG (82.5% [73–92]), dual-accreditation (46% [19–73]), and average applicants (48.5% [39–58]). Recommendations for numbers of residency applications to submit were 21–30 (50.5% [40.7–60.3]) for the average applicant, 31–40 (41.6% [31.3–51.8]) for osteopathic, and > 50 (50.9% [37.5–64.4]) for IMG. For below-average Step 1 performance, 56.0% [46.3–65.7] were more likely to interview with an average Step 2 score. 88.1% [81.8–94.4] will consider matching an EM-EM couple. The majority were more likely to interview a military applicant with similar competitiveness to a traditional applicant. Respondents felt the best option for re-applicants was to pursue the Supplemental Offer and Acceptance Program (SOAP) for a preliminary residency position. Conclusion Advising and residency selection practices for special population applicants differ from those of traditional EM applicants. These data serve as an important foundation for advising these distinct applicant groups in ways that were previously only speculative. While respondents agree on many advising recommendations, outliers exist.
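The survey proportions above are reported with 95% confidence intervals. The abstract does not state which interval method the authors used, so the following is only a minimal sketch assuming a normal-approximation (Wald) interval on a proportion; the counts are made up, not taken from the study:

```python
import math

def wald_ci(successes, n, z=1.96):
    """Normal-approximation (Wald) confidence interval for a proportion.

    z=1.96 corresponds to a 95% interval. This is an illustrative
    assumption; the study may have used a different CI method.
    """
    p = successes / n
    half_width = z * math.sqrt(p * (1 - p) / n)
    return p - half_width, p + half_width

# Hypothetical example: 50 of 104 respondents endorsing a recommendation.
low, high = wald_ci(50, 104)
```

With small subgroup denominators (e.g. the dual-accreditation group), intervals widen substantially, which is consistent with the broad ranges reported above.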


2012
Vol 9 (4)
pp. 275-278
Author(s):  
Grant R. Webber ◽  
Mark E. Mullins ◽  
Zhengjia Chen ◽  
Carolyn C. Meltzer

Author(s):  
Sarah L. Hilgenberg ◽  
Mary Pat Frintner ◽  
Rebecca L. Blankenburg ◽  
Hilary M. Haftel ◽  
Caren E. Gellin

2020
Vol 12 (02)
pp. e234-e238
Author(s):  
Isdin Oke ◽  
Steven D. Ness ◽  
Jean E. Ramsey ◽  
Nicole H. Siegel ◽  
Crandall E. Peeler

Abstract Introduction Residency programs receive an institutional keyword report following the annual Ophthalmic Knowledge Assessment Program (OKAP) examination containing the raw number of incorrectly answered questions. Programs would benefit from a method to compare relative performance between subspecialty sections. We propose a technique of normalizing the keyword report to determine relative subspecialty strengths and weaknesses in trainee performance. Methods We retrospectively reviewed our institutional keyword reports from 2017 to 2019. We normalized the percentage of correctly answered questions for each postgraduate year (PGY) level by dividing the percent of correctly answered questions for each subspecialty by the percent correct across all subsections for that PGY level. We repeated this calculation for each PGY level in each subsection for each calendar year of analysis. Results There was a statistically significant difference in mean performance between the subspecialty sections (p = 0.038). We found above average performance in the Uveitis and Ocular Inflammation section (95% confidence interval [CI]: 1.02–1.18) and high variability of performance in the Clinical Optics section (95% CI: 0.76–1.34). Discussion The OKAP institutional keyword reports are extremely valuable for residency program self-evaluation. Performance normalized for PGY level and test year can reveal insightful trends into the relative strengths and weaknesses of trainee knowledge and guide data-driven curriculum improvement.
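The normalization described in the Methods section (dividing each subspecialty's percent correct by the percent correct across all subsections for that PGY level) can be sketched as follows. The subspecialty names and percent-correct values below are illustrative only, not actual OKAP keyword-report data:

```python
def normalize_scores(percent_correct_by_section):
    """Divide each subspecialty's percent correct by the mean percent
    correct across all sections for that PGY level, so that 1.0 marks
    average performance, >1.0 relative strength, <1.0 relative weakness."""
    overall = sum(percent_correct_by_section.values()) / len(percent_correct_by_section)
    return {section: pct / overall
            for section, pct in percent_correct_by_section.items()}

# Hypothetical PGY-2 percent-correct values (illustrative):
pgy2_scores = {
    "Uveitis and Ocular Inflammation": 78.0,
    "Clinical Optics": 62.0,
    "Retina": 70.0,
}
normalized = normalize_scores(pgy2_scores)
```

Repeating this per PGY level and per test year, as the authors describe, yields dimensionless ratios that can be compared across sections despite differing question counts and cohort abilities.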

